[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
15500 1727096199.56706: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-And
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
15500 1727096199.57010: Added group all to inventory
15500 1727096199.57012: Added group ungrouped to inventory
15500 1727096199.57015: Group all now contains ungrouped
15500 1727096199.57017: Examining possible inventory source: /tmp/network-EuO/inventory.yml
15500 1727096199.66119: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
15500 1727096199.66161: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
15500 1727096199.66180: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
15500 1727096199.66218: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
15500 1727096199.66268: Loaded config def from plugin (inventory/script)
15500 1727096199.66270: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
15500 1727096199.66298: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
15500 1727096199.66352: Loaded config def from plugin (inventory/yaml)
15500 1727096199.66354: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
15500 1727096199.66416: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
15500 1727096199.66692: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
15500 1727096199.66695: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
15500 1727096199.66697: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
15500 1727096199.66701: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
15500 1727096199.66706: Loading data from /tmp/network-EuO/inventory.yml
15500 1727096199.66748: /tmp/network-EuO/inventory.yml was not parsable by auto
15500 1727096199.66793: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
15500 1727096199.66823: Loading data from /tmp/network-EuO/inventory.yml
15500 1727096199.66878: group all already in inventory
15500 1727096199.66883: set inventory_file for managed_node1
15500 1727096199.66886: set inventory_dir for managed_node1
15500 1727096199.66887: Added host managed_node1 to inventory
15500 1727096199.66889: Added host managed_node1 to group all
15500 1727096199.66889: set ansible_host for managed_node1
15500 1727096199.66890:
set ansible_ssh_extra_args for managed_node1 15500 1727096199.66892: set inventory_file for managed_node2 15500 1727096199.66894: set inventory_dir for managed_node2 15500 1727096199.66894: Added host managed_node2 to inventory 15500 1727096199.66895: Added host managed_node2 to group all 15500 1727096199.66895: set ansible_host for managed_node2 15500 1727096199.66896: set ansible_ssh_extra_args for managed_node2 15500 1727096199.66898: set inventory_file for managed_node3 15500 1727096199.66899: set inventory_dir for managed_node3 15500 1727096199.66899: Added host managed_node3 to inventory 15500 1727096199.66900: Added host managed_node3 to group all 15500 1727096199.66901: set ansible_host for managed_node3 15500 1727096199.66901: set ansible_ssh_extra_args for managed_node3 15500 1727096199.66903: Reconcile groups and hosts in inventory. 15500 1727096199.66905: Group ungrouped now contains managed_node1 15500 1727096199.66906: Group ungrouped now contains managed_node2 15500 1727096199.66907: Group ungrouped now contains managed_node3 15500 1727096199.66962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 15500 1727096199.67043: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 15500 1727096199.67076: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 15500 1727096199.67093: Loaded config def from plugin (vars/host_group_vars) 15500 1727096199.67095: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 15500 1727096199.67100: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 15500 1727096199.67105: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 15500 1727096199.67133: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 15500 1727096199.67373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096199.67437: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 15500 1727096199.67461: Loaded config def from plugin (connection/local) 15500 1727096199.67463: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 15500 1727096199.67844: Loaded config def from plugin (connection/paramiko_ssh) 15500 1727096199.67846: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 15500 1727096199.68417: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 15500 1727096199.68441: Loaded config def from plugin (connection/psrp) 15500 1727096199.68443: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 15500 1727096199.68892: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 15500 1727096199.68928: Loaded config def from plugin (connection/ssh) 15500 1727096199.68930: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 15500 1727096199.70365: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 15500 1727096199.70391: Loaded config def from plugin (connection/winrm) 15500 1727096199.70393: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 15500 1727096199.70416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 15500 1727096199.70482: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 15500 1727096199.70527: Loaded config def from plugin (shell/cmd) 15500 1727096199.70529: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 15500 1727096199.70558: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 15500 1727096199.70633: Loaded config def from plugin (shell/powershell) 15500 1727096199.70635: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 15500 1727096199.70697: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 15500 1727096199.70915: Loaded config def from plugin (shell/sh) 15500 1727096199.70917: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 15500 1727096199.70954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 15500 1727096199.71102: Loaded config def from plugin (become/runas) 15500 1727096199.71105: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 15500 1727096199.71348: Loaded config def from plugin (become/su) 15500 1727096199.71351: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 15500 1727096199.71558: Loaded config def from plugin (become/sudo) 15500 1727096199.71561: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 15500 1727096199.71601: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml 15500 1727096199.72131: in VariableManager get_vars() 15500 1727096199.72157: done with get_vars() 15500 1727096199.72354: trying /usr/local/lib/python3.12/site-packages/ansible/modules 15500 1727096199.77291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 15500 1727096199.77420: in VariableManager get_vars() 15500 
1727096199.77426: done with get_vars() 15500 1727096199.77429: variable 'playbook_dir' from source: magic vars 15500 1727096199.77430: variable 'ansible_playbook_python' from source: magic vars 15500 1727096199.77431: variable 'ansible_config_file' from source: magic vars 15500 1727096199.77431: variable 'groups' from source: magic vars 15500 1727096199.77433: variable 'omit' from source: magic vars 15500 1727096199.77434: variable 'ansible_version' from source: magic vars 15500 1727096199.77434: variable 'ansible_check_mode' from source: magic vars 15500 1727096199.77435: variable 'ansible_diff_mode' from source: magic vars 15500 1727096199.77436: variable 'ansible_forks' from source: magic vars 15500 1727096199.77437: variable 'ansible_inventory_sources' from source: magic vars 15500 1727096199.77437: variable 'ansible_skip_tags' from source: magic vars 15500 1727096199.77438: variable 'ansible_limit' from source: magic vars 15500 1727096199.77439: variable 'ansible_run_tags' from source: magic vars 15500 1727096199.77439: variable 'ansible_verbosity' from source: magic vars 15500 1727096199.77482: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml 15500 1727096199.77940: in VariableManager get_vars() 15500 1727096199.77961: done with get_vars() 15500 1727096199.78002: in VariableManager get_vars() 15500 1727096199.78015: done with get_vars() 15500 1727096199.78045: in VariableManager get_vars() 15500 1727096199.78061: done with get_vars() 15500 1727096199.78138: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 15500 1727096199.78363: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 15500 1727096199.78498: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 15500 1727096199.79230: in VariableManager get_vars() 15500 1727096199.79253: done with get_vars() 15500 1727096199.79891: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 15500 1727096199.80028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15500 1727096199.81818: in VariableManager get_vars() 15500 1727096199.81823: done with get_vars() 15500 1727096199.81826: variable 'playbook_dir' from source: magic vars 15500 1727096199.81827: variable 'ansible_playbook_python' from source: magic vars 15500 1727096199.81827: variable 'ansible_config_file' from source: magic vars 15500 1727096199.81828: variable 'groups' from source: magic vars 15500 1727096199.81829: variable 'omit' from source: magic vars 15500 1727096199.81829: variable 'ansible_version' from source: magic vars 15500 1727096199.81830: variable 'ansible_check_mode' from source: magic vars 15500 1727096199.81831: variable 'ansible_diff_mode' from source: magic vars 15500 1727096199.81833: variable 'ansible_forks' from source: magic vars 15500 1727096199.81833: variable 'ansible_inventory_sources' from source: magic vars 15500 1727096199.81834: variable 'ansible_skip_tags' from source: magic vars 15500 1727096199.81835: variable 'ansible_limit' from source: magic vars 15500 1727096199.81835: variable 'ansible_run_tags' from source: magic vars 15500 1727096199.81836: variable 'ansible_verbosity' from source: magic vars 15500 1727096199.81881: Loading data from 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml 15500 1727096199.81989: in VariableManager get_vars() 15500 1727096199.82004: done with get_vars() 15500 1727096199.82038: in VariableManager get_vars() 15500 1727096199.82042: done with get_vars() 15500 1727096199.82045: variable 'playbook_dir' from source: magic vars 15500 1727096199.82046: variable 'ansible_playbook_python' from source: magic vars 15500 1727096199.82046: variable 'ansible_config_file' from source: magic vars 15500 1727096199.82047: variable 'groups' from source: magic vars 15500 1727096199.82048: variable 'omit' from source: magic vars 15500 1727096199.82049: variable 'ansible_version' from source: magic vars 15500 1727096199.82050: variable 'ansible_check_mode' from source: magic vars 15500 1727096199.82050: variable 'ansible_diff_mode' from source: magic vars 15500 1727096199.82051: variable 'ansible_forks' from source: magic vars 15500 1727096199.82052: variable 'ansible_inventory_sources' from source: magic vars 15500 1727096199.82053: variable 'ansible_skip_tags' from source: magic vars 15500 1727096199.82053: variable 'ansible_limit' from source: magic vars 15500 1727096199.82054: variable 'ansible_run_tags' from source: magic vars 15500 1727096199.82055: variable 'ansible_verbosity' from source: magic vars 15500 1727096199.82095: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml 15500 1727096199.82163: in VariableManager get_vars() 15500 1727096199.82180: done with get_vars() 15500 1727096199.82230: in VariableManager get_vars() 15500 1727096199.82234: done with get_vars() 15500 1727096199.82236: variable 'playbook_dir' from source: magic vars 15500 1727096199.82237: variable 'ansible_playbook_python' from source: magic vars 15500 1727096199.82238: variable 'ansible_config_file' from source: magic vars 15500 1727096199.82238: variable 'groups' from source: magic vars 15500 1727096199.82239: variable 'omit' from source: magic vars 15500 1727096199.82240: variable 'ansible_version' from source: magic vars 15500 1727096199.82241: variable 'ansible_check_mode' from source: magic vars 15500 1727096199.82241: variable 'ansible_diff_mode' from source: magic vars 15500 1727096199.82242: variable 'ansible_forks' from source: magic vars 15500 1727096199.82248: variable 'ansible_inventory_sources' from source: magic vars 15500 1727096199.82249: variable 'ansible_skip_tags' from source: magic vars 15500 1727096199.82249: variable 'ansible_limit' from source: magic vars 15500 1727096199.82250: variable 'ansible_run_tags' from source: magic vars 15500 1727096199.82251: variable 'ansible_verbosity' from source: magic vars 15500 1727096199.82286: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml 15500 1727096199.82355: in VariableManager get_vars() 15500 1727096199.82361: done with get_vars() 15500 1727096199.82363: variable 'playbook_dir' from source: magic vars 15500 1727096199.82364: variable 'ansible_playbook_python' from source: magic vars 15500 1727096199.82365: variable 'ansible_config_file' from source: magic vars 15500 1727096199.82365: variable 'groups' from source: magic vars 15500 1727096199.82366: variable 'omit' from source: magic vars 15500 1727096199.82369: variable 'ansible_version' from source: magic vars 15500 1727096199.82370: variable 'ansible_check_mode' from source: magic vars 15500 
1727096199.82371: variable 'ansible_diff_mode' from source: magic vars 15500 1727096199.82371: variable 'ansible_forks' from source: magic vars 15500 1727096199.82372: variable 'ansible_inventory_sources' from source: magic vars 15500 1727096199.82373: variable 'ansible_skip_tags' from source: magic vars 15500 1727096199.82374: variable 'ansible_limit' from source: magic vars 15500 1727096199.82374: variable 'ansible_run_tags' from source: magic vars 15500 1727096199.82375: variable 'ansible_verbosity' from source: magic vars 15500 1727096199.82404: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml 15500 1727096199.82476: in VariableManager get_vars() 15500 1727096199.82488: done with get_vars() 15500 1727096199.82531: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 15500 1727096199.82643: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 15500 1727096199.82732: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 15500 1727096199.83126: in VariableManager get_vars() 15500 1727096199.83146: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15500 1727096199.84754: in VariableManager get_vars() 15500 1727096199.84778: done with get_vars() 15500 1727096199.84816: in VariableManager get_vars() 15500 1727096199.84820: done with get_vars() 15500 1727096199.84822: variable 'playbook_dir' from source: magic vars 15500 1727096199.84823: variable 'ansible_playbook_python' from source: magic vars 15500 1727096199.84824: variable 'ansible_config_file' from source: magic vars 15500 1727096199.84824: variable 'groups' from source: magic vars 15500 1727096199.84825: variable 'omit' from source: magic vars 15500 1727096199.84826: variable 'ansible_version' from source: magic vars 15500 1727096199.84827: variable 'ansible_check_mode' from source: magic vars 15500 1727096199.84827: variable 'ansible_diff_mode' from source: magic vars 15500 1727096199.84828: variable 'ansible_forks' from source: magic vars 15500 1727096199.84829: variable 'ansible_inventory_sources' from source: magic vars 15500 1727096199.84829: variable 'ansible_skip_tags' from source: magic vars 15500 1727096199.84830: variable 'ansible_limit' from source: magic vars 15500 1727096199.84831: variable 'ansible_run_tags' from source: magic vars 15500 1727096199.84831: variable 'ansible_verbosity' from source: magic vars 15500 1727096199.84870: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml 15500 1727096199.84944: in VariableManager get_vars() 15500 1727096199.84960: done with get_vars() 15500 1727096199.85004: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 15500 1727096199.85230: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 15500 1727096199.85310: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 15500 1727096199.87681: in VariableManager get_vars() 15500 1727096199.87705: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15500 1727096199.90104: in VariableManager get_vars() 15500 1727096199.90109: done with get_vars() 
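
The cluster of "Loading data from .../tests_bridge.yml", "run_tasks.yml", "down_profile.yml" and "remove_profile.yml" entries above is the parse phase: ansible-playbook resolves every import_playbook and role reference before the first play starts, which is also why the role's defaults/main.yml, meta/main.yml and tasks/main.yml are read repeatedly and why ansible.builtin.yum is already being redirected to ansible.builtin.dnf here. A hypothetical sketch of the wrapper layout this trace suggests follows; only the play name and the imported path are taken from the log, while the hosts pattern, variable name and task details are assumptions.

---
# Hypothetical reconstruction of the provider wrapper; not taken verbatim from the log.
- name: Run playbook 'playbooks/tests_bridge.yml' with nm as provider
  hosts: all                        # assumed host pattern
  tasks:
    - name: Force the NetworkManager provider for the shared tests
      ansible.builtin.set_fact:
        network_provider: nm        # assumed variable name

# Importing the shared tests is what makes the parse phase also load
# run_tasks.yml, down_profile.yml and remove_profile.yml seen above.
- name: Import the provider-agnostic bridge tests
  import_playbook: playbooks/tests_bridge.yml
# (end of illustrative sketch; the trace resumes below)
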
15500 1727096199.90111: variable 'playbook_dir' from source: magic vars 15500 1727096199.90112: variable 'ansible_playbook_python' from source: magic vars 15500 1727096199.90113: variable 'ansible_config_file' from source: magic vars 15500 1727096199.90113: variable 'groups' from source: magic vars 15500 1727096199.90114: variable 'omit' from source: magic vars 15500 1727096199.90115: variable 'ansible_version' from source: magic vars 15500 1727096199.90116: variable 'ansible_check_mode' from source: magic vars 15500 1727096199.90116: variable 'ansible_diff_mode' from source: magic vars 15500 1727096199.90117: variable 'ansible_forks' from source: magic vars 15500 1727096199.90118: variable 'ansible_inventory_sources' from source: magic vars 15500 1727096199.90119: variable 'ansible_skip_tags' from source: magic vars 15500 1727096199.90119: variable 'ansible_limit' from source: magic vars 15500 1727096199.90120: variable 'ansible_run_tags' from source: magic vars 15500 1727096199.90121: variable 'ansible_verbosity' from source: magic vars 15500 1727096199.90155: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml 15500 1727096199.90436: in VariableManager get_vars() 15500 1727096199.90471: done with get_vars() 15500 1727096199.90508: in VariableManager get_vars() 15500 1727096199.90511: done with get_vars() 15500 1727096199.90513: variable 'playbook_dir' from source: magic vars 15500 1727096199.90514: variable 'ansible_playbook_python' from source: magic vars 15500 1727096199.90515: variable 'ansible_config_file' from source: magic vars 15500 1727096199.90516: variable 'groups' from source: magic vars 15500 1727096199.90516: variable 'omit' from source: magic vars 15500 1727096199.90517: variable 'ansible_version' from source: magic vars 15500 1727096199.90518: variable 'ansible_check_mode' from source: magic vars 15500 1727096199.90518: variable 'ansible_diff_mode' from source: magic vars 15500 1727096199.90519: variable 'ansible_forks' from source: magic vars 15500 1727096199.90520: variable 'ansible_inventory_sources' from source: magic vars 15500 1727096199.90521: variable 'ansible_skip_tags' from source: magic vars 15500 1727096199.90521: variable 'ansible_limit' from source: magic vars 15500 1727096199.90522: variable 'ansible_run_tags' from source: magic vars 15500 1727096199.90523: variable 'ansible_verbosity' from source: magic vars 15500 1727096199.90555: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml 15500 1727096199.90831: in VariableManager get_vars() 15500 1727096199.90845: done with get_vars() 15500 1727096199.90917: in VariableManager get_vars() 15500 1727096199.90930: done with get_vars() 15500 1727096199.91227: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 15500 1727096199.91243: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 15500 1727096199.91701: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 15500 1727096199.92085: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 15500 1727096199.92093: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from 
/tmp/collections-And/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 15500 1727096199.92125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 15500 1727096199.92150: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 15500 1727096199.92531: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 15500 1727096199.92799: Loaded config def from plugin (callback/default) 15500 1727096199.92803: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 15500 1727096199.95194: Loaded config def from plugin (callback/junit) 15500 1727096199.95198: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 15500 1727096199.95250: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 15500 1727096199.95322: Loaded config def from plugin (callback/minimal) 15500 1727096199.95325: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 15500 1727096199.95570: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 15500 1727096199.95633: Loaded config def from plugin (callback/tree) 15500 1727096199.95635: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 15500 1727096199.95758: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 15500 1727096199.95761: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-And/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, 
class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: tests_bridge_nm.yml ************************************************** 11 plays in /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml 15500 1727096199.95995: in VariableManager get_vars() 15500 1727096199.96012: done with get_vars() 15500 1727096199.96018: in VariableManager get_vars() 15500 1727096199.96027: done with get_vars() 15500 1727096199.96032: variable 'omit' from source: magic vars 15500 1727096199.96079: in VariableManager get_vars() 15500 1727096199.96095: done with get_vars() 15500 1727096199.96118: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_bridge.yml' with nm as provider] *********** 15500 1727096199.97319: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 15500 1727096199.97602: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 15500 1727096199.97784: getting the remaining hosts for this loop 15500 1727096199.97786: done getting the remaining hosts for this loop 15500 1727096199.97790: getting the next task for host managed_node1 15500 1727096199.97794: done getting next task for host managed_node1 15500 1727096199.97797: ^ task is: TASK: Gathering Facts 15500 1727096199.97799: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096199.97801: getting variables 15500 1727096199.97802: in VariableManager get_vars() 15500 1727096199.97815: Calling all_inventory to load vars for managed_node1 15500 1727096199.97817: Calling groups_inventory to load vars for managed_node1 15500 1727096199.97820: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096199.97833: Calling all_plugins_play to load vars for managed_node1 15500 1727096199.97844: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096199.97847: Calling groups_plugins_play to load vars for managed_node1 15500 1727096199.97889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096199.97948: done with get_vars() 15500 1727096199.97959: done getting variables 15500 1727096199.98332: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:6 Monday 23 September 2024 08:56:39 -0400 (0:00:00.026) 0:00:00.026 ****** 15500 1727096199.98361: entering _queue_task() for managed_node1/gather_facts 15500 1727096199.98363: Creating lock for gather_facts 15500 1727096199.99135: worker is 1 (out of 1 available) 15500 1727096199.99146: exiting _queue_task() for managed_node1/gather_facts 15500 1727096199.99164: done queuing things up, now waiting for results queue to drain 15500 1727096199.99168: waiting for pending results... 
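
The TaskExecutor run that follows resolves 'ansible_host' and 'ansible_ssh_extra_args' from host vars for 'managed_node1'; those values come from /tmp/network-EuO/inventory.yml, which the yaml inventory plugin parsed at the top of the run. A hypothetical sketch of such an inventory is shown below: the host names and variable keys match the log, while the values are placeholders, except managed_node1's address, which reuses the 10.31.11.125 seen in the ssh debug output further down.

---
# Illustrative inventory only; values other than managed_node1's address are made up.
all:
  hosts:
    managed_node1:
      ansible_host: 10.31.11.125
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"   # placeholder options
    managed_node2:
      ansible_host: 198.51.100.22                             # placeholder address
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"
    managed_node3:
      ansible_host: 198.51.100.23                             # placeholder address
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"
# (end of illustrative sketch; the trace resumes below)
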
15500 1727096199.99718: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15500 1727096199.99728: in run() - task 0afff68d-5257-877d-2da0-00000000007e 15500 1727096199.99733: variable 'ansible_search_path' from source: unknown 15500 1727096199.99852: calling self._execute() 15500 1727096199.99984: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096200.00039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096200.00054: variable 'omit' from source: magic vars 15500 1727096200.00276: variable 'omit' from source: magic vars 15500 1727096200.00465: variable 'omit' from source: magic vars 15500 1727096200.00470: variable 'omit' from source: magic vars 15500 1727096200.00517: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096200.00612: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096200.00640: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096200.00702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096200.00900: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096200.00903: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096200.00906: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096200.00908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096200.01063: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096200.01078: Set connection var ansible_pipelining to False 15500 1727096200.01126: Set connection var ansible_timeout to 10 15500 1727096200.01134: Set connection var ansible_shell_type to sh 15500 1727096200.01144: Set connection var ansible_shell_executable to /bin/sh 15500 1727096200.01153: Set connection var ansible_connection to ssh 15500 1727096200.01276: variable 'ansible_shell_executable' from source: unknown 15500 1727096200.01340: variable 'ansible_connection' from source: unknown 15500 1727096200.01349: variable 'ansible_module_compression' from source: unknown 15500 1727096200.01358: variable 'ansible_shell_type' from source: unknown 15500 1727096200.01366: variable 'ansible_shell_executable' from source: unknown 15500 1727096200.01377: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096200.01387: variable 'ansible_pipelining' from source: unknown 15500 1727096200.01396: variable 'ansible_timeout' from source: unknown 15500 1727096200.01405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096200.01929: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096200.02037: variable 'omit' from source: magic vars 15500 1727096200.02040: starting attempt loop 15500 1727096200.02042: running the handler 15500 1727096200.02045: variable 'ansible_facts' from source: unknown 15500 1727096200.02046: _low_level_execute_command(): starting 15500 1727096200.02048: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096200.03198: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096200.03219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096200.03465: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096200.03481: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096200.03665: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096200.05390: stdout chunk (state=3): >>>/root <<< 15500 1727096200.05519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096200.05575: stderr chunk (state=3): >>><<< 15500 1727096200.05579: stdout chunk (state=3): >>><<< 15500 1727096200.05811: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096200.05815: _low_level_execute_command(): starting 15500 1727096200.05818: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096200.0560358-15525-228131402556833 `" && echo ansible-tmp-1727096200.0560358-15525-228131402556833="` echo /root/.ansible/tmp/ansible-tmp-1727096200.0560358-15525-228131402556833 `" ) && sleep 0' 15500 1727096200.06789: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 
1727096200.07000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096200.07014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096200.07035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096200.07083: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096200.07096: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096200.07192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096200.07285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096200.09291: stdout chunk (state=3): >>>ansible-tmp-1727096200.0560358-15525-228131402556833=/root/.ansible/tmp/ansible-tmp-1727096200.0560358-15525-228131402556833 <<< 15500 1727096200.09504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096200.09549: stderr chunk (state=3): >>><<< 15500 1727096200.09552: stdout chunk (state=3): >>><<< 15500 1727096200.09574: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096200.0560358-15525-228131402556833=/root/.ansible/tmp/ansible-tmp-1727096200.0560358-15525-228131402556833 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096200.09775: variable 'ansible_module_compression' from source: unknown 15500 1727096200.09778: ANSIBALLZ: Using generic lock for ansible.legacy.setup 15500 1727096200.09781: ANSIBALLZ: Acquiring lock 15500 1727096200.09783: ANSIBALLZ: Lock acquired: 140712178847904 15500 1727096200.09785: ANSIBALLZ: Creating 
module 15500 1727096200.60016: ANSIBALLZ: Writing module into payload 15500 1727096200.60475: ANSIBALLZ: Writing module 15500 1727096200.60479: ANSIBALLZ: Renaming module 15500 1727096200.60481: ANSIBALLZ: Done creating module 15500 1727096200.60490: variable 'ansible_facts' from source: unknown 15500 1727096200.60503: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096200.60515: _low_level_execute_command(): starting 15500 1727096200.60525: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 15500 1727096200.61788: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096200.62017: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096200.62036: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096200.62063: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096200.62171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096200.63889: stdout chunk (state=3): >>>PLATFORM <<< 15500 1727096200.63960: stdout chunk (state=3): >>>Linux <<< 15500 1727096200.63989: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 15500 1727096200.64181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096200.64266: stderr chunk (state=3): >>><<< 15500 1727096200.64279: stdout chunk (state=3): >>><<< 15500 1727096200.64374: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096200.64380 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 15500 1727096200.64431: _low_level_execute_command(): starting 15500 1727096200.64628: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 15500 1727096200.64700: Sending initial data 15500 1727096200.64710: Sent initial data (1181 bytes) 15500 1727096200.66265: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096200.66457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096200.70155: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 15500 1727096200.70775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096200.70780: stdout chunk (state=3): >>><<< 15500 1727096200.70782: stderr chunk (state=3): >>><<< 15500 1727096200.70785: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise 
Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096200.70787: variable 'ansible_facts' from source: unknown 15500 1727096200.70789: variable 'ansible_facts' from source: unknown 15500 1727096200.70791: variable 'ansible_module_compression' from source: unknown 15500 1727096200.70816: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15500 1727096200.70859: variable 'ansible_facts' from source: unknown 15500 1727096200.71248: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096200.0560358-15525-228131402556833/AnsiballZ_setup.py 15500 1727096200.71718: Sending initial data 15500 1727096200.71722: Sent initial data (154 bytes) 15500 1727096200.72956: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096200.73090: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096200.73250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096200.75073: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096200.75130: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15500 1727096200.75270: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp1noa3qm_ /root/.ansible/tmp/ansible-tmp-1727096200.0560358-15525-228131402556833/AnsiballZ_setup.py <<< 15500 1727096200.75281: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096200.0560358-15525-228131402556833/AnsiballZ_setup.py" <<< 15500 1727096200.75407: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp1noa3qm_" to remote "/root/.ansible/tmp/ansible-tmp-1727096200.0560358-15525-228131402556833/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096200.0560358-15525-228131402556833/AnsiballZ_setup.py" <<< 15500 1727096200.78877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096200.78882: stdout chunk (state=3): >>><<< 15500 1727096200.78887: stderr chunk (state=3): >>><<< 15500 1727096200.78980: done transferring module to remote 15500 1727096200.78995: _low_level_execute_command(): starting 15500 1727096200.78998: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096200.0560358-15525-228131402556833/ /root/.ansible/tmp/ansible-tmp-1727096200.0560358-15525-228131402556833/AnsiballZ_setup.py && sleep 0' 15500 1727096200.80263: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096200.80269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 15500 1727096200.80345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096200.80531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096200.80534: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096200.80552: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096200.80819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096200.82763: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096200.82769: stdout chunk (state=3): >>><<< 15500 1727096200.82772: stderr chunk (state=3): >>><<< 15500 1727096200.82789: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096200.82792: _low_level_execute_command(): starting 15500 1727096200.82797: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096200.0560358-15525-228131402556833/AnsiballZ_setup.py && sleep 0' 15500 1727096200.84096: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096200.84139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096200.84190: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096200.84287: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096200.84400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096200.86638: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 15500 1727096200.86733: stdout chunk (state=3): >>>import _imp # builtin <<< 15500 1727096200.86784: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # <<< 15500 1727096200.86801: stdout chunk (state=3): >>>import 'posix' # <<< 15500 1727096200.86873: stdout chunk (state=3): >>>import 
'_frozen_importlib_external' # # installing zipimport hook <<< 15500 1727096200.86888: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 15500 1727096200.87057: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096200.87064: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095af684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095af37b30> <<< 15500 1727096200.87088: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 15500 1727096200.87092: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095af6aa50> <<< 15500 1727096200.87120: stdout chunk (state=3): >>>import '_signal' # <<< 15500 1727096200.87211: stdout chunk (state=3): >>>import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 15500 1727096200.87290: stdout chunk (state=3): >>>import '_collections_abc' # <<< 15500 1727096200.87402: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 15500 1727096200.87407: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' <<< 15500 1727096200.87410: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 15500 1727096200.87483: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ad3d130> <<< 15500 1727096200.87543: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096200.87554: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ad3dfa0> <<< 15500 1727096200.87574: stdout chunk (state=3): >>>import 'site' # <<< 15500 1727096200.87682: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 15500 1727096200.87991: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 15500 1727096200.88003: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 15500 1727096200.88019: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 15500 1727096200.88032: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096200.88084: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 15500 1727096200.88101: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 15500 1727096200.88187: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 15500 1727096200.88220: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ad7be00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ad7bec0> <<< 15500 1727096200.88348: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 15500 1727096200.88352: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 15500 1727096200.88355: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 15500 1727096200.88362: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096200.88365: stdout chunk (state=3): >>>import 'itertools' # <<< 15500 1727096200.88387: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 15500 1727096200.88540: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095adb37d0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095adb3e60> <<< 15500 1727096200.88548: stdout chunk (state=3): >>>import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ad93ad0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ad911f0> <<< 15500 1727096200.88631: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ad78fb0> <<< 15500 1727096200.88663: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 15500 1727096200.88677: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 15500 1727096200.88694: stdout chunk (state=3): >>>import '_sre' # <<< 15500 1727096200.88709: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 15500 1727096200.88739: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 15500 1727096200.88765: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 15500 1727096200.88800: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095add3770> <<< 15500 1727096200.88873: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095add2390> <<< 15500 1727096200.88876: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 15500 1727096200.88878: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ad92090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095add0bc0> <<< 15500 1727096200.89116: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ae08800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ad78230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095ae08cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ae08b60> <<< 15500 1727096200.89120: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096200.89122: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096200.89128: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095ae08ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ad76d50> <<< 15500 1727096200.89130: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096200.89192: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 15500 1727096200.89221: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ae09580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ae09250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ae0a480> import 'importlib.util' # <<< 15500 1727096200.89239: stdout chunk (state=3): >>>import 'runpy' # <<< 15500 1727096200.89282: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 15500 1727096200.89295: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 15500 1727096200.89321: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ae206b0> <<< 15500 1727096200.89345: stdout chunk (state=3): >>>import 'errno' # <<< 15500 1727096200.89363: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096200.89421: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095ae21d90> <<< 15500 1727096200.89454: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 15500 1727096200.89460: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ae22c30> <<< 15500 1727096200.89533: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095ae23290> <<< 15500 1727096200.89778: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ae22180> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 15500 1727096200.89810: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095ae23d10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ae23440> import 'shutil' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f095ae0a4e0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095ab17bc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 15500 1727096200.89827: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096200.89875: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095ab406e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ab40440> <<< 15500 1727096200.89880: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095ab40620> <<< 15500 1727096200.89897: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 15500 1727096200.89912: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 15500 1727096200.90006: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096200.90114: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095ab40fe0> <<< 15500 1727096200.90253: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095ab41970> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ab40890> <<< 15500 1727096200.90270: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ab15d60> <<< 15500 1727096200.90290: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 15500 1727096200.90320: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 15500 1727096200.90336: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 15500 1727096200.90359: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 15500 1727096200.90370: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ab42cf0> <<< 15500 1727096200.90389: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ab40e60> <<< 15500 1727096200.90441: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ae0abd0> <<< 15500 1727096200.90465: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 15500 1727096200.90497: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096200.90566: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 15500 1727096200.90586: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ab6f020> <<< 15500 1727096200.90646: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 15500 1727096200.90736: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 15500 1727096200.90740: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 15500 1727096200.90747: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ab933e0> <<< 15500 1727096200.90761: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 15500 1727096200.90816: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 15500 1727096200.90863: stdout chunk (state=3): >>>import 'ntpath' # <<< 15500 1727096200.90913: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095abf01a0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 15500 1727096200.91099: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 15500 1727096200.91102: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15500 1727096200.91105: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095abf28d0> <<< 15500 1727096200.91178: stdout chunk (state=3): >>>import 'urllib.parse' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f095abf02c0> <<< 15500 1727096200.91224: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095abbd190> <<< 15500 1727096200.91337: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a5291f0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ab921e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ab43bf0> <<< 15500 1727096200.91448: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 15500 1727096200.91466: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f095ab928a0> <<< 15500 1727096200.91728: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_t1a1kszw/ansible_ansible.legacy.setup_payload.zip' <<< 15500 1727096200.91745: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096200.91876: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096200.91889: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 15500 1727096200.91993: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 15500 1727096200.92083: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 15500 1727096200.92088: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 15500 1727096200.92091: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a58af00> import '_typing' # <<< 15500 1727096200.92259: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a569df0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a568f50> # zipimport: zlib available <<< 15500 1727096200.92334: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 15500 1727096200.92572: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096200.93747: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096200.94888: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 15500 1727096200.95051: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a588dd0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' 
# /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a5c2780> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a5c2510> <<< 15500 1727096200.95061: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a5c1e20> <<< 15500 1727096200.95085: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 15500 1727096200.95123: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a5c2840> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a58bb90> <<< 15500 1727096200.95160: stdout chunk (state=3): >>>import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096200.95169: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a5c34a0> <<< 15500 1727096200.95251: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a5c3650> <<< 15500 1727096200.95273: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 15500 1727096200.95419: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a5c3b90> import 'pwd' # <<< 15500 1727096200.95423: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 15500 1727096200.95426: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 15500 1727096200.95466: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a425970> <<< 15500 1727096200.95484: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a427590> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches 
/usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 15500 1727096200.95504: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a427f20> <<< 15500 1727096200.95529: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 15500 1727096200.95588: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a42d100> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 15500 1727096200.95649: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 15500 1727096200.95652: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 15500 1727096200.95793: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a42fbc0> <<< 15500 1727096200.95858: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a56aff0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a42de80> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 15500 1727096200.95861: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 15500 1727096200.95864: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 15500 1727096200.95973: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 15500 1727096200.95994: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a433b90> <<< 15500 1727096200.96072: stdout chunk (state=3): >>>import '_tokenize' # <<< 15500 1727096200.96081: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a432660> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a4323c0> <<< 15500 1727096200.96117: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 15500 1727096200.96183: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 15500 1727096200.96194: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f095a432930> <<< 15500 1727096200.96213: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a42e390> <<< 15500 1727096200.96299: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a477e90> <<< 15500 1727096200.96304: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a477890> <<< 15500 1727096200.96311: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 15500 1727096200.96372: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a479a60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a479820> <<< 15500 1727096200.96518: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096200.96521: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a47bfe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a47a150> <<< 15500 1727096200.96524: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 15500 1727096200.96534: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096200.96624: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 15500 1727096200.96633: stdout chunk (state=3): >>>import '_string' # <<< 15500 1727096200.96661: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a47f680> <<< 15500 1727096200.96741: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a47c140> <<< 15500 1727096200.96802: stdout chunk 
(state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096200.96892: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a480a40> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a4805c0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a480a10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a4781d0> <<< 15500 1727096200.96916: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 15500 1727096200.96951: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 15500 1727096200.96954: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 15500 1727096200.97007: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a30c0b0> <<< 15500 1727096200.97373: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096200.97377: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a30d400> <<< 15500 1727096200.97379: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a482810> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a483bc0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a482480> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 15500 1727096200.97382: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096200.97426: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 15500 1727096200.97467: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 15500 1727096200.97548: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 15500 1727096200.97663: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096200.97723: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096200.98284: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096200.98871: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 15500 1727096200.98875: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096200.98954: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a3115e0> <<< 15500 1727096200.99010: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 15500 1727096200.99138: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a312330> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a482660> <<< 15500 1727096200.99141: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 15500 1727096200.99187: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 15500 1727096200.99290: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096200.99446: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 15500 1727096200.99463: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a3123f0> # zipimport: zlib available <<< 15500 1727096200.99924: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.00372: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.00429: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.00582: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 15500 1727096201.00602: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 15500 1727096201.00665: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.00771: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 15500 1727096201.00774: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 
1727096201.00872: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 15500 1727096201.00886: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 15500 1727096201.01091: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.01322: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 15500 1727096201.01424: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 15500 1727096201.01427: stdout chunk (state=3): >>>import '_ast' # <<< 15500 1727096201.01470: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a313560> # zipimport: zlib available <<< 15500 1727096201.01569: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.01658: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 15500 1727096201.01788: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 15500 1727096201.01794: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.01862: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.01880: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.01978: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 15500 1727096201.01989: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096201.02273: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096201.02276: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a31df10> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a319850> <<< 15500 1727096201.02279: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 15500 1727096201.02281: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.02284: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.02286: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.02315: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.02350: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 15500 1727096201.02371: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py <<< 15500 1727096201.02418: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 15500 1727096201.02524: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 15500 1727096201.02542: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 15500 1727096201.02564: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a406870> <<< 15500 1727096201.02611: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a4fa540> <<< 15500 1727096201.02744: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a31dc70> <<< 15500 1727096201.02765: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a4814f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 15500 1727096201.02814: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 15500 1727096201.02949: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15500 1727096201.02961: stdout chunk (state=3): >>>import 'ansible.modules' # # zipimport: zlib available <<< 15500 1727096201.02964: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.03000: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15500 1727096201.03064: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.03083: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.03101: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.03135: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.03217: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 15500 1727096201.03252: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.03374: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.03385: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.03388: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 15500 1727096201.03607: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.03741: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.03779: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.03929: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096201.03935: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/context.py <<< 15500 1727096201.03938: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 15500 1727096201.03969: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a3b20c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 15500 1727096201.03988: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 15500 1727096201.04033: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 15500 1727096201.04055: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 15500 1727096201.04364: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959f3ff50> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0959f442f0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a39b020> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a3b2c30> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a3b07a0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a3b0440> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 15500 1727096201.04369: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 15500 1727096201.04372: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 15500 1727096201.04376: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 15500 1727096201.04379: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 15500 1727096201.04381: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 15500 1727096201.04473: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096201.04476: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0959f47380> import 'heapq' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f0959f46c30> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0959f46e10> <<< 15500 1727096201.04478: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959f46060> <<< 15500 1727096201.04481: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 15500 1727096201.04629: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 15500 1727096201.04648: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959f47500> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 15500 1727096201.04676: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0959faa000> <<< 15500 1727096201.04737: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959f47f50> <<< 15500 1727096201.04762: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a3b04a0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 15500 1727096201.04854: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available <<< 15500 1727096201.04911: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 15500 1727096201.04922: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.04978: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.05025: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 15500 1727096201.05063: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 15500 1727096201.05071: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.05130: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.05133: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 15500 1727096201.05136: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.05187: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.05242: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 15500 1727096201.05345: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.05348: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 15500 1727096201.05351: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.05399: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.05450: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.05508: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.05572: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 15500 1727096201.05671: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.06046: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.06475: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 15500 1727096201.06486: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.06590: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15500 1727096201.06613: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.06651: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 15500 1727096201.06665: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 15500 1727096201.06755: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.06760: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 15500 1727096201.06814: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15500 1727096201.06837: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 15500 1727096201.06855: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.06881: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.06920: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 15500 1727096201.06929: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.07081: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 15500 1727096201.07086: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.07152: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 15500 1727096201.07163: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 15500 1727096201.07181: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959fab4d0> <<< 15500 1727096201.07258: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 15500 1727096201.07406: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959faaba0> <<< 15500 1727096201.07413: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 15500 1727096201.07427: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.07579: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 15500 1727096201.07597: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.07675: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 15500 1727096201.07693: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.07849: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.07864: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 15500 1727096201.07874: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.07924: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 15500 1727096201.08010: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 15500 1727096201.08126: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0959fe2270> <<< 15500 1727096201.08290: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959fd2ea0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 15500 1727096201.08358: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.08403: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 15500 1727096201.08420: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.08562: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.08582: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.08690: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.08959: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available <<< 15500 1727096201.08963: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 15500 1727096201.08965: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.08970: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.09014: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 15500 1727096201.09293: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0959ff5a60> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959fe3bc0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 15500 1727096201.09358: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.09513: stdout 
chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 15500 1727096201.09523: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.09623: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.09716: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.09758: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.09794: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 15500 1727096201.09872: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available <<< 15500 1727096201.09875: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.10051: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.10128: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 15500 1727096201.10149: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 15500 1727096201.10271: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.10389: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 15500 1727096201.10472: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.10475: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.11025: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.11531: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 15500 1727096201.11548: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 15500 1727096201.11651: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.11984: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 15500 1727096201.12272: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # <<< 15500 1727096201.12307: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 15500 1727096201.12322: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.12358: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.12402: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 15500 1727096201.12416: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.12680: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15500 1727096201.12854: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.13003: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 15500 1727096201.13019: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.13047: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.13088: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 15500 1727096201.13285: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 15500 1727096201.13289: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.13395: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 15500 1727096201.13408: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.13470: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 15500 1727096201.13483: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.13536: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.13592: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 15500 1727096201.13609: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.13858: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.14120: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 15500 1727096201.14276: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.14297: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available <<< 15500 1727096201.14315: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 15500 1727096201.14332: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.14354: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.14394: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 15500 1727096201.14431: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.14439: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.14475: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 15500 1727096201.14582: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.14641: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 15500 1727096201.14708: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 15500 1727096201.14726: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15500 1727096201.14769: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 15500 1727096201.14814: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.14862: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 15500 1727096201.14920: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.14976: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.15084: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 15500 1727096201.15115: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.15259: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 15500 1727096201.15366: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 15500 1727096201.15560: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 15500 1727096201.15581: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.15697: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 15500 1727096201.15707: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.15760: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 15500 1727096201.15799: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.15873: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.15939: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 15500 1727096201.16128: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.16149: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 15500 1727096201.16196: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096201.16572: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 15500 1727096201.16576: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0959d8f5f0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959d8c260> <<< 15500 1727096201.16578: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959d8c320> <<< 15500 1727096201.31418: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959dd5130> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959dd5eb0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959e20500> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959e230e0> <<< 15500 1727096201.31709: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 15500 1727096201.51817: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, 
"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_fi<<< 15500 1727096201.51842: stdout chunk (state=3): >>>le_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "56", "second": "41", "epoch": "1727096201", "epoch_int": "1727096201", "date": "2024-09-23", "time": "08:56:41", "iso8601_micro": "2024-09-23T12:56:41.175956Z", "iso8601": "2024-09-23T12:56:41Z", "iso8601_basic": "20240923T085641175956", "iso8601_basic_short": "20240923T085641", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.36572265625, "5m": 0.296875, "15m": 0.13818359375}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": 
"on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "<<< 15500 1727096201.51875: stdout chunk (state=3): >>>tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off 
[fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2956, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 575, "free": 2956}, "nocache": {"free": 3292, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_ve<<< 15500 1727096201.51896: stdout chunk (state=3): >>>rsion": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 354, "ansible_lvm": 
{"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797654528, "block_size": 4096, "block_total": 65519099, "block_available": 63915443, "block_used": 1603656, "inode_total": 131070960, "inode_available": 131029100, "inode_used": 41860, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "ansible_local": {}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15500 1727096201.52855: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref<<< 15500 1727096201.52913: stdout chunk (state=3): >>> # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser <<< 15500 1727096201.53036: stdout chunk (state=3): >>># cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # 
cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctyp<<< 15500 1727096201.53233: stdout chunk (state=3): >>>es # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing 
ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse <<< 15500 1727096201.53237: stdout chunk (state=3): >>># cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] 
removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd<<< 15500 1727096201.53273: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform<<< 15500 1727096201.53292: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly<<< 15500 1727096201.53345: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base<<< 15500 1727096201.53350: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # 
destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual <<< 15500 1727096201.53372: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna<<< 15500 1727096201.53773: stdout chunk (state=3): >>> # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 15500 1727096201.54006: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 15500 1727096201.54075: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2<<< 15500 1727096201.54137: stdout chunk (state=3): >>> # destroy _compression # destroy _lzma # destroy _blake2<<< 15500 1727096201.54169: stdout chunk (state=3): >>> # destroy binascii <<< 15500 1727096201.54233: stdout chunk (state=3): >>># destroy zlib # destroy bz2 <<< 15500 1727096201.54310: stdout chunk (state=3): >>># destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress<<< 15500 1727096201.54360: stdout chunk (state=3): >>> # destroy ntpath <<< 15500 1727096201.54386: stdout chunk (state=3): >>># destroy importlib <<< 15500 1727096201.54415: stdout chunk (state=3): >>># destroy zipimport <<< 15500 1727096201.54434: stdout chunk (state=3): >>># destroy __main__ # destroy systemd.journal<<< 15500 1727096201.54473: stdout chunk (state=3): >>> # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json<<< 15500 1727096201.54491: stdout chunk (state=3): >>> <<< 15500 1727096201.54503: stdout chunk (state=3): >>># destroy grp # destroy encodings # destroy _locale<<< 15500 1727096201.54550: stdout chunk (state=3): >>> # destroy locale<<< 15500 1727096201.54560: stdout chunk (state=3): >>> # destroy select # destroy _signal<<< 15500 1727096201.54579: stdout chunk (state=3): >>> # destroy _posixsubprocess # destroy syslog<<< 15500 1727096201.54704: stdout chunk (state=3): >>> # destroy uuid # destroy selinux # destroy shutil <<< 15500 1727096201.54719: stdout chunk (state=3): >>># destroy distro <<< 15500 1727096201.54776: stdout chunk (state=3): >>># destroy distro.distro # destroy argparse # destroy logging <<< 15500 1727096201.54816: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing<<< 15500 1727096201.54842: stdout chunk (state=3): >>> # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle<<< 15500 1727096201.54866: stdout chunk (state=3): >>> # destroy _pickle <<< 15500 1727096201.54946: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue # destroy 
multiprocessing.reduction<<< 15500 1727096201.54996: stdout chunk (state=3): >>> # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 15500 1727096201.55216: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct <<< 15500 1727096201.55321: stdout chunk (state=3): >>># cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat <<< 15500 1727096201.55330: stdout chunk (state=3): >>># cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 15500 1727096201.55437: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # 
destroy systemd._journal # destroy _datetime <<< 15500 1727096201.55505: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 15500 1727096201.55537: stdout chunk (state=3): >>># destroy _collections <<< 15500 1727096201.55570: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 15500 1727096201.55664: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 15500 1727096201.55884: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 15500 1727096201.56282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096201.56296: stderr chunk (state=3): >>>Shared connection to 10.31.11.125 closed. 
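At this point the remote interpreter has finished tearing down and the SSH master has reported exit status 0, so the fact-gathering module completed successfully and returned the ansible_facts dictionary streamed above. As an illustrative sketch only (not part of this captured run; the `all` host pattern and the task name are placeholders), facts such as ansible_distribution, ansible_distribution_version, and ansible_default_ipv4 reported above could be consumed by a later play roughly like this:

    # Illustrative sketch, not taken from this run: consumes facts the
    # setup module reported above (distribution, default_ipv4).
    - hosts: all                # placeholder host pattern
      gather_facts: true
      tasks:
        - name: Show a few gathered facts
          ansible.builtin.debug:
            msg: >-
              {{ ansible_facts['distribution'] }}
              {{ ansible_facts['distribution_version'] }},
              default IPv4 {{ ansible_facts['default_ipv4']['address'] | default('n/a') }}
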
<<< 15500 1727096201.56396: stderr chunk (state=3): >>><<< 15500 1727096201.56412: stdout chunk (state=3): >>><<< 15500 1727096201.56690: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095af684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095af37b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095af6aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ad3d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ad3dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ad7be00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ad7bec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095adb37d0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095adb3e60> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ad93ad0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ad911f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ad78fb0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095add3770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095add2390> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ad92090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095add0bc0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ae08800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ad78230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095ae08cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ae08b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095ae08ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ad76d50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ae09580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ae09250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ae0a480> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ae206b0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095ae21d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f095ae22c30> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095ae23290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ae22180> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095ae23d10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ae23440> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ae0a4e0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095ab17bc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095ab406e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ab40440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095ab40620> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095ab40fe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095ab41970> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f095ab40890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ab15d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ab42cf0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ab40e60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ae0abd0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ab6f020> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ab933e0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095abf01a0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095abf28d0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095abf02c0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095abbd190> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a5291f0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ab921e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095ab43bf0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f095ab928a0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_t1a1kszw/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a58af00> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a569df0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a568f50> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a588dd0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a5c2780> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a5c2510> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a5c1e20> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a5c2840> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a58bb90> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a5c34a0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 
'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a5c3650> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a5c3b90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a425970> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a427590> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a427f20> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a42d100> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a42fbc0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a56aff0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a42de80> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a433b90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a432660> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f095a4323c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a432930> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a42e390> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a477e90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a477890> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a479a60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a479820> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a47bfe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a47a150> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a47f680> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a47c140> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a480a40> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a4805c0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a480a10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a4781d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a30c0b0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a30d400> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a482810> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a483bc0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a482480> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a3115e0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a312330> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a482660> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a3123f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a313560> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f095a31df10> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a319850> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a406870> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a4fa540> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a31dc70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a4814f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a3b20c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959f3ff50> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f0959f442f0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a39b020> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a3b2c30> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a3b07a0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a3b0440> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0959f47380> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959f46c30> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0959f46e10> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959f46060> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959f47500> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0959faa000> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959f47f50> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f095a3b04a0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959fab4d0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959faaba0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0959fe2270> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959fd2ea0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0959ff5a60> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959fe3bc0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0959d8f5f0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959d8c260> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959d8c320> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959dd5130> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959dd5eb0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959e20500> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0959e230e0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "56", "second": "41", "epoch": "1727096201", "epoch_int": "1727096201", "date": "2024-09-23", "time": "08:56:41", "iso8601_micro": "2024-09-23T12:56:41.175956Z", "iso8601": "2024-09-23T12:56:41Z", "iso8601_basic": "20240923T085641175956", "iso8601_basic_short": "20240923T085641", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.36572265625, "5m": 0.296875, "15m": 0.13818359375}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", 
"tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off 
[fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2956, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 575, "free": 2956}, "nocache": {"free": 3292, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 354, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 
0, "passno": 0, "size_total": 268366229504, "size_available": 261797654528, "block_size": 4096, "block_total": 65519099, "block_available": 63915443, "block_used": 1603656, "inode_total": 131070960, "inode_available": 131029100, "inode_used": 41860, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "ansible_local": {}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # 
cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing 
ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing 
ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing 
ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing 
multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping 
re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. [WARNING]: Module invocation had junk after the JSON data: [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
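The interpreter-discovery warning above can be avoided by pinning the interpreter for the host instead of relying on discovery. A minimal sketch, assuming host variables for managed_node1 are kept in a host_vars file next to the inventory (the file path below is illustrative, not taken from this run):

# host_vars/managed_node1.yml (hypothetical path)
# Pin the Python interpreter ansible-core should use on this host;
# /usr/bin/python3.12 matches the interpreter reported in the warning above.
ansible_python_interpreter: /usr/bin/python3.12

Setting interpreter_python under [defaults] in ansible.cfg is the global equivalent; with a concrete path configured, runtime interpreter discovery (and the warning) is skipped.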
15500 1727096201.59236: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096200.0560358-15525-228131402556833/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096201.59282: _low_level_execute_command(): starting 15500 1727096201.59292: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096200.0560358-15525-228131402556833/ > /dev/null 2>&1 && sleep 0' 15500 1727096201.59987: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096201.59991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096201.60061: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096201.60101: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096201.60121: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096201.60157: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096201.60425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096201.62759: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096201.62763: stdout chunk (state=3): >>><<< 15500 1727096201.62766: stderr chunk (state=3): >>><<< 15500 1727096201.62771: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096201.62774: handler run complete 15500 1727096201.62807: variable 'ansible_facts' from source: unknown 15500 1727096201.62953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096201.63352: variable 'ansible_facts' from source: unknown 15500 1727096201.63473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096201.63680: attempt loop complete, returning result 15500 1727096201.63694: _execute() done 15500 1727096201.63702: dumping result to json 15500 1727096201.63779: done dumping result, returning 15500 1727096201.63793: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-877d-2da0-00000000007e] 15500 1727096201.63803: sending task result for task 0afff68d-5257-877d-2da0-00000000007e 15500 1727096201.65363: done sending task result for task 0afff68d-5257-877d-2da0-00000000007e 15500 1727096201.65372: WORKER PROCESS EXITING ok: [managed_node1] 15500 1727096201.65889: no more pending results, returning what we have 15500 1727096201.65892: results queue empty 15500 1727096201.65893: checking for any_errors_fatal 15500 1727096201.65895: done checking for any_errors_fatal 15500 1727096201.65896: checking for max_fail_percentage 15500 1727096201.65897: done checking for max_fail_percentage 15500 1727096201.65898: checking to see if all hosts have failed and the running result is not ok 15500 1727096201.65899: done checking to see if all hosts have failed 15500 1727096201.65900: getting the remaining hosts for this loop 15500 1727096201.65902: done getting the remaining hosts for this loop 15500 1727096201.65906: getting the next task for host managed_node1 15500 1727096201.65912: done getting next task for host managed_node1 15500 1727096201.65913: ^ task is: TASK: meta (flush_handlers) 15500 1727096201.65916: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096201.65919: getting variables 15500 1727096201.65945: in VariableManager get_vars() 15500 1727096201.65980: Calling all_inventory to load vars for managed_node1 15500 1727096201.65989: Calling groups_inventory to load vars for managed_node1 15500 1727096201.65994: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096201.66078: Calling all_plugins_play to load vars for managed_node1 15500 1727096201.66082: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096201.66086: Calling groups_plugins_play to load vars for managed_node1 15500 1727096201.66435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096201.66940: done with get_vars() 15500 1727096201.66953: done getting variables 15500 1727096201.67079: in VariableManager get_vars() 15500 1727096201.67090: Calling all_inventory to load vars for managed_node1 15500 1727096201.67092: Calling groups_inventory to load vars for managed_node1 15500 1727096201.67095: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096201.67100: Calling all_plugins_play to load vars for managed_node1 15500 1727096201.67102: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096201.67105: Calling groups_plugins_play to load vars for managed_node1 15500 1727096201.67524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096201.67947: done with get_vars() 15500 1727096201.67969: done queuing things up, now waiting for results queue to drain 15500 1727096201.67972: results queue empty 15500 1727096201.67973: checking for any_errors_fatal 15500 1727096201.67976: done checking for any_errors_fatal 15500 1727096201.67976: checking for max_fail_percentage 15500 1727096201.67977: done checking for max_fail_percentage 15500 1727096201.67983: checking to see if all hosts have failed and the running result is not ok 15500 1727096201.68023: done checking to see if all hosts have failed 15500 1727096201.68024: getting the remaining hosts for this loop 15500 1727096201.68025: done getting the remaining hosts for this loop 15500 1727096201.68029: getting the next task for host managed_node1 15500 1727096201.68034: done getting next task for host managed_node1 15500 1727096201.68037: ^ task is: TASK: Include the task 'el_repo_setup.yml' 15500 1727096201.68039: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
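Each get_vars() pass above consults the same chain of sources for managed_node1: all_inventory, groups_inventory, all_plugins_inventory, all_plugins_play, groups_plugins_inventory, groups_plugins_play. The snippet below is only a rough reduction of that idea, a layered merge where later sources win on conflicts; the flat-dict model and helper name are assumptions, not VariableManager's real precedence rules.

    # Simplified layered variable merge in the order the log lists the sources.
    from functools import reduce

    def merge_vars(layers: list) -> dict:
        # later layers override earlier ones on key conflicts
        return reduce(lambda acc, layer: {**acc, **layer}, layers, {})

    layers_in_log_order = [
        {"origin": "all_inventory"},
        {"origin": "groups_inventory"},
        {"origin": "all_plugins_inventory"},
        {"origin": "all_plugins_play"},
        {"origin": "groups_plugins_inventory"},
        {"origin": "groups_plugins_play"},
    ]
    print(merge_vars(layers_in_log_order))  # {'origin': 'groups_plugins_play'}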
False 15500 1727096201.68041: getting variables 15500 1727096201.68042: in VariableManager get_vars() 15500 1727096201.68052: Calling all_inventory to load vars for managed_node1 15500 1727096201.68054: Calling groups_inventory to load vars for managed_node1 15500 1727096201.68058: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096201.68102: Calling all_plugins_play to load vars for managed_node1 15500 1727096201.68106: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096201.68110: Calling groups_plugins_play to load vars for managed_node1 15500 1727096201.68298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096201.68551: done with get_vars() 15500 1727096201.68562: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:11 Monday 23 September 2024 08:56:41 -0400 (0:00:01.702) 0:00:01.729 ****** 15500 1727096201.68664: entering _queue_task() for managed_node1/include_tasks 15500 1727096201.68666: Creating lock for include_tasks 15500 1727096201.69037: worker is 1 (out of 1 available) 15500 1727096201.69050: exiting _queue_task() for managed_node1/include_tasks 15500 1727096201.69066: done queuing things up, now waiting for results queue to drain 15500 1727096201.69070: waiting for pending results... 15500 1727096201.69487: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 15500 1727096201.69493: in run() - task 0afff68d-5257-877d-2da0-000000000006 15500 1727096201.69496: variable 'ansible_search_path' from source: unknown 15500 1727096201.69498: calling self._execute() 15500 1727096201.69631: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096201.69638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096201.69641: variable 'omit' from source: magic vars 15500 1727096201.69701: _execute() done 15500 1727096201.69710: dumping result to json 15500 1727096201.69738: done dumping result, returning 15500 1727096201.69742: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [0afff68d-5257-877d-2da0-000000000006] 15500 1727096201.69745: sending task result for task 0afff68d-5257-877d-2da0-000000000006 15500 1727096201.70012: done sending task result for task 0afff68d-5257-877d-2da0-000000000006 15500 1727096201.70016: WORKER PROCESS EXITING 15500 1727096201.70073: no more pending results, returning what we have 15500 1727096201.70086: in VariableManager get_vars() 15500 1727096201.70298: Calling all_inventory to load vars for managed_node1 15500 1727096201.70303: Calling groups_inventory to load vars for managed_node1 15500 1727096201.70308: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096201.70323: Calling all_plugins_play to load vars for managed_node1 15500 1727096201.70327: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096201.70330: Calling groups_plugins_play to load vars for managed_node1 15500 1727096201.70824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096201.71026: done with get_vars() 15500 1727096201.71036: variable 'ansible_search_path' from source: unknown 15500 1727096201.71053: we have included files to process 15500 1727096201.71054: generating 
all_blocks data 15500 1727096201.71055: done generating all_blocks data 15500 1727096201.71059: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15500 1727096201.71060: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15500 1727096201.71063: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15500 1727096201.72148: in VariableManager get_vars() 15500 1727096201.72171: done with get_vars() 15500 1727096201.72186: done processing included file 15500 1727096201.72188: iterating over new_blocks loaded from include file 15500 1727096201.72190: in VariableManager get_vars() 15500 1727096201.72199: done with get_vars() 15500 1727096201.72201: filtering new block on tags 15500 1727096201.72223: done filtering new block on tags 15500 1727096201.72226: in VariableManager get_vars() 15500 1727096201.72237: done with get_vars() 15500 1727096201.72239: filtering new block on tags 15500 1727096201.72262: done filtering new block on tags 15500 1727096201.72265: in VariableManager get_vars() 15500 1727096201.72278: done with get_vars() 15500 1727096201.72279: filtering new block on tags 15500 1727096201.72291: done filtering new block on tags 15500 1727096201.72293: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 15500 1727096201.72299: extending task lists for all hosts with included blocks 15500 1727096201.72361: done extending task lists 15500 1727096201.72363: done processing included files 15500 1727096201.72363: results queue empty 15500 1727096201.72364: checking for any_errors_fatal 15500 1727096201.72366: done checking for any_errors_fatal 15500 1727096201.72366: checking for max_fail_percentage 15500 1727096201.72369: done checking for max_fail_percentage 15500 1727096201.72370: checking to see if all hosts have failed and the running result is not ok 15500 1727096201.72370: done checking to see if all hosts have failed 15500 1727096201.72371: getting the remaining hosts for this loop 15500 1727096201.72372: done getting the remaining hosts for this loop 15500 1727096201.72375: getting the next task for host managed_node1 15500 1727096201.72380: done getting next task for host managed_node1 15500 1727096201.72382: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 15500 1727096201.72385: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
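The include above pulled in el_repo_setup.yml, turned its contents into new blocks, and filtered each block against the run's tag selection before extending managed_node1's task list ("filtering new block on tags ... done filtering new block on tags"). The following is an illustrative-only reduction of such a tag filter; the dict-shaped tasks and the helper name are assumptions rather than Ansible's Block/Task objects.

    # Keep tasks whose tags survive --tags / --skip-tags; untagged tasks count
    # as "all", and tasks tagged "always" run even when --tags is restrictive.
    from typing import Iterable

    def filter_tasks_by_tags(tasks: Iterable, only_tags: set, skip_tags: set) -> list:
        kept = []
        for task in tasks:
            tags = set(task.get("tags", [])) | {"all"}
            if skip_tags & tags:
                continue
            if only_tags and not (only_tags & tags) and "always" not in tags:
                continue
            kept.append(task)
        return kept

    tasks = [{"name": "Gather the minimum subset of ansible_facts required by the network role test"}]
    # No --tags/--skip-tags in this run, so everything passes the filter:
    print(filter_tasks_by_tags(tasks, only_tags=set(), skip_tags=set()))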
False 15500 1727096201.72387: getting variables 15500 1727096201.72388: in VariableManager get_vars() 15500 1727096201.72396: Calling all_inventory to load vars for managed_node1 15500 1727096201.72398: Calling groups_inventory to load vars for managed_node1 15500 1727096201.72400: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096201.72406: Calling all_plugins_play to load vars for managed_node1 15500 1727096201.72408: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096201.72411: Calling groups_plugins_play to load vars for managed_node1 15500 1727096201.72590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096201.72803: done with get_vars() 15500 1727096201.72813: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Monday 23 September 2024 08:56:41 -0400 (0:00:00.042) 0:00:01.772 ****** 15500 1727096201.72895: entering _queue_task() for managed_node1/setup 15500 1727096201.73336: worker is 1 (out of 1 available) 15500 1727096201.73347: exiting _queue_task() for managed_node1/setup 15500 1727096201.73359: done queuing things up, now waiting for results queue to drain 15500 1727096201.73360: waiting for pending results... 15500 1727096201.73598: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 15500 1727096201.73673: in run() - task 0afff68d-5257-877d-2da0-00000000008f 15500 1727096201.73677: variable 'ansible_search_path' from source: unknown 15500 1727096201.73679: variable 'ansible_search_path' from source: unknown 15500 1727096201.73708: calling self._execute() 15500 1727096201.73791: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096201.73811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096201.73825: variable 'omit' from source: magic vars 15500 1727096201.74455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096201.76725: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096201.76813: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096201.76864: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096201.76923: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096201.76963: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096201.77063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096201.77273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096201.77276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 15500 1727096201.77279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096201.77282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096201.77426: variable 'ansible_facts' from source: unknown 15500 1727096201.77513: variable 'network_test_required_facts' from source: task vars 15500 1727096201.77559: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 15500 1727096201.77576: variable 'omit' from source: magic vars 15500 1727096201.77629: variable 'omit' from source: magic vars 15500 1727096201.77674: variable 'omit' from source: magic vars 15500 1727096201.77705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096201.77749: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096201.77779: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096201.77801: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096201.77816: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096201.77864: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096201.77945: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096201.77948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096201.77988: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096201.77997: Set connection var ansible_pipelining to False 15500 1727096201.78005: Set connection var ansible_timeout to 10 15500 1727096201.78010: Set connection var ansible_shell_type to sh 15500 1727096201.78018: Set connection var ansible_shell_executable to /bin/sh 15500 1727096201.78025: Set connection var ansible_connection to ssh 15500 1727096201.78059: variable 'ansible_shell_executable' from source: unknown 15500 1727096201.78066: variable 'ansible_connection' from source: unknown 15500 1727096201.78075: variable 'ansible_module_compression' from source: unknown 15500 1727096201.78080: variable 'ansible_shell_type' from source: unknown 15500 1727096201.78085: variable 'ansible_shell_executable' from source: unknown 15500 1727096201.78091: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096201.78097: variable 'ansible_pipelining' from source: unknown 15500 1727096201.78102: variable 'ansible_timeout' from source: unknown 15500 1727096201.78108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096201.78258: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15500 1727096201.78372: variable 'omit' from source: magic vars 15500 1727096201.78377: starting attempt loop 15500 
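The "Evaluated conditional (...): True" line above is the guard on the minimal fact-gathering task: the setup module only runs when the facts collected so far do not already cover everything in network_test_required_facts. Below is a rough Python rendering of that Jinja expression; the example fact names are assumptions, and at this point in the run ansible_facts is effectively empty, so the guard evaluates to True.

    # Approximate equivalent of:
    #   not ansible_facts.keys() | list | intersect(network_test_required_facts)
    #       == network_test_required_facts
    def need_minimal_gathering(ansible_facts: dict, required: list) -> bool:
        already_present = [name for name in ansible_facts if name in required]
        # True whenever at least one required fact is still missing
        return not (already_present == required)

    print(need_minimal_gathering({}, ["distribution", "distribution_major_version"]))  # True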
1727096201.78379: running the handler 15500 1727096201.78381: _low_level_execute_command(): starting 15500 1727096201.78383: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096201.79098: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096201.79118: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096201.79134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096201.79275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096201.79298: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096201.79340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096201.79418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15500 1727096201.81800: stdout chunk (state=3): >>>/root <<< 15500 1727096201.81983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096201.81998: stdout chunk (state=3): >>><<< 15500 1727096201.82016: stderr chunk (state=3): >>><<< 15500 1727096201.82049: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15500 1727096201.82081: _low_level_execute_command(): starting 15500 1727096201.82092: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096201.8206844-15580-176973298512767 `" && echo 
ansible-tmp-1727096201.8206844-15580-176973298512767="` echo /root/.ansible/tmp/ansible-tmp-1727096201.8206844-15580-176973298512767 `" ) && sleep 0' 15500 1727096201.82774: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096201.82792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096201.82807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096201.82923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096201.82927: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096201.82968: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096201.83087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15500 1727096201.85859: stdout chunk (state=3): >>>ansible-tmp-1727096201.8206844-15580-176973298512767=/root/.ansible/tmp/ansible-tmp-1727096201.8206844-15580-176973298512767 <<< 15500 1727096201.86053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096201.86147: stderr chunk (state=3): >>><<< 15500 1727096201.86175: stdout chunk (state=3): >>><<< 15500 1727096201.86280: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096201.8206844-15580-176973298512767=/root/.ansible/tmp/ansible-tmp-1727096201.8206844-15580-176973298512767 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15500 1727096201.86283: variable 'ansible_module_compression' from source: unknown 15500 1727096201.86325: ANSIBALLZ: using 
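Before anything is transferred, the action plugin resolved the remote home directory with "echo ~" and then created a uniquely named working directory under ~/.ansible/tmp with a restrictive umask, echoing the expanded path back so the controller can read it from stdout (the rc=0 result above). The helper below reconstructs how such a command string can be assembled; only the shell pattern and the ansible-tmp-<time>-<pid>-<random> naming scheme are taken from the log, the function itself is illustrative.

    import random
    import time

    def build_remote_tmpdir_command(remote_tmp: str = "~/.ansible/tmp",
                                    local_pid: int = 15580) -> str:
        # ansible-tmp-<epoch with fraction>-<local pid>-<random suffix>, as seen above
        basename = "ansible-tmp-%s-%s-%s" % (time.time(), local_pid,
                                             random.randint(0, 2 ** 48))
        tmpdir = "%s/%s" % (remote_tmp, basename)
        return (
            '( umask 77 && mkdir -p "` echo %s `" '
            '&& mkdir "` echo %s `" '
            '&& echo %s="` echo %s `" ) && sleep 0'
            % (remote_tmp, tmpdir, basename, tmpdir)
        )

    print(build_remote_tmpdir_command())
    # The trailing echo NAME="` echo PATH `" is what produces the
    # ansible-tmp-...=/root/.ansible/tmp/ansible-tmp-... line seen in stdout.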
cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15500 1727096201.86396: variable 'ansible_facts' from source: unknown 15500 1727096201.86736: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096201.8206844-15580-176973298512767/AnsiballZ_setup.py 15500 1727096201.87009: Sending initial data 15500 1727096201.87013: Sent initial data (154 bytes) 15500 1727096201.87503: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096201.87507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096201.87510: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15500 1727096201.87512: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096201.87514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096201.87581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096201.87783: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096201.87860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15500 1727096201.90136: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096201.90233: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15500 1727096201.90338: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmptdfmsroc /root/.ansible/tmp/ansible-tmp-1727096201.8206844-15580-176973298512767/AnsiballZ_setup.py <<< 15500 1727096201.90341: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096201.8206844-15580-176973298512767/AnsiballZ_setup.py" <<< 15500 1727096201.90455: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmptdfmsroc" to remote "/root/.ansible/tmp/ansible-tmp-1727096201.8206844-15580-176973298512767/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096201.8206844-15580-176973298512767/AnsiballZ_setup.py" <<< 15500 1727096201.92506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096201.92602: stderr chunk (state=3): >>><<< 15500 1727096201.92637: stdout chunk (state=3): >>><<< 15500 1727096201.92672: done transferring module to remote 15500 1727096201.92694: _low_level_execute_command(): starting 15500 1727096201.92705: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096201.8206844-15580-176973298512767/ /root/.ansible/tmp/ansible-tmp-1727096201.8206844-15580-176973298512767/AnsiballZ_setup.py && sleep 0' 15500 1727096201.93414: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096201.93531: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096201.93554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096201.93661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15500 1727096201.95653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096201.96004: stdout chunk (state=3): >>><<< 15500 1727096201.96009: stderr chunk (state=3): >>><<< 15500 1727096201.96013: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
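The module itself travels over the same multiplexed connection: the controller opens an SFTP session ("sftp> put ... AnsiballZ_setup.py"), uploads the wrapper, and then marks the temp directory and script executable with "chmod u+x" before running it. The sketch below covers those two steps under stated assumptions (batch-mode sftp plus a plain ssh call reusing the same control socket; not Ansible's internal transfer code).

    import subprocess

    CONTROL = ["-o", "ControlPath=~/.ansible/cp/%C"]   # same mux socket as before (assumed)

    def sftp_put(host: str, local: str, remote: str) -> None:
        # "-b -" makes sftp read batch commands from stdin
        subprocess.run(["sftp", *CONTROL, "-b", "-", host],
                       input="put %s %s\n" % (local, remote),
                       text=True, check=True)

    def make_executable(host: str, remote_dir: str, remote_file: str) -> None:
        subprocess.run(["ssh", *CONTROL, host,
                        "chmod u+x %s %s && sleep 0" % (remote_dir, remote_file)],
                       check=True)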
10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15500 1727096201.96015: _low_level_execute_command(): starting 15500 1727096201.96017: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096201.8206844-15580-176973298512767/AnsiballZ_setup.py && sleep 0' 15500 1727096201.97394: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096201.97414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096201.97519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15500 1727096201.99965: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 15500 1727096201.99995: stdout chunk (state=3): >>>import '_io' # <<< 15500 1727096202.00001: stdout chunk (state=3): >>>import 'marshal' # <<< 15500 1727096202.00024: stdout chunk (state=3): >>>import 'posix' # <<< 15500 1727096202.00089: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 15500 1727096202.00092: stdout chunk (state=3): >>>import 'time' # <<< 15500 1727096202.00160: stdout chunk (state=3): >>>import 'zipimport' # <<< 15500 1727096202.00171: stdout chunk (state=3): >>># installed zipimport hook <<< 15500 1727096202.00174: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096202.00177: stdout chunk (state=3): >>>import '_codecs' # <<< 15500 1727096202.00263: stdout chunk (state=3): >>>import 'codecs' # <<< 15500 1727096202.00375: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/aliases.py <<< 15500 1727096202.00381: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419d104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419cdfb30> <<< 15500 1727096202.00388: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419d12a50> <<< 15500 1727096202.00549: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 15500 1727096202.00594: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # <<< 15500 1727096202.00599: stdout chunk (state=3): >>>import 'os' # <<< 15500 1727096202.00602: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 15500 1727096202.00604: stdout chunk (state=3): >>>Processing user site-packages <<< 15500 1727096202.00606: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 15500 1727096202.00608: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 15500 1727096202.00723: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 15500 1727096202.00727: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 15500 1727096202.00729: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b05130> <<< 15500 1727096202.00732: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 15500 1727096202.00734: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b05fa0> <<< 15500 1727096202.00825: stdout chunk (state=3): >>>import 'site' # <<< 15500 1727096202.00913: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
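The flood of "# code object from ..." and "import ..." lines in these stdout chunks is not module output: it is the interpreter's own import tracing, switched on by the PYTHONVERBOSE=1 prefix on the AnsiballZ_setup.py invocation above. The same tracing can be reproduced locally; -v is the command-line equivalent of PYTHONVERBOSE=1 (CPython normally emits these messages on stderr, although in the capture above they show up in the stdout chunks).

    # Reproduce the verbose import trace seen in this run's output.
    import subprocess
    import sys

    trace = subprocess.run([sys.executable, "-v", "-c", "import json"],
                           capture_output=True, text=True)
    for line in trace.stderr.splitlines()[:10]:
        print(line)   # lines like "import _frozen_importlib # frozen"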
<<< 15500 1727096202.01361: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096202.01370: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 15500 1727096202.01386: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 15500 1727096202.01434: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b43ec0> <<< 15500 1727096202.01475: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 15500 1727096202.01481: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 15500 1727096202.01537: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b43f80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 15500 1727096202.01630: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 15500 1727096202.01701: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # <<< 15500 1727096202.01706: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b7b830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 15500 1727096202.01737: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b7bec0> import '_collections' # <<< 15500 1727096202.01882: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b5bb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b592b0> <<< 15500 1727096202.01974: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b41070> <<< 15500 1727096202.02003: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 15500 1727096202.02033: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 15500 1727096202.02091: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 15500 1727096202.02124: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 15500 1727096202.02280: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b9b7d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b9a3f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b5a150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b98bc0> <<< 15500 1727096202.02284: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 15500 1727096202.02333: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419bd0890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b402f0> <<< 15500 1727096202.02360: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419bd0d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419bd0bf0> <<< 15500 1727096202.02442: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419bd0fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b3ee10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 15500 1727096202.02446: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096202.02510: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 15500 1727096202.02791: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419bd1670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419bd1370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419bd2540> import 'importlib.util' # import 'runpy' # <<< 15500 1727096202.02832: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419be8740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419be9e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 15500 1727096202.02836: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419beacc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096202.02838: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419beb2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419bea210> <<< 15500 1727096202.02901: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 15500 1727096202.02936: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so'<<< 15500 1727096202.03026: stdout chunk (state=3): >>> # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419bebd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419beb4a0><<< 15500 1727096202.03059: stdout chunk (state=3): >>> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419bd24b0> <<< 15500 1727096202.03104: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 15500 1727096202.03153: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 15500 1727096202.03310: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 
'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f34198e7c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 15500 1727096202.03364: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096202.03370: stdout chunk (state=3): >>>import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f34199107a0> <<< 15500 1727096202.03388: stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419910500><<< 15500 1727096202.03394: stdout chunk (state=3): >>> <<< 15500 1727096202.03464: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so'<<< 15500 1727096202.03469: stdout chunk (state=3): >>> import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f34199107d0><<< 15500 1727096202.03519: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py<<< 15500 1727096202.03529: stdout chunk (state=3): >>> <<< 15500 1727096202.03532: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc'<<< 15500 1727096202.03691: stdout chunk (state=3): >>> <<< 15500 1727096202.03695: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096202.03903: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419911100> <<< 15500 1727096202.04062: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so'<<< 15500 1727096202.04105: stdout chunk (state=3): >>> import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419911af0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34199109b0><<< 15500 1727096202.04134: stdout chunk (state=3): >>> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34198e5df0> <<< 15500 1727096202.04190: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc'<<< 15500 1727096202.04248: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 15500 1727096202.04269: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3419912f00> <<< 15500 1727096202.04351: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419911c40> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419bd2c60> <<< 15500 1727096202.04675: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341993b230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 15500 1727096202.04789: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341995f5f0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34199c0380> <<< 15500 1727096202.04797: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 15500 1727096202.04817: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 15500 1727096202.04832: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 15500 1727096202.04878: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15500 1727096202.05286: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34199c2ae0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34199c04a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419981370> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34197c1430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341995e3f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419913e00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 15500 1727096202.05311: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f341995e750> 
<<< 15500 1727096202.05639: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_4aerw5pr/ansible_setup_payload.zip' <<< 15500 1727096202.05642: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.05758: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.05793: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 15500 1727096202.05839: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 15500 1727096202.06053: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341982b170> import '_typing' # <<< 15500 1727096202.06130: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341980a060> <<< 15500 1727096202.06147: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34198091c0> # zipimport: zlib available <<< 15500 1727096202.06168: stdout chunk (state=3): >>>import 'ansible' # <<< 15500 1727096202.06224: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 15500 1727096202.06236: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.07701: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.08776: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 15500 1727096202.08782: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419829040> <<< 15500 1727096202.08793: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 15500 1727096202.08805: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096202.09070: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f341985ab10> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341985a8a0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341985a1b0> # 
/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341985aba0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341982be00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f341985b860> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096202.09114: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f341985b9e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 15500 1727096202.09225: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341985bef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 15500 1727096202.09240: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 15500 1727096202.09284: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419129c10> <<< 15500 1727096202.09302: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096202.09331: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f341912b830> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 15500 1727096202.09343: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 15500 1727096202.09615: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341912c230> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341912d100> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341912fe60> <<< 15500 1727096202.09619: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 15500 
1727096202.09622: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419912e70> <<< 15500 1727096202.09642: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341912e120> <<< 15500 1727096202.09660: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 15500 1727096202.09675: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 15500 1727096202.09772: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 15500 1727096202.09950: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419137d10> import '_tokenize' # <<< 15500 1727096202.09958: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34191367e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419136540> <<< 15500 1727096202.09977: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 15500 1727096202.10201: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419136ab0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341912e630> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f341917bf50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341917c4d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 15500 1727096202.10379: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f341917db50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341917d910> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419180110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341917e240> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 15500 1727096202.10395: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096202.10429: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 15500 1727096202.10432: stdout chunk (state=3): >>>import '_string' # <<< 15500 1727096202.10478: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34191838c0> <<< 15500 1727096202.10597: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419180290> <<< 15500 1727096202.10652: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096202.10669: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419184980> <<< 15500 1727096202.10687: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096202.10752: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419184920> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419184bf0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341917c230> <<< 15500 1727096202.10774: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 15500 1727096202.10888: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches 
/usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f34190102f0> <<< 15500 1727096202.11001: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096202.11036: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419011880> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419186a80> <<< 15500 1727096202.11064: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419187e30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34191866c0> <<< 15500 1727096202.11072: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.11175: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 15500 1727096202.11188: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.11358: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 15500 1727096202.11573: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15500 1727096202.12106: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.12646: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 15500 1727096202.12687: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 15500 1727096202.12826: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419015a90> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 15500 1727096202.12840: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34190167e0> import 'ctypes' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f34199119d0> <<< 15500 1727096202.12906: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 15500 1727096202.12909: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.12924: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.12944: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 15500 1727096202.13186: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.13275: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419016810> # zipimport: zlib available <<< 15500 1727096202.13792: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.14282: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15500 1727096202.14303: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 15500 1727096202.14485: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available <<< 15500 1727096202.14584: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 15500 1727096202.14589: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.14592: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.14594: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 15500 1727096202.14596: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.14720: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 15500 1727096202.14908: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.15144: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 15500 1727096202.15280: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 15500 1727096202.15283: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419017b00> # zipimport: zlib available <<< 15500 1727096202.15365: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.15426: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 15500 1727096202.15441: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 15500 1727096202.15574: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.15597: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 15500 1727096202.15602: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.15640: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.15701: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.15785: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 15500 1727096202.15809: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096202.15924: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419022420> <<< 15500 1727096202.15927: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341901dd30> <<< 15500 1727096202.15953: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 15500 1727096202.15957: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 15500 1727096202.16039: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.16126: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15500 1727096202.16174: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 15500 1727096202.16185: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 15500 1727096202.16225: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 15500 1727096202.16876: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341910ac60> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419886930> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34190224b0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419185b80> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 15500 1727096202.16880: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.16883: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.16914: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.16946: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.16986: stdout 
chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 15500 1727096202.17041: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.17110: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.17190: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15500 1727096202.17197: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 15500 1727096202.17379: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.17561: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.17596: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.17706: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096202.17709: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 15500 1727096202.17740: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 15500 1727096202.17763: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34190b2990> <<< 15500 1727096202.17783: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 15500 1727096202.17818: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 15500 1727096202.17846: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 15500 1727096202.18001: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 15500 1727096202.18031: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3418d203e0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3418d207d0> <<< 15500 1727096202.18062: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341909c500> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34190b3530> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34190b1070> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34190b0cb0> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 15500 1727096202.18156: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 15500 1727096202.18270: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 15500 1727096202.18274: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3418d23680> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3418d22f30> <<< 15500 1727096202.18277: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3418d23110> <<< 15500 1727096202.18279: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3418d22360> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 15500 1727096202.18400: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 15500 1727096202.18495: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3418d23770> <<< 15500 1727096202.18499: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3418d6e270> <<< 15500 1727096202.18577: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3418d6c290> <<< 15500 1727096202.18584: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34190b0da0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 15500 1727096202.18684: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 15500 1727096202.18713: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # <<< 15500 1727096202.18728: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 15500 1727096202.18848: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.18935: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available <<< 15500 1727096202.19022: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 15500 1727096202.19051: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 15500 1727096202.19316: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 15500 1727096202.19477: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 15500 1727096202.19915: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.20290: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 15500 1727096202.20340: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.20402: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.20421: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.20479: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 15500 1727096202.20521: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.20544: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 15500 1727096202.20602: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.20694: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available <<< 15500 1727096202.20802: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # <<< 15500 1727096202.20805: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.20884: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 15500 1727096202.20892: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.20973: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 15500 1727096202.21021: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3418d6f980> <<< 15500 1727096202.21126: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 15500 1727096202.21132: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 15500 1727096202.21179: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3418d6f020> import 'ansible.module_utils.facts.system.local' # <<< 15500 1727096202.21196: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.21265: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.21326: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 15500 1727096202.21437: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.21506: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 15500 1727096202.21577: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.21682: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 15500 1727096202.21794: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15500 1727096202.21860: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 15500 1727096202.21875: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096202.21926: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3418dae660> <<< 15500 1727096202.22412: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3418d9f260> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 15500 1727096202.22418: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15500 1727096202.22518: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.22658: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 15500 1727096202.22680: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.22702: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.22758: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 15500 1727096202.22835: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.22869: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 15500 1727096202.22959: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3418dc2210> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3418d9f650> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 15500 1727096202.22997: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.23022: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.base' # <<< 15500 1727096202.23081: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.23195: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.23365: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 15500 1727096202.23453: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.23551: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.23598: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.23637: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 15500 1727096202.23670: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 15500 1727096202.23687: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.23705: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.23911: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.24029: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 15500 1727096202.24049: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.24105: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.24220: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 15500 1727096202.24244: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.24276: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.24303: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.24905: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.25370: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 15500 1727096202.25447: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 15500 1727096202.25488: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.25581: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 15500 1727096202.25687: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.25701: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.25796: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 15500 1727096202.25812: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.25941: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.26196: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available <<< 15500 1727096202.26204: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 15500 1727096202.26237: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 15500 1727096202.26331: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.26426: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.26627: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.26823: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 
'ansible.module_utils.facts.network.aix' # <<< 15500 1727096202.26956: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 15500 1727096202.27018: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.27022: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 15500 1727096202.27200: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 15500 1727096202.27241: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.27306: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 15500 1727096202.27423: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 15500 1727096202.27682: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.27939: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 15500 1727096202.28005: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.28083: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 15500 1727096202.28218: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.28222: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # <<< 15500 1727096202.28240: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.28265: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.28299: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 15500 1727096202.28394: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.28537: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 15500 1727096202.28545: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.28712: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 15500 1727096202.28716: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15500 1727096202.28743: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15500 1727096202.28794: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.28876: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 15500 1727096202.28890: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 15500 1727096202.28947: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.28989: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 15500 1727096202.29148: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 15500 1727096202.29185: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.29386: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 15500 1727096202.29429: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.29471: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 15500 1727096202.29501: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.29529: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.29611: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 15500 1727096202.29706: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.29843: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 15500 1727096202.29846: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.29856: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.29935: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 15500 1727096202.29960: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 15500 1727096202.30017: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096202.30965: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 15500 1727096202.31019: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 15500 1727096202.31222: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3418bbfa40> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3418bbc1a0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3418bbc1d0> <<< 15500 1727096202.31508: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fips": false, "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", 
"SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "56", "second": "42", "epoch": "1727096202", "epoch_int": "1727096202", "date": "2024-09-23", "time": "08:56:42", "iso8601_micro": "2024-09-23T12:56:42.304035Z", "iso8601": "2024-09-23T12:56:42Z", "iso8601_basic": "20240923T085642304035", "iso8601_basic_short": "20240923T085642", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilit<<< 15500 1727096202.31540: stdout chunk (state=3): >>>ies_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15500 1727096202.32218: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler <<< 15500 1727096202.32256: stdout chunk (state=3): >>># cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # 
cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader <<< 15500 1727096202.32302: stdout chunk (state=3): >>># cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing 
ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters <<< 15500 1727096202.32425: stdout chunk (state=3): >>># destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool <<< 15500 1727096202.32475: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing 
ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # 
cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy 
ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 15500 1727096202.32897: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 15500 1727096202.32901: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 15500 1727096202.33017: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 15500 1727096202.33198: stdout chunk (state=3): >>># destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 15500 1727096202.33202: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process <<< 15500 1727096202.33205: stdout chunk (state=3): >>># destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors <<< 15500 1727096202.33372: stdout chunk (state=3): >>># destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl <<< 15500 1727096202.33419: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap <<< 15500 1727096202.33459: stdout chunk (state=3): >>># cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # 
cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 <<< 15500 1727096202.33577: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc <<< 15500 1727096202.33782: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 15500 1727096202.33812: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 15500 1727096202.33847: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 15500 1727096202.33998: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize <<< 15500 1727096202.34038: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8<<< 15500 1727096202.34055: stdout chunk (state=3): >>> # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 15500 1727096202.34081: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 15500 1727096202.34162: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _sre # 
destroy _string # destroy re <<< 15500 1727096202.34165: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 15500 1727096202.34169: stdout chunk (state=3): >>># clear sys.audit hooks <<< 15500 1727096202.34629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 15500 1727096202.34633: stdout chunk (state=3): >>><<< 15500 1727096202.34753: stderr chunk (state=3): >>><<< 15500 1727096202.34998: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419d104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419cdfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419d12a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b05130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b05fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
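The rc=0 line above marks the end of the remote setup-module run; what follows is the module's full stdout replayed in one piece. The remote interpreter traces every import and the shutdown cleanup because PYTHONVERBOSE=1 is set in its environment (it appears under ansible_env in the gathered facts further down). A minimal sketch for summarizing such a trace, assuming the run has been saved to a local file (the name ansible_run.log and the whole script are illustrative, not part of Ansible):

    import re
    from collections import Counter

    # Hypothetical path: a saved copy of this ansible-playbook -vvvv run.
    LOG_PATH = "ansible_run.log"

    with open(LOG_PATH, encoding="utf-8", errors="replace") as fh:
        text = fh.read()

    # Patterns taken directly from the trace text in this log:
    #   import 'name' #          -> module imported on the managed node
    #   cleanup[2] removing name -> module dropped from sys.modules at shutdown
    #   destroy name             -> module object torn down at shutdown
    imported = Counter(re.findall(r"import '([\w.]+)' #", text))
    removed = Counter(re.findall(r"cleanup\[2\] removing ([\w.]+)", text))
    destroyed = Counter(re.findall(r"# destroy ([\w.]+)", text))

    print("distinct modules imported :", len(imported))
    print("distinct modules removed  :", len(removed))
    print("distinct modules destroyed:", len(destroyed))

    # Imports that never show up in a 'destroy' entry at shutdown.
    leftover = sorted(set(imported) - set(destroyed))
    print("imported but not destroyed:", ", ".join(leftover[:10]))

The exact counts depend on the run; the point is only that the import entries in the replayed stdout below pair up, module by module, with the cleanup[2]/cleanup[3]/destroy entries streamed above.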
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b43ec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b43f80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b7b830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b7bec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b5bb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b592b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b41070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b9b7d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b9a3f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b5a150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b98bc0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419bd0890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b402f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419bd0d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419bd0bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419bd0fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419b3ee10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419bd1670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419bd1370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419bd2540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419be8740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419be9e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f3419beacc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419beb2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419bea210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419bebd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419beb4a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419bd24b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f34198e7c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f34199107a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419910500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f34199107d0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419911100> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419911af0> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f34199109b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34198e5df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419912f00> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419911c40> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419bd2c60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341993b230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341995f5f0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34199c0380> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34199c2ae0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34199c04a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419981370> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34197c1430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341995e3f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419913e00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f341995e750> # zipimport: found 103 names in '/tmp/ansible_setup_payload_4aerw5pr/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341982b170> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341980a060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34198091c0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419829040> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f341985ab10> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341985a8a0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341985a1b0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341985aba0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341982be00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f341985b860> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f341985b9e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341985bef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419129c10> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f341912b830> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341912c230> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341912d100> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341912fe60> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419912e70> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341912e120> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419137d10> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34191367e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419136540> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419136ab0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341912e630> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f341917bf50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341917c4d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f341917db50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341917d910> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419180110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341917e240> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34191838c0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419180290> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419184980> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419184920> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419184bf0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341917c230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f34190102f0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419011880> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419186a80> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419187e30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34191866c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419015a90> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34190167e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34199119d0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419016810> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419017b00> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3419022420> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341901dd30> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341910ac60> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419886930> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34190224b0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3419185b80> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34190b2990> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3418d203e0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3418d207d0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f341909c500> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34190b3530> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34190b1070> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34190b0cb0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3418d23680> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3418d22f30> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3418d23110> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3418d22360> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3418d23770> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3418d6e270> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3418d6c290> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f34190b0da0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3418d6f980> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3418d6f020> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3418dae660> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3418d9f260> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3418dc2210> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3418d9f650> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3418bbfa40> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3418bbc1a0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3418bbc1d0> {"ansible_facts": {"ansible_fips": false, "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", 
"SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "56", "second": "42", "epoch": "1727096202", "epoch_int": "1727096202", "date": "2024-09-23", "time": "08:56:42", "iso8601_micro": "2024-09-23T12:56:42.304035Z", "iso8601": "2024-09-23T12:56:42Z", "iso8601_basic": "20240923T085642304035", "iso8601_basic_short": "20240923T085642", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # 
cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy 
ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing 
ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy 
ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # 
destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] 
wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
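The module stdout above interleaves the gathered-facts JSON with Python's verbose import and interpreter-shutdown traces, because PYTHONVERBOSE=1 is present in the target's ansible_env; the controller has to pick the single JSON line out of that noise, and it flags the leftover text in the warning that follows. The sketch below is only an illustration of that kind of extraction under those assumptions; it is not Ansible's own parser, and the helper name extract_json_payload is invented for this example.

import json

def extract_json_payload(raw_output):
    """Return the first line of raw_output that parses as a JSON object."""
    # Scan line by line: the facts payload is one long JSON line surrounded by
    # PYTHONVERBOSE import/cleanup chatter, as in the stdout shown above.
    for line in raw_output.splitlines():
        line = line.strip()
        if line.startswith("{") and line.endswith("}"):
            try:
                return json.loads(line)
            except ValueError:
                continue  # looked like JSON but was not; keep scanning
    raise ValueError("no JSON object found in module output")

if __name__ == "__main__":
    sample = (
        "# import 'encodings.idna' # <loader>\n"
        '{"ansible_facts": {"ansible_system": "Linux"}, "invocation": {"module_args": {}}}\n'
        "# clear sys.path_importer_cache\n"
    )
    facts = extract_json_payload(sample)["ansible_facts"]
    print(facts["ansible_system"])  # prints: Linux

With the payload recovered, the invocation block above shows the setup module ran with gather_subset ['min'] and gather_timeout 10, which is why only the baseline facts (distribution, environment, user, python, service manager, and similar) appear in the result.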
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] 
removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] 
removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # 
destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # 
cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 15500 1727096202.36732: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096201.8206844-15580-176973298512767/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096202.36736: _low_level_execute_command(): starting 15500 1727096202.36739: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096201.8206844-15580-176973298512767/ > /dev/null 2>&1 && sleep 0' 15500 1727096202.36741: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096202.37364: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096202.37370: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096202.37373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096202.37375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096202.39574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096202.39579: stderr chunk (state=3): >>><<< 15500 1727096202.39581: stdout chunk (state=3): >>><<< 15500 1727096202.39584: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096202.39586: handler run complete 15500 1727096202.39620: variable 'ansible_facts' from source: unknown 15500 1727096202.39678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096202.39814: variable 'ansible_facts' from source: unknown 15500 1727096202.39862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096202.39928: attempt loop complete, returning result 15500 1727096202.39931: _execute() done 15500 1727096202.39934: dumping result to json 15500 1727096202.39945: done dumping result, returning 15500 1727096202.39954: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [0afff68d-5257-877d-2da0-00000000008f] 15500 1727096202.39957: sending task result for task 0afff68d-5257-877d-2da0-00000000008f 15500 1727096202.40146: done sending task result for task 0afff68d-5257-877d-2da0-00000000008f 15500 1727096202.40150: WORKER PROCESS EXITING ok: [managed_node1] 15500 1727096202.40273: no more pending results, returning what we have 15500 1727096202.40277: results queue empty 15500 1727096202.40278: checking for any_errors_fatal 15500 1727096202.40280: done checking for any_errors_fatal 15500 1727096202.40280: checking for max_fail_percentage 15500 1727096202.40282: done checking for max_fail_percentage 15500 1727096202.40283: checking to see if 
all hosts have failed and the running result is not ok 15500 1727096202.40284: done checking to see if all hosts have failed 15500 1727096202.40284: getting the remaining hosts for this loop 15500 1727096202.40286: done getting the remaining hosts for this loop 15500 1727096202.40290: getting the next task for host managed_node1 15500 1727096202.40303: done getting next task for host managed_node1 15500 1727096202.40306: ^ task is: TASK: Check if system is ostree 15500 1727096202.40309: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096202.40313: getting variables 15500 1727096202.40315: in VariableManager get_vars() 15500 1727096202.40586: Calling all_inventory to load vars for managed_node1 15500 1727096202.40592: Calling groups_inventory to load vars for managed_node1 15500 1727096202.40595: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096202.40621: Calling all_plugins_play to load vars for managed_node1 15500 1727096202.40624: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096202.40628: Calling groups_plugins_play to load vars for managed_node1 15500 1727096202.40938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096202.41343: done with get_vars() 15500 1727096202.41355: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Monday 23 September 2024 08:56:42 -0400 (0:00:00.685) 0:00:02.457 ****** 15500 1727096202.41475: entering _queue_task() for managed_node1/stat 15500 1727096202.41937: worker is 1 (out of 1 available) 15500 1727096202.41951: exiting _queue_task() for managed_node1/stat 15500 1727096202.42021: done queuing things up, now waiting for results queue to drain 15500 1727096202.42023: waiting for pending results... 
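For context, the task queued next runs the stat module for managed_node1 under the conditional 'not __network_is_ostree is defined', which the worker evaluates to True a few records further down. A minimal sketch of what such an ostree check typically looks like follows; it assumes the conventional /run/ostree-booted marker path and a hypothetical register name, since neither is confirmed by this log, and the actual definition at tasks/el_repo_setup.yml:17 may differ in detail.

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted                   # assumed marker file; not confirmed by this log
  register: __ostree_booted                    # hypothetical register name
  when: not __network_is_ostree is defined     # matches the conditional evaluated in the log below
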
15500 1727096202.42265: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 15500 1727096202.42361: in run() - task 0afff68d-5257-877d-2da0-000000000091 15500 1727096202.42391: variable 'ansible_search_path' from source: unknown 15500 1727096202.42415: variable 'ansible_search_path' from source: unknown 15500 1727096202.42467: calling self._execute() 15500 1727096202.42576: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096202.42580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096202.42583: variable 'omit' from source: magic vars 15500 1727096202.43971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096202.44884: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096202.45010: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096202.45175: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096202.45200: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096202.45424: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096202.45499: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096202.45574: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096202.45773: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096202.45877: Evaluated conditional (not __network_is_ostree is defined): True 15500 1727096202.46074: variable 'omit' from source: magic vars 15500 1727096202.46079: variable 'omit' from source: magic vars 15500 1727096202.46160: variable 'omit' from source: magic vars 15500 1727096202.46227: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096202.46257: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096202.46283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096202.46302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096202.46330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096202.46358: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096202.46373: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096202.46383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096202.46507: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096202.46520: Set connection var ansible_pipelining to False 15500 
1727096202.46546: Set connection var ansible_timeout to 10 15500 1727096202.46555: Set connection var ansible_shell_type to sh 15500 1727096202.46569: Set connection var ansible_shell_executable to /bin/sh 15500 1727096202.46647: Set connection var ansible_connection to ssh 15500 1727096202.46650: variable 'ansible_shell_executable' from source: unknown 15500 1727096202.46652: variable 'ansible_connection' from source: unknown 15500 1727096202.46655: variable 'ansible_module_compression' from source: unknown 15500 1727096202.46657: variable 'ansible_shell_type' from source: unknown 15500 1727096202.46659: variable 'ansible_shell_executable' from source: unknown 15500 1727096202.46661: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096202.46663: variable 'ansible_pipelining' from source: unknown 15500 1727096202.46665: variable 'ansible_timeout' from source: unknown 15500 1727096202.46669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096202.46827: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15500 1727096202.46982: variable 'omit' from source: magic vars 15500 1727096202.46992: starting attempt loop 15500 1727096202.47083: running the handler 15500 1727096202.47087: _low_level_execute_command(): starting 15500 1727096202.47090: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096202.49144: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096202.49150: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096202.49283: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096202.49376: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096202.49498: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096202.49600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15500 1727096202.51913: stdout chunk (state=3): >>>/root <<< 15500 1727096202.52277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096202.52281: stdout chunk (state=3): >>><<< 15500 1727096202.52284: stderr chunk (state=3): >>><<< 15500 1727096202.52287: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15500 1727096202.52584: _low_level_execute_command(): starting 15500 1727096202.52588: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096202.522663-15607-2677544041928 `" && echo ansible-tmp-1727096202.522663-15607-2677544041928="` echo /root/.ansible/tmp/ansible-tmp-1727096202.522663-15607-2677544041928 `" ) && sleep 0' 15500 1727096202.53137: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096202.53153: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096202.53172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096202.53192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096202.53210: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096202.53223: stderr chunk (state=3): >>>debug2: match not found <<< 15500 1727096202.53238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096202.53258: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15500 1727096202.53274: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 15500 1727096202.53287: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15500 1727096202.53301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096202.53315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096202.53398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096202.53422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096202.53527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15500 1727096202.56281: stdout chunk (state=3): 
>>>ansible-tmp-1727096202.522663-15607-2677544041928=/root/.ansible/tmp/ansible-tmp-1727096202.522663-15607-2677544041928 <<< 15500 1727096202.56479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096202.56496: stdout chunk (state=3): >>><<< 15500 1727096202.56513: stderr chunk (state=3): >>><<< 15500 1727096202.56536: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096202.522663-15607-2677544041928=/root/.ansible/tmp/ansible-tmp-1727096202.522663-15607-2677544041928 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15500 1727096202.56601: variable 'ansible_module_compression' from source: unknown 15500 1727096202.56671: ANSIBALLZ: Using lock for stat 15500 1727096202.56679: ANSIBALLZ: Acquiring lock 15500 1727096202.56685: ANSIBALLZ: Lock acquired: 140712178849392 15500 1727096202.56692: ANSIBALLZ: Creating module 15500 1727096202.84348: ANSIBALLZ: Writing module into payload 15500 1727096202.84637: ANSIBALLZ: Writing module 15500 1727096202.84640: ANSIBALLZ: Renaming module 15500 1727096202.84642: ANSIBALLZ: Done creating module 15500 1727096202.84644: variable 'ansible_facts' from source: unknown 15500 1727096202.84798: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096202.522663-15607-2677544041928/AnsiballZ_stat.py 15500 1727096202.85120: Sending initial data 15500 1727096202.85130: Sent initial data (150 bytes) 15500 1727096202.86565: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096202.86759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096202.86863: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096202.86900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096202.86973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096202.88678: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096202.88734: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15500 1727096202.88841: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpvlfp8cyj /root/.ansible/tmp/ansible-tmp-1727096202.522663-15607-2677544041928/AnsiballZ_stat.py <<< 15500 1727096202.88857: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096202.522663-15607-2677544041928/AnsiballZ_stat.py" <<< 15500 1727096202.88898: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpvlfp8cyj" to remote "/root/.ansible/tmp/ansible-tmp-1727096202.522663-15607-2677544041928/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096202.522663-15607-2677544041928/AnsiballZ_stat.py" <<< 15500 1727096202.90574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096202.90578: stdout chunk (state=3): >>><<< 15500 1727096202.90581: stderr chunk (state=3): >>><<< 15500 1727096202.90604: done transferring module to remote 15500 1727096202.90626: _low_level_execute_command(): starting 15500 1727096202.90638: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096202.522663-15607-2677544041928/ /root/.ansible/tmp/ansible-tmp-1727096202.522663-15607-2677544041928/AnsiballZ_stat.py && sleep 0' 15500 1727096202.92389: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096202.92489: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096202.92592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096202.92808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096202.94665: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096202.94976: stderr chunk (state=3): >>><<< 15500 1727096202.94981: stdout chunk (state=3): >>><<< 15500 1727096202.94984: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096202.94987: _low_level_execute_command(): starting 15500 1727096202.94989: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096202.522663-15607-2677544041928/AnsiballZ_stat.py && sleep 0' 15500 1727096202.95449: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096202.95559: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096202.95563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096202.95566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096202.95571: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096202.95573: stderr chunk (state=3): >>>debug2: match not found <<< 15500 1727096202.95575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096202.95578: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15500 1727096202.95580: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 15500 1727096202.95582: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15500 1727096202.95584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096202.95586: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 15500 1727096202.95588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096202.95590: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096202.95592: stderr chunk (state=3): >>>debug2: match found <<< 15500 1727096202.95615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096202.95774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096202.95778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096202.95838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096202.98077: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 15500 1727096202.98121: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # <<< 15500 1727096202.98166: stdout chunk (state=3): >>>import 'marshal' # import 'posix' # <<< 15500 1727096202.98203: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 15500 1727096202.98216: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 15500 1727096202.98272: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096202.98319: stdout chunk (state=3): >>>import '_codecs' # <<< 15500 1727096202.98381: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003d684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003d37b30> <<< 15500 1727096202.98446: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003d6aa50> import '_signal' # <<< 15500 1727096202.98507: stdout chunk (state=3): >>>import '_abc' # import 'abc' # import 'io' # <<< 15500 1727096202.98561: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 15500 1727096202.98592: stdout chunk (state=3): >>>import '_collections_abc' # <<< 15500 1727096202.98711: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 15500 1727096202.98717: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 15500 1727096202.98745: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object 
from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 15500 1727096202.98761: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003b1d130> <<< 15500 1727096202.98829: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096202.98833: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003b1dfa0> <<< 15500 1727096202.98889: stdout chunk (state=3): >>>import 'site' # <<< 15500 1727096202.98898: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 15500 1727096202.99151: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 15500 1727096202.99154: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 15500 1727096202.99175: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 15500 1727096202.99223: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 15500 1727096202.99259: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 15500 1727096202.99264: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003b5bec0> <<< 15500 1727096202.99334: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 15500 1727096202.99338: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 15500 1727096202.99340: stdout chunk (state=3): >>>import '_operator' # <<< 15500 1727096202.99441: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003b5bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 15500 1727096202.99444: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 15500 1727096202.99447: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096202.99517: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003b93830> <<< 15500 1727096202.99520: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 15500 1727096202.99577: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003b93ec0> import '_collections' # <<< 15500 1727096202.99627: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003b73b60> <<< 15500 1727096202.99649: stdout chunk (state=3): >>>import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003b712b0> <<< 15500 1727096202.99765: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003b59070> <<< 15500 1727096202.99772: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 15500 1727096202.99775: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 15500 1727096202.99777: stdout chunk (state=3): >>>import '_sre' # <<< 15500 1727096202.99873: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 15500 1727096202.99896: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003bb37d0> <<< 15500 1727096202.99915: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003bb23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003b72150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003bb0bc0> <<< 15500 1727096202.99979: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003be8890><<< 15500 1727096203.00060: stdout chunk (state=3): >>> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003b582f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 15500 1727096203.00071: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5003be8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003be8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' 
<<< 15500 1727096203.00181: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5003be8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003b56e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096203.00192: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003be9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003be9370> import 'importlib.machinery' # <<< 15500 1727096203.00552: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003bea540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 15500 1727096203.00560: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 15500 1727096203.00564: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003c00740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5003c01e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003c02cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5003c032f0> <<< 15500 1727096203.00573: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003c02210> <<< 15500 1727096203.00639: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from 
'/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096203.01020: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5003c03d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003c034a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003bea4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f500398bc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50039b4710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50039b4470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50039b4740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096203.01065: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50039b5070> <<< 15500 1727096203.01274: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096203.01337: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50039b5a60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50039b4920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003989df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50039b6e10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50039b5b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003beac60> <<< 15500 1727096203.01360: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 15500 1727096203.01429: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096203.01433: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 15500 1727096203.01475: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 15500 1727096203.01492: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50039db170> <<< 15500 1727096203.01548: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 15500 1727096203.01571: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096203.01595: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 15500 1727096203.01673: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003a03500> <<< 15500 1727096203.01676: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 15500 1727096203.01707: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 15500 1727096203.01773: stdout chunk (state=3): >>>import 'ntpath' # <<< 15500 1727096203.01972: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003a64260> <<< 15500 1727096203.01977: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 15500 1727096203.01980: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15500 1727096203.01982: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003a669c0> <<< 15500 1727096203.02050: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003a64380> <<< 15500 1727096203.02089: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003a29250> <<< 15500 1727096203.02284: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003329340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003a02330> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50039b7d70> <<< 15500 1727096203.02288: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 15500 1727096203.02290: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f50033295e0> <<< 15500 1727096203.02436: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_t0x5v8k3/ansible_stat_payload.zip' # zipimport: zlib available <<< 15500 1727096203.02562: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.02598: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 15500 1727096203.02672: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 15500 1727096203.02713: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 15500 1727096203.02763: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500337f0b0> import '_typing' # <<< 15500 1727096203.03013: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500335dfa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500335d130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available <<< 15500 1727096203.03023: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 15500 1727096203.03046: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.04472: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.05613: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500337cf80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 15500 1727096203.05977: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50033a6a20> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50033a67b0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50033a60c0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50033a6510> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500337fd40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50033a77d0> <<< 15500 1727096203.05990: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50033a79e0> <<< 15500 1727096203.05993: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 15500 1727096203.05995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 15500 1727096203.06110: stdout chunk (state=3): >>>import '_locale' # <<< 15500 1727096203.06114: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50033a7ef0> import 'pwd' # <<< 15500 1727096203.06116: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 15500 1727096203.06119: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 15500 1727096203.06154: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003211ca0> <<< 15500 1727096203.06244: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50032138c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 15500 1727096203.06261: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50032142c0> <<< 15500 1727096203.06276: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 15500 1727096203.06306: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 15500 1727096203.06462: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003215460> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003217ec0> <<< 15500 1727096203.06493: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096203.06573: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f500337c7d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003216180> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 15500 1727096203.06614: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 15500 1727096203.06617: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 15500 1727096203.06655: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 15500 1727096203.06765: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500321ff20> <<< 15500 1727096203.06773: stdout chunk (state=3): >>>import '_tokenize' # <<< 15500 1727096203.06788: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500321e9f0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500321e750> <<< 15500 1727096203.06803: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 15500 1727096203.06881: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500321ecc0> <<< 15500 1727096203.06914: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003216690> <<< 15500 1727096203.07003: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5003267fb0> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50032682f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 15500 1727096203.07028: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 15500 1727096203.07061: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096203.07080: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5003269d60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003269b20> <<< 15500 1727096203.07097: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 15500 1727096203.07221: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 15500 1727096203.07261: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096203.07336: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f500326c290> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500326a420> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096203.07363: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 15500 1727096203.07420: stdout chunk (state=3): >>>import '_string' # <<< 15500 1727096203.07429: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500326fa10> <<< 15500 1727096203.07550: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500326c3e0> <<< 15500 1727096203.07612: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096203.07659: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5003270830> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5003270890> <<< 15500 1727096203.07690: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5003270b90> <<< 15500 1727096203.07855: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50032684d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 15500 1727096203.07861: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096203.07863: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50032fc290> <<< 15500 1727096203.07982: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096203.08009: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50032fd250> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003272a20> <<< 15500 1727096203.08030: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5003273dd0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003272630> <<< 15500 1727096203.08090: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 15500 1727096203.08273: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.08311: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available <<< 15500 1727096203.08336: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 15500 1727096203.08525: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.08580: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 15500 1727096203.09137: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.09674: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 15500 1727096203.09689: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 15500 1727096203.09716: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 15500 1727096203.09737: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096203.09797: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 15500 1727096203.09864: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5003101520> <<< 15500 1727096203.09895: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 15500 1727096203.09923: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003102240> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50032fd430> <<< 15500 1727096203.09984: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 15500 1727096203.09988: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15500 1727096203.10014: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 15500 1727096203.10034: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.10174: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.10348: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 15500 1727096203.10351: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003102390> <<< 15500 1727096203.10372: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.10809: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.11264: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.11556: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # <<< 15500 1727096203.11561: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.11665: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.11901: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available <<< 15500 1727096203.11929: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 15500 1727096203.11961: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 15500 1727096203.12353: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.12736: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 15500 1727096203.12831: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 15500 1727096203.12862: stdout chunk (state=3): >>>import '_ast' # <<< 15500 1727096203.12970: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003103530> <<< 15500 1727096203.13002: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.13117: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.13226: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 15500 1727096203.13292: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 15500 1727096203.13347: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.13408: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 15500 1727096203.13440: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.13506: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.13577: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.13677: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.13747: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 15500 1727096203.13804: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 15500 1727096203.13900: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f500310e090> <<< 15500 1727096203.13938: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003108e60> <<< 15500 1727096203.13990: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 15500 1727096203.14095: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.14194: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.14381: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15500 1727096203.14495: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50033fa9f0> <<< 15500 1727096203.14518: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50033ee6c0> <<< 15500 1727096203.14618: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500310e1b0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50032fd490> <<< 15500 1727096203.14664: stdout chunk (state=3): >>># destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 15500 1727096203.14696: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.14719: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 15500 1727096203.15003: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available <<< 15500 1727096203.15162: stdout chunk (state=3): >>># zipimport: zlib available <<< 15500 1727096203.15350: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 15500 1727096203.15373: stdout chunk (state=3): >>># destroy __main__ <<< 15500 1727096203.15653: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref <<< 15500 1727096203.15762: stdout chunk (state=3): >>># cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat <<< 15500 1727096203.15781: stdout chunk (state=3): >>># cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # 
cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc <<< 15500 1727096203.15825: stdout chunk (state=3): >>># cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing 
systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings <<< 15500 1727096203.15869: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 15500 1727096203.16260: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy 
struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd <<< 15500 1727096203.16263: stdout chunk (state=3): >>># destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno <<< 15500 1727096203.16286: stdout chunk (state=3): >>># destroy array # destroy datetime # destroy selinux # destroy shutil <<< 15500 1727096203.16495: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback<<< 15500 1727096203.16508: stdout chunk (state=3): >>> # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy 
systemd._reader # destroy systemd._journal # destroy _datetime <<< 15500 1727096203.16660: stdout chunk (state=3): >>># destroy sys.monitoring<<< 15500 1727096203.16666: stdout chunk (state=3): >>> <<< 15500 1727096203.16684: stdout chunk (state=3): >>># destroy _socket <<< 15500 1727096203.16714: stdout chunk (state=3): >>># destroy _collections<<< 15500 1727096203.16720: stdout chunk (state=3): >>> <<< 15500 1727096203.16754: stdout chunk (state=3): >>># destroy platform<<< 15500 1727096203.16762: stdout chunk (state=3): >>> <<< 15500 1727096203.16785: stdout chunk (state=3): >>># destroy _uuid <<< 15500 1727096203.16788: stdout chunk (state=3): >>># destroy stat # destroy genericpath <<< 15500 1727096203.16838: stdout chunk (state=3): >>># destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib <<< 15500 1727096203.16851: stdout chunk (state=3): >>># destroy copyreg<<< 15500 1727096203.16890: stdout chunk (state=3): >>> # destroy contextlib <<< 15500 1727096203.16915: stdout chunk (state=3): >>># destroy _typing <<< 15500 1727096203.16925: stdout chunk (state=3): >>># destroy _tokenize<<< 15500 1727096203.16938: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser<<< 15500 1727096203.16965: stdout chunk (state=3): >>> # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external<<< 15500 1727096203.16988: stdout chunk (state=3): >>> # destroy _imp # destroy _io # destroy marshal <<< 15500 1727096203.17095: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 15500 1727096203.17133: stdout chunk (state=3): >>># destroy codecs<<< 15500 1727096203.17138: stdout chunk (state=3): >>> <<< 15500 1727096203.17170: stdout chunk (state=3): >>># destroy encodings.aliases <<< 15500 1727096203.17176: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig<<< 15500 1727096203.17195: stdout chunk (state=3): >>> # destroy encodings.cp437<<< 15500 1727096203.17206: stdout chunk (state=3): >>> <<< 15500 1727096203.17214: stdout chunk (state=3): >>># destroy _codecs<<< 15500 1727096203.17235: stdout chunk (state=3): >>> # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading<<< 15500 1727096203.17261: stdout chunk (state=3): >>> # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time<<< 15500 1727096203.17289: stdout chunk (state=3): >>> <<< 15500 1727096203.17304: stdout chunk (state=3): >>># destroy _random<<< 15500 1727096203.17344: stdout chunk (state=3): >>> # destroy _weakref # destroy _hashlib <<< 15500 1727096203.17372: stdout chunk (state=3): >>># destroy _operator # destroy _string<<< 15500 1727096203.17401: stdout chunk (state=3): >>> # destroy re # destroy itertools<<< 15500 1727096203.17408: stdout chunk (state=3): >>> <<< 15500 1727096203.17421: stdout chunk (state=3): >>># destroy _abc<<< 15500 1727096203.17444: stdout chunk (state=3): >>> # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread<<< 15500 1727096203.17462: stdout chunk (state=3): >>> # clear sys.audit hooks<<< 15500 1727096203.17492: stdout chunk (state=3): >>> <<< 15500 1727096203.17991: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 15500 1727096203.17994: stdout chunk (state=3): >>><<< 15500 1727096203.17998: stderr chunk (state=3): >>><<< 15500 1727096203.18105: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003d684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003d37b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003d6aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003b1d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003b1dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003b5bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003b5bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003b93830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003b93ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003b73b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003b712b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003b59070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003bb37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003bb23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003b72150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003bb0bc0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003be8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003b582f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5003be8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003be8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5003be8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003b56e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003be9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003be9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003bea540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003c00740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5003c01e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f5003c02cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5003c032f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003c02210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5003c03d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003c034a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003bea4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f500398bc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50039b4710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50039b4470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50039b4740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50039b5070> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50039b5a60> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f50039b4920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003989df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50039b6e10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50039b5b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003beac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50039db170> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003a03500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003a64260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003a669c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003a64380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003a29250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003329340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003a02330> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50039b7d70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f50033295e0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_t0x5v8k3/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500337f0b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500335dfa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500335d130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500337cf80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50033a6a20> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50033a67b0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50033a60c0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50033a6510> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500337fd40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50033a77d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50033a79e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50033a7ef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003211ca0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50032138c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50032142c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003215460> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003217ec0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f500337c7d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003216180> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500321ff20> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500321e9f0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500321e750> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500321ecc0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003216690> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5003267fb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50032682f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5003269d60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003269b20> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f500326c290> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500326a420> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500326fa10> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500326c3e0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5003270830> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5003270890> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5003270b90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50032684d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50032fc290> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f50032fd250> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003272a20> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5003273dd0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003272630> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f5003101520> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003102240> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50032fd430> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003102390> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003103530> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f500310e090> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5003108e60> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50033fa9f0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50033ee6c0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f500310e1b0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f50032fd490> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing 
re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # 
destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # 
destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
[WARNING]: Module invocation had junk after the JSON data: [duplicate of the interpreter shutdown/cleanup output shown above, omitted] 15500 1727096203.19055: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096202.522663-15607-2677544041928/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096203.19059: _low_level_execute_command(): starting 15500 1727096203.19061: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r
/root/.ansible/tmp/ansible-tmp-1727096202.522663-15607-2677544041928/ > /dev/null 2>&1 && sleep 0' 15500 1727096203.19428: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096203.19443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096203.19491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096203.19575: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096203.19609: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096203.19658: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096203.19725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096203.22339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096203.22343: stdout chunk (state=3): >>><<< 15500 1727096203.22346: stderr chunk (state=3): >>><<< 15500 1727096203.22445: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096203.22449: handler run complete 15500 1727096203.22452: attempt loop complete, returning result 15500 1727096203.22454: _execute() done 15500 1727096203.22458: dumping result to json 15500 1727096203.22461: done dumping result, returning 15500 1727096203.22462: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [0afff68d-5257-877d-2da0-000000000091] 15500 1727096203.22464: sending task result for task 0afff68d-5257-877d-2da0-000000000091 
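For reference, the module arguments logged above (path /run/ostree-booted) suggest the "Check if system is ostree" task is roughly the following sketch. The task name and stat path come straight from the log; the register name is inferred from the '__ostree_booted_stat' variable that appears later, and anything else is an assumption rather than the verbatim source.

  # hedged reconstruction of the task that produced the stat result above
  - name: Check if system is ostree
    stat:
      path: /run/ostree-booted          # taken from the logged module invocation
    register: __ostree_booted_stat      # name inferred from the later set_fact task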
15500 1727096203.22525: done sending task result for task 0afff68d-5257-877d-2da0-000000000091 15500 1727096203.22529: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 15500 1727096203.22609: no more pending results, returning what we have 15500 1727096203.22613: results queue empty 15500 1727096203.22614: checking for any_errors_fatal 15500 1727096203.22620: done checking for any_errors_fatal 15500 1727096203.22621: checking for max_fail_percentage 15500 1727096203.22622: done checking for max_fail_percentage 15500 1727096203.22623: checking to see if all hosts have failed and the running result is not ok 15500 1727096203.22624: done checking to see if all hosts have failed 15500 1727096203.22624: getting the remaining hosts for this loop 15500 1727096203.22626: done getting the remaining hosts for this loop 15500 1727096203.22629: getting the next task for host managed_node1 15500 1727096203.22636: done getting next task for host managed_node1 15500 1727096203.22639: ^ task is: TASK: Set flag to indicate system is ostree 15500 1727096203.22642: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096203.22645: getting variables 15500 1727096203.22647: in VariableManager get_vars() 15500 1727096203.22785: Calling all_inventory to load vars for managed_node1 15500 1727096203.22788: Calling groups_inventory to load vars for managed_node1 15500 1727096203.22792: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096203.22802: Calling all_plugins_play to load vars for managed_node1 15500 1727096203.22805: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096203.22808: Calling groups_plugins_play to load vars for managed_node1 15500 1727096203.23072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096203.23226: done with get_vars() 15500 1727096203.23234: done getting variables 15500 1727096203.23308: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Monday 23 September 2024 08:56:43 -0400 (0:00:00.818) 0:00:03.276 ****** 15500 1727096203.23330: entering _queue_task() for managed_node1/set_fact 15500 1727096203.23331: Creating lock for set_fact 15500 1727096203.23554: worker is 1 (out of 1 available) 15500 1727096203.23570: exiting _queue_task() for managed_node1/set_fact 15500 1727096203.23584: done queuing things up, now waiting for results queue to drain 15500 1727096203.23585: waiting for pending results... 
15500 1727096203.23724: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 15500 1727096203.23778: in run() - task 0afff68d-5257-877d-2da0-000000000092 15500 1727096203.23788: variable 'ansible_search_path' from source: unknown 15500 1727096203.23792: variable 'ansible_search_path' from source: unknown 15500 1727096203.23820: calling self._execute() 15500 1727096203.23877: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096203.23881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096203.23890: variable 'omit' from source: magic vars 15500 1727096203.24218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096203.24437: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096203.24470: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096203.24497: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096203.24521: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096203.24589: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096203.24605: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096203.24623: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096203.24641: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096203.24732: Evaluated conditional (not __network_is_ostree is defined): True 15500 1727096203.24735: variable 'omit' from source: magic vars 15500 1727096203.24794: variable 'omit' from source: magic vars 15500 1727096203.24845: variable '__ostree_booted_stat' from source: set_fact 15500 1727096203.24884: variable 'omit' from source: magic vars 15500 1727096203.24904: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096203.24927: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096203.24941: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096203.24953: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096203.24962: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096203.24985: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096203.24988: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096203.24991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096203.25062: Set connection var ansible_module_compression to ZIP_DEFLATED 
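The evaluated conditional '(not __network_is_ostree is defined)' and the '__ostree_booted_stat' source variable recorded above suggest the set_fact task at el_repo_setup.yml:22 looks roughly like this sketch. The exact Jinja2 expression is an assumption; the log only shows the source variable and the resulting fact value (false) in the next record.

  # hedged reconstruction of the set_fact task
  - name: Set flag to indicate system is ostree
    set_fact:
      __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # assumed expression
    when: not __network_is_ostree is defined                          # conditional as evaluated in the log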
15500 1727096203.25066: Set connection var ansible_pipelining to False 15500 1727096203.25071: Set connection var ansible_timeout to 10 15500 1727096203.25074: Set connection var ansible_shell_type to sh 15500 1727096203.25079: Set connection var ansible_shell_executable to /bin/sh 15500 1727096203.25084: Set connection var ansible_connection to ssh 15500 1727096203.25100: variable 'ansible_shell_executable' from source: unknown 15500 1727096203.25103: variable 'ansible_connection' from source: unknown 15500 1727096203.25105: variable 'ansible_module_compression' from source: unknown 15500 1727096203.25108: variable 'ansible_shell_type' from source: unknown 15500 1727096203.25110: variable 'ansible_shell_executable' from source: unknown 15500 1727096203.25114: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096203.25116: variable 'ansible_pipelining' from source: unknown 15500 1727096203.25118: variable 'ansible_timeout' from source: unknown 15500 1727096203.25132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096203.25195: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096203.25203: variable 'omit' from source: magic vars 15500 1727096203.25208: starting attempt loop 15500 1727096203.25211: running the handler 15500 1727096203.25218: handler run complete 15500 1727096203.25227: attempt loop complete, returning result 15500 1727096203.25230: _execute() done 15500 1727096203.25234: dumping result to json 15500 1727096203.25236: done dumping result, returning 15500 1727096203.25239: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [0afff68d-5257-877d-2da0-000000000092] 15500 1727096203.25251: sending task result for task 0afff68d-5257-877d-2da0-000000000092 15500 1727096203.25321: done sending task result for task 0afff68d-5257-877d-2da0-000000000092 15500 1727096203.25324: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 15500 1727096203.25402: no more pending results, returning what we have 15500 1727096203.25404: results queue empty 15500 1727096203.25405: checking for any_errors_fatal 15500 1727096203.25411: done checking for any_errors_fatal 15500 1727096203.25411: checking for max_fail_percentage 15500 1727096203.25413: done checking for max_fail_percentage 15500 1727096203.25413: checking to see if all hosts have failed and the running result is not ok 15500 1727096203.25414: done checking to see if all hosts have failed 15500 1727096203.25415: getting the remaining hosts for this loop 15500 1727096203.25416: done getting the remaining hosts for this loop 15500 1727096203.25419: getting the next task for host managed_node1 15500 1727096203.25427: done getting next task for host managed_node1 15500 1727096203.25429: ^ task is: TASK: Fix CentOS6 Base repo 15500 1727096203.25432: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096203.25436: getting variables 15500 1727096203.25437: in VariableManager get_vars() 15500 1727096203.25464: Calling all_inventory to load vars for managed_node1 15500 1727096203.25468: Calling groups_inventory to load vars for managed_node1 15500 1727096203.25471: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096203.25479: Calling all_plugins_play to load vars for managed_node1 15500 1727096203.25481: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096203.25490: Calling groups_plugins_play to load vars for managed_node1 15500 1727096203.25641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096203.25752: done with get_vars() 15500 1727096203.25759: done getting variables 15500 1727096203.25843: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Monday 23 September 2024 08:56:43 -0400 (0:00:00.025) 0:00:03.301 ****** 15500 1727096203.25866: entering _queue_task() for managed_node1/copy 15500 1727096203.26059: worker is 1 (out of 1 available) 15500 1727096203.26073: exiting _queue_task() for managed_node1/copy 15500 1727096203.26084: done queuing things up, now waiting for results queue to drain 15500 1727096203.26086: waiting for pending results... 
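The header above points at a copy task (el_repo_setup.yml:26). Based on the two distribution checks the executor evaluates for it just below, the task is presumably gated as in this sketch; the copied content and destination are not shown in the log and are placeholders only.

  # hedged sketch of the gating; body fields are placeholders
  - name: Fix CentOS6 Base repo
    copy:
      dest: /etc/yum.repos.d/CentOS-Base.repo   # placeholder, not shown in the log
      content: "..."                            # placeholder, not shown in the log
    when:
      - ansible_distribution == 'CentOS'
      - ansible_distribution_major_version == '6'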
15500 1727096203.26234: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 15500 1727096203.26297: in run() - task 0afff68d-5257-877d-2da0-000000000094 15500 1727096203.26308: variable 'ansible_search_path' from source: unknown 15500 1727096203.26313: variable 'ansible_search_path' from source: unknown 15500 1727096203.26340: calling self._execute() 15500 1727096203.26400: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096203.26403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096203.26412: variable 'omit' from source: magic vars 15500 1727096203.26759: variable 'ansible_distribution' from source: facts 15500 1727096203.26777: Evaluated conditional (ansible_distribution == 'CentOS'): True 15500 1727096203.26852: variable 'ansible_distribution_major_version' from source: facts 15500 1727096203.26856: Evaluated conditional (ansible_distribution_major_version == '6'): False 15500 1727096203.26865: when evaluation is False, skipping this task 15500 1727096203.26870: _execute() done 15500 1727096203.26875: dumping result to json 15500 1727096203.26878: done dumping result, returning 15500 1727096203.26881: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [0afff68d-5257-877d-2da0-000000000094] 15500 1727096203.26883: sending task result for task 0afff68d-5257-877d-2da0-000000000094 15500 1727096203.26966: done sending task result for task 0afff68d-5257-877d-2da0-000000000094 15500 1727096203.26971: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 15500 1727096203.27048: no more pending results, returning what we have 15500 1727096203.27051: results queue empty 15500 1727096203.27052: checking for any_errors_fatal 15500 1727096203.27055: done checking for any_errors_fatal 15500 1727096203.27056: checking for max_fail_percentage 15500 1727096203.27057: done checking for max_fail_percentage 15500 1727096203.27058: checking to see if all hosts have failed and the running result is not ok 15500 1727096203.27059: done checking to see if all hosts have failed 15500 1727096203.27060: getting the remaining hosts for this loop 15500 1727096203.27061: done getting the remaining hosts for this loop 15500 1727096203.27063: getting the next task for host managed_node1 15500 1727096203.27070: done getting next task for host managed_node1 15500 1727096203.27072: ^ task is: TASK: Include the task 'enable_epel.yml' 15500 1727096203.27075: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096203.27077: getting variables 15500 1727096203.27079: in VariableManager get_vars() 15500 1727096203.27101: Calling all_inventory to load vars for managed_node1 15500 1727096203.27104: Calling groups_inventory to load vars for managed_node1 15500 1727096203.27107: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096203.27114: Calling all_plugins_play to load vars for managed_node1 15500 1727096203.27117: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096203.27119: Calling groups_plugins_play to load vars for managed_node1 15500 1727096203.27224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096203.27336: done with get_vars() 15500 1727096203.27342: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Monday 23 September 2024 08:56:43 -0400 (0:00:00.015) 0:00:03.317 ****** 15500 1727096203.27412: entering _queue_task() for managed_node1/include_tasks 15500 1727096203.27594: worker is 1 (out of 1 available) 15500 1727096203.27607: exiting _queue_task() for managed_node1/include_tasks 15500 1727096203.27618: done queuing things up, now waiting for results queue to drain 15500 1727096203.27619: waiting for pending results... 15500 1727096203.27760: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 15500 1727096203.27821: in run() - task 0afff68d-5257-877d-2da0-000000000095 15500 1727096203.27830: variable 'ansible_search_path' from source: unknown 15500 1727096203.27834: variable 'ansible_search_path' from source: unknown 15500 1727096203.27861: calling self._execute() 15500 1727096203.27920: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096203.27924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096203.27933: variable 'omit' from source: magic vars 15500 1727096203.28330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096203.29772: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096203.29823: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096203.29849: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096203.29882: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096203.29903: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096203.29959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096203.29986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096203.30006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 15500 1727096203.30030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096203.30040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096203.30129: variable '__network_is_ostree' from source: set_fact 15500 1727096203.30142: Evaluated conditional (not __network_is_ostree | d(false)): True 15500 1727096203.30147: _execute() done 15500 1727096203.30150: dumping result to json 15500 1727096203.30153: done dumping result, returning 15500 1727096203.30162: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [0afff68d-5257-877d-2da0-000000000095] 15500 1727096203.30166: sending task result for task 0afff68d-5257-877d-2da0-000000000095 15500 1727096203.30253: done sending task result for task 0afff68d-5257-877d-2da0-000000000095 15500 1727096203.30256: WORKER PROCESS EXITING 15500 1727096203.30281: no more pending results, returning what we have 15500 1727096203.30287: in VariableManager get_vars() 15500 1727096203.30320: Calling all_inventory to load vars for managed_node1 15500 1727096203.30323: Calling groups_inventory to load vars for managed_node1 15500 1727096203.30326: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096203.30336: Calling all_plugins_play to load vars for managed_node1 15500 1727096203.30338: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096203.30341: Calling groups_plugins_play to load vars for managed_node1 15500 1727096203.30532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096203.30648: done with get_vars() 15500 1727096203.30654: variable 'ansible_search_path' from source: unknown 15500 1727096203.30655: variable 'ansible_search_path' from source: unknown 15500 1727096203.30684: we have included files to process 15500 1727096203.30685: generating all_blocks data 15500 1727096203.30687: done generating all_blocks data 15500 1727096203.30692: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15500 1727096203.30694: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15500 1727096203.30696: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15500 1727096203.31159: done processing included file 15500 1727096203.31161: iterating over new_blocks loaded from include file 15500 1727096203.31162: in VariableManager get_vars() 15500 1727096203.31172: done with get_vars() 15500 1727096203.31173: filtering new block on tags 15500 1727096203.31187: done filtering new block on tags 15500 1727096203.31189: in VariableManager get_vars() 15500 1727096203.31195: done with get_vars() 15500 1727096203.31196: filtering new block on tags 15500 1727096203.31202: done filtering new block on tags 15500 1727096203.31203: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 15500 1727096203.31207: extending task lists for all hosts with included blocks 
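The include runs because the conditional "not __network_is_ostree | d(false)" evaluates to True, so the blocks from enable_epel.yml are loaded, filtered on tags, and appended to managed_node1's task list. A minimal sketch of an include task with that shape, assuming a relative path and file layout that are not shown in this log:

    # Sketch only: the task name and the conditional are taken from the log
    # entries above; the relative path to enable_epel.yml is an assumption.
    - name: Include the task 'enable_epel.yml'
      include_tasks: enable_epel.yml
      when: not __network_is_ostree | d(false)
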
15500 1727096203.31269: done extending task lists 15500 1727096203.31270: done processing included files 15500 1727096203.31270: results queue empty 15500 1727096203.31271: checking for any_errors_fatal 15500 1727096203.31273: done checking for any_errors_fatal 15500 1727096203.31273: checking for max_fail_percentage 15500 1727096203.31274: done checking for max_fail_percentage 15500 1727096203.31274: checking to see if all hosts have failed and the running result is not ok 15500 1727096203.31275: done checking to see if all hosts have failed 15500 1727096203.31276: getting the remaining hosts for this loop 15500 1727096203.31276: done getting the remaining hosts for this loop 15500 1727096203.31278: getting the next task for host managed_node1 15500 1727096203.31281: done getting next task for host managed_node1 15500 1727096203.31282: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 15500 1727096203.31284: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096203.31285: getting variables 15500 1727096203.31286: in VariableManager get_vars() 15500 1727096203.31291: Calling all_inventory to load vars for managed_node1 15500 1727096203.31292: Calling groups_inventory to load vars for managed_node1 15500 1727096203.31294: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096203.31297: Calling all_plugins_play to load vars for managed_node1 15500 1727096203.31303: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096203.31305: Calling groups_plugins_play to load vars for managed_node1 15500 1727096203.31403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096203.31512: done with get_vars() 15500 1727096203.31518: done getting variables 15500 1727096203.31567: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 15500 1727096203.31700: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Monday 23 September 2024 08:56:43 -0400 (0:00:00.043) 0:00:03.360 ****** 15500 1727096203.31732: entering _queue_task() for managed_node1/command 15500 1727096203.31734: Creating lock for command 15500 1727096203.31964: worker is 1 (out of 1 available) 15500 1727096203.31978: exiting _queue_task() for managed_node1/command 15500 1727096203.31990: done queuing things up, now waiting for results queue to drain 15500 1727096203.31992: waiting for pending results... 
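The templated task name "Create EPEL {{ ansible_distribution_major_version }}" renders as "Create EPEL 10" because the distribution facts report major version 10; the evaluation that follows shows the task is then skipped since 10 is not in ['7', '8']. A sketch of a command task with that shape, where the actual command is not visible in this log and the cmd value below is only a placeholder:

    # Illustrative only: the name and conditionals match the log; the command
    # itself is a placeholder, not the real contents of enable_epel.yml:8.
    - name: Create EPEL {{ ansible_distribution_major_version }}
      command:
        cmd: echo "placeholder for the real EPEL repo setup command"
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']
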
15500 1727096203.32141: running TaskExecutor() for managed_node1/TASK: Create EPEL 10 15500 1727096203.32209: in run() - task 0afff68d-5257-877d-2da0-0000000000af 15500 1727096203.32221: variable 'ansible_search_path' from source: unknown 15500 1727096203.32224: variable 'ansible_search_path' from source: unknown 15500 1727096203.32253: calling self._execute() 15500 1727096203.32310: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096203.32314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096203.32328: variable 'omit' from source: magic vars 15500 1727096203.32594: variable 'ansible_distribution' from source: facts 15500 1727096203.32603: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15500 1727096203.32692: variable 'ansible_distribution_major_version' from source: facts 15500 1727096203.32696: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 15500 1727096203.32699: when evaluation is False, skipping this task 15500 1727096203.32702: _execute() done 15500 1727096203.32705: dumping result to json 15500 1727096203.32707: done dumping result, returning 15500 1727096203.32714: done running TaskExecutor() for managed_node1/TASK: Create EPEL 10 [0afff68d-5257-877d-2da0-0000000000af] 15500 1727096203.32718: sending task result for task 0afff68d-5257-877d-2da0-0000000000af 15500 1727096203.32807: done sending task result for task 0afff68d-5257-877d-2da0-0000000000af 15500 1727096203.32810: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 15500 1727096203.32859: no more pending results, returning what we have 15500 1727096203.32862: results queue empty 15500 1727096203.32863: checking for any_errors_fatal 15500 1727096203.32864: done checking for any_errors_fatal 15500 1727096203.32865: checking for max_fail_percentage 15500 1727096203.32866: done checking for max_fail_percentage 15500 1727096203.32867: checking to see if all hosts have failed and the running result is not ok 15500 1727096203.32869: done checking to see if all hosts have failed 15500 1727096203.32870: getting the remaining hosts for this loop 15500 1727096203.32872: done getting the remaining hosts for this loop 15500 1727096203.32875: getting the next task for host managed_node1 15500 1727096203.32881: done getting next task for host managed_node1 15500 1727096203.32884: ^ task is: TASK: Install yum-utils package 15500 1727096203.32887: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096203.32890: getting variables 15500 1727096203.32891: in VariableManager get_vars() 15500 1727096203.32916: Calling all_inventory to load vars for managed_node1 15500 1727096203.32928: Calling groups_inventory to load vars for managed_node1 15500 1727096203.32931: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096203.32939: Calling all_plugins_play to load vars for managed_node1 15500 1727096203.32942: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096203.32944: Calling groups_plugins_play to load vars for managed_node1 15500 1727096203.33093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096203.33208: done with get_vars() 15500 1727096203.33215: done getting variables 15500 1727096203.33290: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Monday 23 September 2024 08:56:43 -0400 (0:00:00.015) 0:00:03.376 ****** 15500 1727096203.33310: entering _queue_task() for managed_node1/package 15500 1727096203.33311: Creating lock for package 15500 1727096203.33506: worker is 1 (out of 1 available) 15500 1727096203.33518: exiting _queue_task() for managed_node1/package 15500 1727096203.33529: done queuing things up, now waiting for results queue to drain 15500 1727096203.33530: waiting for pending results... 
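The next task uses the package action plugin (loaded above from plugins/action/package.py) and is guarded by the same distribution and major-version conditionals, so it is also skipped on this host. A sketch under those assumptions; only the task name, the module, and the conditionals are confirmed by the log, while the arguments are inferred from the task name:

    # Assumed shape of the task at enable_epel.yml:26; arguments are an inference.
    - name: Install yum-utils package
      package:
        name: yum-utils
        state: present
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']
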
15500 1727096203.33672: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 15500 1727096203.33734: in run() - task 0afff68d-5257-877d-2da0-0000000000b0 15500 1727096203.33743: variable 'ansible_search_path' from source: unknown 15500 1727096203.33746: variable 'ansible_search_path' from source: unknown 15500 1727096203.33777: calling self._execute() 15500 1727096203.33831: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096203.33836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096203.33845: variable 'omit' from source: magic vars 15500 1727096203.34110: variable 'ansible_distribution' from source: facts 15500 1727096203.34119: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15500 1727096203.34207: variable 'ansible_distribution_major_version' from source: facts 15500 1727096203.34211: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 15500 1727096203.34214: when evaluation is False, skipping this task 15500 1727096203.34216: _execute() done 15500 1727096203.34219: dumping result to json 15500 1727096203.34222: done dumping result, returning 15500 1727096203.34229: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [0afff68d-5257-877d-2da0-0000000000b0] 15500 1727096203.34234: sending task result for task 0afff68d-5257-877d-2da0-0000000000b0 15500 1727096203.34318: done sending task result for task 0afff68d-5257-877d-2da0-0000000000b0 15500 1727096203.34321: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 15500 1727096203.34372: no more pending results, returning what we have 15500 1727096203.34375: results queue empty 15500 1727096203.34375: checking for any_errors_fatal 15500 1727096203.34382: done checking for any_errors_fatal 15500 1727096203.34382: checking for max_fail_percentage 15500 1727096203.34384: done checking for max_fail_percentage 15500 1727096203.34384: checking to see if all hosts have failed and the running result is not ok 15500 1727096203.34385: done checking to see if all hosts have failed 15500 1727096203.34386: getting the remaining hosts for this loop 15500 1727096203.34387: done getting the remaining hosts for this loop 15500 1727096203.34390: getting the next task for host managed_node1 15500 1727096203.34395: done getting next task for host managed_node1 15500 1727096203.34397: ^ task is: TASK: Enable EPEL 7 15500 1727096203.34400: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096203.34403: getting variables 15500 1727096203.34404: in VariableManager get_vars() 15500 1727096203.34426: Calling all_inventory to load vars for managed_node1 15500 1727096203.34428: Calling groups_inventory to load vars for managed_node1 15500 1727096203.34440: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096203.34449: Calling all_plugins_play to load vars for managed_node1 15500 1727096203.34451: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096203.34454: Calling groups_plugins_play to load vars for managed_node1 15500 1727096203.34569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096203.34684: done with get_vars() 15500 1727096203.34691: done getting variables 15500 1727096203.34729: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Monday 23 September 2024 08:56:43 -0400 (0:00:00.014) 0:00:03.390 ****** 15500 1727096203.34748: entering _queue_task() for managed_node1/command 15500 1727096203.34932: worker is 1 (out of 1 available) 15500 1727096203.34943: exiting _queue_task() for managed_node1/command 15500 1727096203.34954: done queuing things up, now waiting for results queue to drain 15500 1727096203.34955: waiting for pending results... 15500 1727096203.35102: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 15500 1727096203.35166: in run() - task 0afff68d-5257-877d-2da0-0000000000b1 15500 1727096203.35181: variable 'ansible_search_path' from source: unknown 15500 1727096203.35185: variable 'ansible_search_path' from source: unknown 15500 1727096203.35208: calling self._execute() 15500 1727096203.35263: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096203.35268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096203.35278: variable 'omit' from source: magic vars 15500 1727096203.35589: variable 'ansible_distribution' from source: facts 15500 1727096203.35598: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15500 1727096203.35686: variable 'ansible_distribution_major_version' from source: facts 15500 1727096203.35690: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 15500 1727096203.35693: when evaluation is False, skipping this task 15500 1727096203.35696: _execute() done 15500 1727096203.35698: dumping result to json 15500 1727096203.35701: done dumping result, returning 15500 1727096203.35708: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [0afff68d-5257-877d-2da0-0000000000b1] 15500 1727096203.35712: sending task result for task 0afff68d-5257-877d-2da0-0000000000b1 15500 1727096203.35790: done sending task result for task 0afff68d-5257-877d-2da0-0000000000b1 15500 1727096203.35793: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 15500 1727096203.35863: no more pending results, returning what we 
have 15500 1727096203.35866: results queue empty 15500 1727096203.35869: checking for any_errors_fatal 15500 1727096203.35874: done checking for any_errors_fatal 15500 1727096203.35875: checking for max_fail_percentage 15500 1727096203.35876: done checking for max_fail_percentage 15500 1727096203.35877: checking to see if all hosts have failed and the running result is not ok 15500 1727096203.35878: done checking to see if all hosts have failed 15500 1727096203.35878: getting the remaining hosts for this loop 15500 1727096203.35880: done getting the remaining hosts for this loop 15500 1727096203.35882: getting the next task for host managed_node1 15500 1727096203.35888: done getting next task for host managed_node1 15500 1727096203.35890: ^ task is: TASK: Enable EPEL 8 15500 1727096203.35893: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096203.35896: getting variables 15500 1727096203.35897: in VariableManager get_vars() 15500 1727096203.35922: Calling all_inventory to load vars for managed_node1 15500 1727096203.35924: Calling groups_inventory to load vars for managed_node1 15500 1727096203.35927: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096203.35935: Calling all_plugins_play to load vars for managed_node1 15500 1727096203.35938: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096203.35941: Calling groups_plugins_play to load vars for managed_node1 15500 1727096203.36071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096203.36185: done with get_vars() 15500 1727096203.36191: done getting variables 15500 1727096203.36229: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Monday 23 September 2024 08:56:43 -0400 (0:00:00.014) 0:00:03.405 ****** 15500 1727096203.36249: entering _queue_task() for managed_node1/command 15500 1727096203.36428: worker is 1 (out of 1 available) 15500 1727096203.36440: exiting _queue_task() for managed_node1/command 15500 1727096203.36451: done queuing things up, now waiting for results queue to drain 15500 1727096203.36453: waiting for pending results... 
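The "Enable EPEL 7" and "Enable EPEL 8" command tasks are both skipped here for the same reason: ansible_distribution_major_version is 10, which fails the ['7', '8'] check. A quick, hypothetical debug task (not part of the test files) that surfaces the facts these conditionals read:

    # Hypothetical helper, not from the test suite: prints the facts that
    # drive the EPEL skip decisions seen in this log.
    - name: Show the facts behind the EPEL conditionals
      debug:
        msg: >-
          {{ ansible_distribution }} {{ ansible_distribution_major_version }}
          (EPEL enable tasks only run for major versions 7 and 8)
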
15500 1727096203.36595: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 15500 1727096203.36661: in run() - task 0afff68d-5257-877d-2da0-0000000000b2 15500 1727096203.36675: variable 'ansible_search_path' from source: unknown 15500 1727096203.36679: variable 'ansible_search_path' from source: unknown 15500 1727096203.36706: calling self._execute() 15500 1727096203.36762: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096203.36766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096203.36782: variable 'omit' from source: magic vars 15500 1727096203.37039: variable 'ansible_distribution' from source: facts 15500 1727096203.37051: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15500 1727096203.37136: variable 'ansible_distribution_major_version' from source: facts 15500 1727096203.37140: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 15500 1727096203.37142: when evaluation is False, skipping this task 15500 1727096203.37145: _execute() done 15500 1727096203.37148: dumping result to json 15500 1727096203.37153: done dumping result, returning 15500 1727096203.37164: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [0afff68d-5257-877d-2da0-0000000000b2] 15500 1727096203.37169: sending task result for task 0afff68d-5257-877d-2da0-0000000000b2 15500 1727096203.37244: done sending task result for task 0afff68d-5257-877d-2da0-0000000000b2 15500 1727096203.37246: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 15500 1727096203.37292: no more pending results, returning what we have 15500 1727096203.37296: results queue empty 15500 1727096203.37296: checking for any_errors_fatal 15500 1727096203.37301: done checking for any_errors_fatal 15500 1727096203.37302: checking for max_fail_percentage 15500 1727096203.37304: done checking for max_fail_percentage 15500 1727096203.37304: checking to see if all hosts have failed and the running result is not ok 15500 1727096203.37305: done checking to see if all hosts have failed 15500 1727096203.37306: getting the remaining hosts for this loop 15500 1727096203.37307: done getting the remaining hosts for this loop 15500 1727096203.37310: getting the next task for host managed_node1 15500 1727096203.37318: done getting next task for host managed_node1 15500 1727096203.37320: ^ task is: TASK: Enable EPEL 6 15500 1727096203.37323: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096203.37326: getting variables 15500 1727096203.37327: in VariableManager get_vars() 15500 1727096203.37352: Calling all_inventory to load vars for managed_node1 15500 1727096203.37354: Calling groups_inventory to load vars for managed_node1 15500 1727096203.37357: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096203.37366: Calling all_plugins_play to load vars for managed_node1 15500 1727096203.37370: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096203.37373: Calling groups_plugins_play to load vars for managed_node1 15500 1727096203.37497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096203.37612: done with get_vars() 15500 1727096203.37619: done getting variables 15500 1727096203.37663: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Monday 23 September 2024 08:56:43 -0400 (0:00:00.014) 0:00:03.420 ****** 15500 1727096203.37685: entering _queue_task() for managed_node1/copy 15500 1727096203.37875: worker is 1 (out of 1 available) 15500 1727096203.37888: exiting _queue_task() for managed_node1/copy 15500 1727096203.37900: done queuing things up, now waiting for results queue to drain 15500 1727096203.37902: waiting for pending results... 15500 1727096203.38052: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 15500 1727096203.38130: in run() - task 0afff68d-5257-877d-2da0-0000000000b4 15500 1727096203.38137: variable 'ansible_search_path' from source: unknown 15500 1727096203.38141: variable 'ansible_search_path' from source: unknown 15500 1727096203.38172: calling self._execute() 15500 1727096203.38226: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096203.38230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096203.38241: variable 'omit' from source: magic vars 15500 1727096203.38560: variable 'ansible_distribution' from source: facts 15500 1727096203.38572: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15500 1727096203.38648: variable 'ansible_distribution_major_version' from source: facts 15500 1727096203.38652: Evaluated conditional (ansible_distribution_major_version == '6'): False 15500 1727096203.38655: when evaluation is False, skipping this task 15500 1727096203.38657: _execute() done 15500 1727096203.38663: dumping result to json 15500 1727096203.38665: done dumping result, returning 15500 1727096203.38754: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [0afff68d-5257-877d-2da0-0000000000b4] 15500 1727096203.38757: sending task result for task 0afff68d-5257-877d-2da0-0000000000b4 15500 1727096203.38821: done sending task result for task 0afff68d-5257-877d-2da0-0000000000b4 15500 1727096203.38824: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 15500 1727096203.38858: no more pending results, returning what we have 15500 
1727096203.38861: results queue empty 15500 1727096203.38862: checking for any_errors_fatal 15500 1727096203.38866: done checking for any_errors_fatal 15500 1727096203.38867: checking for max_fail_percentage 15500 1727096203.38870: done checking for max_fail_percentage 15500 1727096203.38871: checking to see if all hosts have failed and the running result is not ok 15500 1727096203.38871: done checking to see if all hosts have failed 15500 1727096203.38872: getting the remaining hosts for this loop 15500 1727096203.38873: done getting the remaining hosts for this loop 15500 1727096203.38876: getting the next task for host managed_node1 15500 1727096203.38882: done getting next task for host managed_node1 15500 1727096203.38884: ^ task is: TASK: Set network provider to 'nm' 15500 1727096203.38887: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096203.38890: getting variables 15500 1727096203.38891: in VariableManager get_vars() 15500 1727096203.38914: Calling all_inventory to load vars for managed_node1 15500 1727096203.38917: Calling groups_inventory to load vars for managed_node1 15500 1727096203.38919: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096203.38926: Calling all_plugins_play to load vars for managed_node1 15500 1727096203.38927: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096203.38929: Calling groups_plugins_play to load vars for managed_node1 15500 1727096203.39063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096203.39173: done with get_vars() 15500 1727096203.39179: done getting variables 15500 1727096203.39218: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:13 Monday 23 September 2024 08:56:43 -0400 (0:00:00.015) 0:00:03.435 ****** 15500 1727096203.39237: entering _queue_task() for managed_node1/set_fact 15500 1727096203.39424: worker is 1 (out of 1 available) 15500 1727096203.39437: exiting _queue_task() for managed_node1/set_fact 15500 1727096203.39449: done queuing things up, now waiting for results queue to drain 15500 1727096203.39451: waiting for pending results... 
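The task "Set network provider to 'nm'" comes from tests_bridge_nm.yml:13 and runs the set_fact action; the ok result a few lines below shows it sets network_provider to "nm". A sketch consistent with that result (the surrounding play structure of tests_bridge_nm.yml is not shown in this log):

    # Consistent with the result below: "ansible_facts": {"network_provider": "nm"}.
    - name: Set network provider to 'nm'
      set_fact:
        network_provider: nm
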
15500 1727096203.39590: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' 15500 1727096203.39640: in run() - task 0afff68d-5257-877d-2da0-000000000007 15500 1727096203.39651: variable 'ansible_search_path' from source: unknown 15500 1727096203.39686: calling self._execute() 15500 1727096203.39736: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096203.39740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096203.39749: variable 'omit' from source: magic vars 15500 1727096203.39830: variable 'omit' from source: magic vars 15500 1727096203.39851: variable 'omit' from source: magic vars 15500 1727096203.39881: variable 'omit' from source: magic vars 15500 1727096203.39915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096203.39942: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096203.39959: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096203.39976: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096203.39985: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096203.40012: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096203.40015: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096203.40017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096203.40091: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096203.40094: Set connection var ansible_pipelining to False 15500 1727096203.40101: Set connection var ansible_timeout to 10 15500 1727096203.40104: Set connection var ansible_shell_type to sh 15500 1727096203.40106: Set connection var ansible_shell_executable to /bin/sh 15500 1727096203.40121: Set connection var ansible_connection to ssh 15500 1727096203.40132: variable 'ansible_shell_executable' from source: unknown 15500 1727096203.40134: variable 'ansible_connection' from source: unknown 15500 1727096203.40137: variable 'ansible_module_compression' from source: unknown 15500 1727096203.40140: variable 'ansible_shell_type' from source: unknown 15500 1727096203.40142: variable 'ansible_shell_executable' from source: unknown 15500 1727096203.40145: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096203.40149: variable 'ansible_pipelining' from source: unknown 15500 1727096203.40152: variable 'ansible_timeout' from source: unknown 15500 1727096203.40156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096203.40265: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096203.40274: variable 'omit' from source: magic vars 15500 1727096203.40280: starting attempt loop 15500 1727096203.40282: running the handler 15500 1727096203.40291: handler run complete 15500 1727096203.40300: attempt loop complete, returning result 15500 1727096203.40302: _execute() done 15500 1727096203.40305: 
dumping result to json 15500 1727096203.40307: done dumping result, returning 15500 1727096203.40314: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [0afff68d-5257-877d-2da0-000000000007] 15500 1727096203.40318: sending task result for task 0afff68d-5257-877d-2da0-000000000007 15500 1727096203.40396: done sending task result for task 0afff68d-5257-877d-2da0-000000000007 15500 1727096203.40399: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 15500 1727096203.40486: no more pending results, returning what we have 15500 1727096203.40489: results queue empty 15500 1727096203.40490: checking for any_errors_fatal 15500 1727096203.40494: done checking for any_errors_fatal 15500 1727096203.40495: checking for max_fail_percentage 15500 1727096203.40496: done checking for max_fail_percentage 15500 1727096203.40497: checking to see if all hosts have failed and the running result is not ok 15500 1727096203.40498: done checking to see if all hosts have failed 15500 1727096203.40498: getting the remaining hosts for this loop 15500 1727096203.40499: done getting the remaining hosts for this loop 15500 1727096203.40502: getting the next task for host managed_node1 15500 1727096203.40507: done getting next task for host managed_node1 15500 1727096203.40509: ^ task is: TASK: meta (flush_handlers) 15500 1727096203.40510: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096203.40514: getting variables 15500 1727096203.40515: in VariableManager get_vars() 15500 1727096203.40539: Calling all_inventory to load vars for managed_node1 15500 1727096203.40541: Calling groups_inventory to load vars for managed_node1 15500 1727096203.40544: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096203.40551: Calling all_plugins_play to load vars for managed_node1 15500 1727096203.40554: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096203.40559: Calling groups_plugins_play to load vars for managed_node1 15500 1727096203.40666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096203.40923: done with get_vars() 15500 1727096203.40929: done getting variables 15500 1727096203.40977: in VariableManager get_vars() 15500 1727096203.40983: Calling all_inventory to load vars for managed_node1 15500 1727096203.40985: Calling groups_inventory to load vars for managed_node1 15500 1727096203.40986: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096203.40989: Calling all_plugins_play to load vars for managed_node1 15500 1727096203.40990: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096203.40992: Calling groups_plugins_play to load vars for managed_node1 15500 1727096203.41073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096203.41181: done with get_vars() 15500 1727096203.41190: done queuing things up, now waiting for results queue to drain 15500 1727096203.41191: results queue empty 15500 1727096203.41191: checking for any_errors_fatal 15500 1727096203.41193: done checking for any_errors_fatal 15500 1727096203.41193: checking for 
max_fail_percentage 15500 1727096203.41194: done checking for max_fail_percentage 15500 1727096203.41194: checking to see if all hosts have failed and the running result is not ok 15500 1727096203.41195: done checking to see if all hosts have failed 15500 1727096203.41195: getting the remaining hosts for this loop 15500 1727096203.41196: done getting the remaining hosts for this loop 15500 1727096203.41197: getting the next task for host managed_node1 15500 1727096203.41200: done getting next task for host managed_node1 15500 1727096203.41201: ^ task is: TASK: meta (flush_handlers) 15500 1727096203.41201: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096203.41207: getting variables 15500 1727096203.41208: in VariableManager get_vars() 15500 1727096203.41214: Calling all_inventory to load vars for managed_node1 15500 1727096203.41216: Calling groups_inventory to load vars for managed_node1 15500 1727096203.41218: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096203.41222: Calling all_plugins_play to load vars for managed_node1 15500 1727096203.41223: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096203.41225: Calling groups_plugins_play to load vars for managed_node1 15500 1727096203.41301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096203.41418: done with get_vars() 15500 1727096203.41423: done getting variables 15500 1727096203.41458: in VariableManager get_vars() 15500 1727096203.41464: Calling all_inventory to load vars for managed_node1 15500 1727096203.41465: Calling groups_inventory to load vars for managed_node1 15500 1727096203.41468: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096203.41472: Calling all_plugins_play to load vars for managed_node1 15500 1727096203.41473: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096203.41475: Calling groups_plugins_play to load vars for managed_node1 15500 1727096203.41549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096203.41659: done with get_vars() 15500 1727096203.41667: done queuing things up, now waiting for results queue to drain 15500 1727096203.41669: results queue empty 15500 1727096203.41670: checking for any_errors_fatal 15500 1727096203.41671: done checking for any_errors_fatal 15500 1727096203.41671: checking for max_fail_percentage 15500 1727096203.41672: done checking for max_fail_percentage 15500 1727096203.41672: checking to see if all hosts have failed and the running result is not ok 15500 1727096203.41673: done checking to see if all hosts have failed 15500 1727096203.41673: getting the remaining hosts for this loop 15500 1727096203.41674: done getting the remaining hosts for this loop 15500 1727096203.41675: getting the next task for host managed_node1 15500 1727096203.41677: done getting next task for host managed_node1 15500 1727096203.41677: ^ task is: None 15500 1727096203.41678: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 15500 1727096203.41679: done queuing things up, now waiting for results queue to drain 15500 1727096203.41679: results queue empty 15500 1727096203.41680: checking for any_errors_fatal 15500 1727096203.41680: done checking for any_errors_fatal 15500 1727096203.41681: checking for max_fail_percentage 15500 1727096203.41681: done checking for max_fail_percentage 15500 1727096203.41682: checking to see if all hosts have failed and the running result is not ok 15500 1727096203.41682: done checking to see if all hosts have failed 15500 1727096203.41683: getting the next task for host managed_node1 15500 1727096203.41685: done getting next task for host managed_node1 15500 1727096203.41685: ^ task is: None 15500 1727096203.41686: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096203.41722: in VariableManager get_vars() 15500 1727096203.41732: done with get_vars() 15500 1727096203.41738: in VariableManager get_vars() 15500 1727096203.41744: done with get_vars() 15500 1727096203.41747: variable 'omit' from source: magic vars 15500 1727096203.41773: in VariableManager get_vars() 15500 1727096203.41780: done with get_vars() 15500 1727096203.41793: variable 'omit' from source: magic vars PLAY [Test configuring bridges] ************************************************ 15500 1727096203.41910: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15500 1727096203.41933: getting the remaining hosts for this loop 15500 1727096203.41934: done getting the remaining hosts for this loop 15500 1727096203.41935: getting the next task for host managed_node1 15500 1727096203.41937: done getting next task for host managed_node1 15500 1727096203.41938: ^ task is: TASK: Gathering Facts 15500 1727096203.41939: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096203.41940: getting variables 15500 1727096203.41941: in VariableManager get_vars() 15500 1727096203.41946: Calling all_inventory to load vars for managed_node1 15500 1727096203.41947: Calling groups_inventory to load vars for managed_node1 15500 1727096203.41949: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096203.41952: Calling all_plugins_play to load vars for managed_node1 15500 1727096203.41963: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096203.41965: Calling groups_plugins_play to load vars for managed_node1 15500 1727096203.42073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096203.42176: done with get_vars() 15500 1727096203.42181: done getting variables 15500 1727096203.42210: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:3 Monday 23 September 2024 08:56:43 -0400 (0:00:00.029) 0:00:03.465 ****** 15500 1727096203.42225: entering _queue_task() for managed_node1/gather_facts 15500 1727096203.42438: worker is 1 (out of 1 available) 15500 1727096203.42450: exiting _queue_task() for managed_node1/gather_facts 15500 1727096203.42462: done queuing things up, now waiting for results queue to drain 15500 1727096203.42464: waiting for pending results... 
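A new play, "Test configuring bridges" (tests_bridge.yml), starts with an implicit fact-gathering task; the gather_facts action traced below builds the AnsiballZ_setup.py payload locally, creates a remote temp directory over SSH, transfers the module with SFTP, marks it executable, and runs it. A hedged sketch of the play header, where only the play name and the "Gathering Facts" task are visible in the log and the hosts pattern is an assumption:

    # Assumed play header for tests_bridge.yml; the hosts pattern is a guess,
    # and the real play's task list is not shown in this log.
    - name: Test configuring bridges
      hosts: all
      gather_facts: true
      tasks: []
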
15500 1727096203.42603: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15500 1727096203.42662: in run() - task 0afff68d-5257-877d-2da0-0000000000da 15500 1727096203.42673: variable 'ansible_search_path' from source: unknown 15500 1727096203.42703: calling self._execute() 15500 1727096203.42761: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096203.42765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096203.42773: variable 'omit' from source: magic vars 15500 1727096203.43042: variable 'ansible_distribution_major_version' from source: facts 15500 1727096203.43052: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096203.43060: variable 'omit' from source: magic vars 15500 1727096203.43077: variable 'omit' from source: magic vars 15500 1727096203.43102: variable 'omit' from source: magic vars 15500 1727096203.43134: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096203.43166: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096203.43185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096203.43199: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096203.43208: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096203.43232: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096203.43235: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096203.43240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096203.43315: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096203.43318: Set connection var ansible_pipelining to False 15500 1727096203.43324: Set connection var ansible_timeout to 10 15500 1727096203.43326: Set connection var ansible_shell_type to sh 15500 1727096203.43331: Set connection var ansible_shell_executable to /bin/sh 15500 1727096203.43336: Set connection var ansible_connection to ssh 15500 1727096203.43360: variable 'ansible_shell_executable' from source: unknown 15500 1727096203.43363: variable 'ansible_connection' from source: unknown 15500 1727096203.43366: variable 'ansible_module_compression' from source: unknown 15500 1727096203.43371: variable 'ansible_shell_type' from source: unknown 15500 1727096203.43374: variable 'ansible_shell_executable' from source: unknown 15500 1727096203.43376: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096203.43378: variable 'ansible_pipelining' from source: unknown 15500 1727096203.43380: variable 'ansible_timeout' from source: unknown 15500 1727096203.43382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096203.43513: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096203.43521: variable 'omit' from source: magic vars 15500 1727096203.43525: starting attempt loop 15500 1727096203.43527: running the 
handler 15500 1727096203.43540: variable 'ansible_facts' from source: unknown 15500 1727096203.43554: _low_level_execute_command(): starting 15500 1727096203.43572: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096203.44084: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096203.44090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096203.44093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096203.44144: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096203.44149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096203.44158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096203.44231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096203.46559: stdout chunk (state=3): >>>/root <<< 15500 1727096203.46709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096203.46744: stderr chunk (state=3): >>><<< 15500 1727096203.46747: stdout chunk (state=3): >>><<< 15500 1727096203.46772: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096203.46784: _low_level_execute_command(): starting 15500 1727096203.46790: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727096203.4677193-15658-190259452341070 `" && echo ansible-tmp-1727096203.4677193-15658-190259452341070="` echo /root/.ansible/tmp/ansible-tmp-1727096203.4677193-15658-190259452341070 `" ) && sleep 0' 15500 1727096203.47262: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096203.47267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 15500 1727096203.47271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15500 1727096203.47280: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096203.47283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096203.47332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096203.47337: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096203.47339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096203.47407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096203.50119: stdout chunk (state=3): >>>ansible-tmp-1727096203.4677193-15658-190259452341070=/root/.ansible/tmp/ansible-tmp-1727096203.4677193-15658-190259452341070 <<< 15500 1727096203.50271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096203.50303: stderr chunk (state=3): >>><<< 15500 1727096203.50306: stdout chunk (state=3): >>><<< 15500 1727096203.50322: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096203.4677193-15658-190259452341070=/root/.ansible/tmp/ansible-tmp-1727096203.4677193-15658-190259452341070 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096203.50349: variable 'ansible_module_compression' from source: unknown 15500 1727096203.50392: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15500 1727096203.50443: variable 'ansible_facts' from source: unknown 15500 1727096203.50573: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096203.4677193-15658-190259452341070/AnsiballZ_setup.py 15500 1727096203.50686: Sending initial data 15500 1727096203.50689: Sent initial data (154 bytes) 15500 1727096203.51150: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096203.51154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096203.51158: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 15500 1727096203.51161: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096203.51164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096203.51208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096203.51218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096203.51301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096203.53690: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15500 1727096203.53722: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096203.53794: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15500 1727096203.53904: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp0jnn8m2h /root/.ansible/tmp/ansible-tmp-1727096203.4677193-15658-190259452341070/AnsiballZ_setup.py <<< 15500 1727096203.53912: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096203.4677193-15658-190259452341070/AnsiballZ_setup.py" <<< 15500 1727096203.53952: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp0jnn8m2h" to remote "/root/.ansible/tmp/ansible-tmp-1727096203.4677193-15658-190259452341070/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096203.4677193-15658-190259452341070/AnsiballZ_setup.py" <<< 15500 1727096203.55963: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096203.55970: stdout chunk (state=3): >>><<< 15500 1727096203.55973: stderr chunk (state=3): >>><<< 15500 1727096203.55975: done transferring module to remote 15500 1727096203.55977: _low_level_execute_command(): starting 15500 1727096203.55979: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096203.4677193-15658-190259452341070/ /root/.ansible/tmp/ansible-tmp-1727096203.4677193-15658-190259452341070/AnsiballZ_setup.py && sleep 0' 15500 1727096203.56449: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096203.56453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096203.56456: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096203.56462: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096203.56513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096203.56519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096203.56522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096203.56592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096203.59215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096203.59219: stdout chunk (state=3): >>><<< 15500 1727096203.59221: stderr chunk (state=3): >>><<< 15500 1727096203.59240: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 
originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096203.59273: _low_level_execute_command(): starting 15500 1727096203.59276: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096203.4677193-15658-190259452341070/AnsiballZ_setup.py && sleep 0' 15500 1727096203.59926: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096203.59981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096203.60052: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096203.60073: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096203.60095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096203.60202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096204.42524: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": 
"ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2944, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 587, "free": 2944}, "nocache": {"free": 3281, "used": 250}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": 
null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 357, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797392384, "block_size": 4096, "block_total": 65519099, "block_available": 63915379, "block_used": 1603720, "inode_total": 131070960, "inode_available": 131029096, "inode_used": 41864, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "56", "second": "44", "epoch": "1727096204", "epoch_int": "1727096204", "date": "2024-09-23", "time": "08:56:44", "iso8601_micro": "2024-09-23T12:56:44.363050Z", "iso8601": "2024-09-23T12:56:44Z", "iso8601_basic": "20240923T085644363050", "iso8601_basic_short": "20240923T085644", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, 
"ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": 
"on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_loadavg": {"1m": 0.4970703125, "5m": 0.3251953125, "15m": 0.1484375}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15500 1727096204.45777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 15500 1727096204.45781: stdout chunk (state=3): >>><<< 15500 1727096204.45783: stderr chunk (state=3): >>><<< 15500 1727096204.45789: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, 
"ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2944, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 587, "free": 2944}, "nocache": {"free": 3281, "used": 250}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 357, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797392384, "block_size": 4096, "block_total": 65519099, "block_available": 63915379, "block_used": 1603720, "inode_total": 131070960, "inode_available": 131029096, "inode_used": 41864, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": 
true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "56", "second": "44", "epoch": "1727096204", "epoch_int": "1727096204", "date": "2024-09-23", "time": "08:56:44", "iso8601_micro": "2024-09-23T12:56:44.363050Z", "iso8601": "2024-09-23T12:56:44Z", "iso8601_basic": "20240923T085644363050", "iso8601_basic_short": "20240923T085644", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", 
"hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_loadavg": {"1m": 0.4970703125, "5m": 0.3251953125, "15m": 0.1484375}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], 
"module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 15500 1727096204.46726: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096203.4677193-15658-190259452341070/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096204.46730: _low_level_execute_command(): starting 15500 1727096204.46733: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096203.4677193-15658-190259452341070/ > /dev/null 2>&1 && sleep 0' 15500 1727096204.48106: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096204.48216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096204.48334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 
1727096204.48355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096204.48450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096204.48538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096204.51243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096204.51495: stdout chunk (state=3): >>><<< 15500 1727096204.51499: stderr chunk (state=3): >>><<< 15500 1727096204.51502: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096204.51504: handler run complete 15500 1727096204.51665: variable 'ansible_facts' from source: unknown 15500 1727096204.51901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096204.52393: variable 'ansible_facts' from source: unknown 15500 1727096204.52486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096204.52674: attempt loop complete, returning result 15500 1727096204.52677: _execute() done 15500 1727096204.52679: dumping result to json 15500 1727096204.52681: done dumping result, returning 15500 1727096204.52687: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-877d-2da0-0000000000da] 15500 1727096204.52695: sending task result for task 0afff68d-5257-877d-2da0-0000000000da ok: [managed_node1] 15500 1727096204.53631: no more pending results, returning what we have 15500 1727096204.53634: results queue empty 15500 1727096204.53635: checking for any_errors_fatal 15500 1727096204.53637: done checking for any_errors_fatal 15500 1727096204.53637: checking for max_fail_percentage 15500 1727096204.53639: done checking for max_fail_percentage 15500 1727096204.53639: checking to see if all hosts have failed and the running result is not ok 15500 1727096204.53640: done checking to see if all hosts have failed 15500 1727096204.53641: getting the remaining hosts for this loop 15500 1727096204.53642: done getting the remaining hosts for this loop 15500 1727096204.53646: getting the next task for host managed_node1 15500 1727096204.53651: done getting next task for host managed_node1 15500 1727096204.53653: ^ task is: TASK: meta (flush_handlers) 15500 1727096204.53655: ^ state is: HOST 
STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096204.53661: getting variables 15500 1727096204.53662: in VariableManager get_vars() 15500 1727096204.53684: Calling all_inventory to load vars for managed_node1 15500 1727096204.53687: Calling groups_inventory to load vars for managed_node1 15500 1727096204.53690: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096204.53697: done sending task result for task 0afff68d-5257-877d-2da0-0000000000da 15500 1727096204.53700: WORKER PROCESS EXITING 15500 1727096204.53710: Calling all_plugins_play to load vars for managed_node1 15500 1727096204.53713: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096204.53716: Calling groups_plugins_play to load vars for managed_node1 15500 1727096204.53881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096204.54070: done with get_vars() 15500 1727096204.54080: done getting variables 15500 1727096204.54145: in VariableManager get_vars() 15500 1727096204.54155: Calling all_inventory to load vars for managed_node1 15500 1727096204.54160: Calling groups_inventory to load vars for managed_node1 15500 1727096204.54162: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096204.54166: Calling all_plugins_play to load vars for managed_node1 15500 1727096204.54172: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096204.54175: Calling groups_plugins_play to load vars for managed_node1 15500 1727096204.54327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096204.54516: done with get_vars() 15500 1727096204.54530: done queuing things up, now waiting for results queue to drain 15500 1727096204.54532: results queue empty 15500 1727096204.54533: checking for any_errors_fatal 15500 1727096204.54536: done checking for any_errors_fatal 15500 1727096204.54537: checking for max_fail_percentage 15500 1727096204.54538: done checking for max_fail_percentage 15500 1727096204.54538: checking to see if all hosts have failed and the running result is not ok 15500 1727096204.54543: done checking to see if all hosts have failed 15500 1727096204.54544: getting the remaining hosts for this loop 15500 1727096204.54544: done getting the remaining hosts for this loop 15500 1727096204.54547: getting the next task for host managed_node1 15500 1727096204.54551: done getting next task for host managed_node1 15500 1727096204.54553: ^ task is: TASK: Set interface={{ interface }} 15500 1727096204.54555: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096204.54559: getting variables 15500 1727096204.54560: in VariableManager get_vars() 15500 1727096204.54571: Calling all_inventory to load vars for managed_node1 15500 1727096204.54573: Calling groups_inventory to load vars for managed_node1 15500 1727096204.54575: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096204.54579: Calling all_plugins_play to load vars for managed_node1 15500 1727096204.54581: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096204.54584: Calling groups_plugins_play to load vars for managed_node1 15500 1727096204.54863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096204.55051: done with get_vars() 15500 1727096204.55063: done getting variables 15500 1727096204.55107: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15500 1727096204.55229: variable 'interface' from source: play vars TASK [Set interface=LSR-TST-br31] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:9 Monday 23 September 2024 08:56:44 -0400 (0:00:01.130) 0:00:04.595 ****** 15500 1727096204.55278: entering _queue_task() for managed_node1/set_fact 15500 1727096204.55794: worker is 1 (out of 1 available) 15500 1727096204.55807: exiting _queue_task() for managed_node1/set_fact 15500 1727096204.55820: done queuing things up, now waiting for results queue to drain 15500 1727096204.55821: waiting for pending results... 
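
[Editor's note] For context on the task header above: "Set interface=LSR-TST-br31" comes from tests_bridge.yml:9. The playbook itself is not reproduced in this log, but the templated task name ("Set interface={{ interface }}"), the entry "variable 'interface' from source: play vars", and the set_fact result recorded a few lines further down suggest a fragment like the sketch below. The surrounding play structure, and whether the "ansible_distribution_major_version != '6'" conditional is attached to this task or inherited from an enclosing block, are assumptions.

    # Hypothetical reconstruction of the play fragment behind the
    # "Set interface=LSR-TST-br31" task. Only the task name, the
    # set_fact action, and the interface value are confirmed by the log.
    - hosts: all
      vars:
        interface: LSR-TST-br31   # logged as: variable 'interface' from source: play vars
      tasks:
        - name: "Set interface={{ interface }}"
          ansible.builtin.set_fact:
            interface: "{{ interface }}"
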
15500 1727096204.56319: running TaskExecutor() for managed_node1/TASK: Set interface=LSR-TST-br31 15500 1727096204.56677: in run() - task 0afff68d-5257-877d-2da0-00000000000b 15500 1727096204.56682: variable 'ansible_search_path' from source: unknown 15500 1727096204.56699: calling self._execute() 15500 1727096204.57075: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096204.57079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096204.57083: variable 'omit' from source: magic vars 15500 1727096204.57636: variable 'ansible_distribution_major_version' from source: facts 15500 1727096204.57743: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096204.57759: variable 'omit' from source: magic vars 15500 1727096204.57798: variable 'omit' from source: magic vars 15500 1727096204.57875: variable 'interface' from source: play vars 15500 1727096204.58082: variable 'interface' from source: play vars 15500 1727096204.58105: variable 'omit' from source: magic vars 15500 1727096204.58154: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096204.58331: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096204.58425: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096204.58577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096204.58580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096204.58583: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096204.58586: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096204.58588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096204.58946: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096204.58950: Set connection var ansible_pipelining to False 15500 1727096204.58953: Set connection var ansible_timeout to 10 15500 1727096204.58955: Set connection var ansible_shell_type to sh 15500 1727096204.59385: Set connection var ansible_shell_executable to /bin/sh 15500 1727096204.59389: Set connection var ansible_connection to ssh 15500 1727096204.59391: variable 'ansible_shell_executable' from source: unknown 15500 1727096204.59394: variable 'ansible_connection' from source: unknown 15500 1727096204.59396: variable 'ansible_module_compression' from source: unknown 15500 1727096204.59398: variable 'ansible_shell_type' from source: unknown 15500 1727096204.59400: variable 'ansible_shell_executable' from source: unknown 15500 1727096204.59402: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096204.59404: variable 'ansible_pipelining' from source: unknown 15500 1727096204.59406: variable 'ansible_timeout' from source: unknown 15500 1727096204.59408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096204.59773: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 
1727096204.60073: variable 'omit' from source: magic vars 15500 1727096204.60077: starting attempt loop 15500 1727096204.60080: running the handler 15500 1727096204.60082: handler run complete 15500 1727096204.60084: attempt loop complete, returning result 15500 1727096204.60087: _execute() done 15500 1727096204.60089: dumping result to json 15500 1727096204.60091: done dumping result, returning 15500 1727096204.60094: done running TaskExecutor() for managed_node1/TASK: Set interface=LSR-TST-br31 [0afff68d-5257-877d-2da0-00000000000b] 15500 1727096204.60096: sending task result for task 0afff68d-5257-877d-2da0-00000000000b 15500 1727096204.60166: done sending task result for task 0afff68d-5257-877d-2da0-00000000000b 15500 1727096204.60171: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "interface": "LSR-TST-br31" }, "changed": false } 15500 1727096204.60223: no more pending results, returning what we have 15500 1727096204.60226: results queue empty 15500 1727096204.60227: checking for any_errors_fatal 15500 1727096204.60229: done checking for any_errors_fatal 15500 1727096204.60230: checking for max_fail_percentage 15500 1727096204.60231: done checking for max_fail_percentage 15500 1727096204.60233: checking to see if all hosts have failed and the running result is not ok 15500 1727096204.60234: done checking to see if all hosts have failed 15500 1727096204.60234: getting the remaining hosts for this loop 15500 1727096204.60236: done getting the remaining hosts for this loop 15500 1727096204.60240: getting the next task for host managed_node1 15500 1727096204.60246: done getting next task for host managed_node1 15500 1727096204.60249: ^ task is: TASK: Include the task 'show_interfaces.yml' 15500 1727096204.60251: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096204.60255: getting variables 15500 1727096204.60259: in VariableManager get_vars() 15500 1727096204.60293: Calling all_inventory to load vars for managed_node1 15500 1727096204.60296: Calling groups_inventory to load vars for managed_node1 15500 1727096204.60299: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096204.60311: Calling all_plugins_play to load vars for managed_node1 15500 1727096204.60314: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096204.60317: Calling groups_plugins_play to load vars for managed_node1 15500 1727096204.61109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096204.61608: done with get_vars() 15500 1727096204.61619: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:12 Monday 23 September 2024 08:56:44 -0400 (0:00:00.064) 0:00:04.660 ****** 15500 1727096204.61715: entering _queue_task() for managed_node1/include_tasks 15500 1727096204.62601: worker is 1 (out of 1 available) 15500 1727096204.62610: exiting _queue_task() for managed_node1/include_tasks 15500 1727096204.62620: done queuing things up, now waiting for results queue to drain 15500 1727096204.62621: waiting for pending results... 
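
[Editor's note] The next task header, "Include the task 'show_interfaces.yml'" at tests_bridge.yml:12, together with the subsequent "Loading data from .../playbooks/tasks/show_interfaces.yml" entries, points to a plain include_tasks call. A minimal sketch of what that line of the playbook plausibly contains is shown below; the exact wording is an assumption, and the "Evaluated conditional (ansible_distribution_major_version != '6')" entry seen for this task may come from an enclosing block rather than the task itself.

    # Plausible form of tests_bridge.yml:12 as suggested by the log.
    - name: Include the task 'show_interfaces.yml'
      ansible.builtin.include_tasks: tasks/show_interfaces.yml
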
15500 1727096204.63038: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 15500 1727096204.63044: in run() - task 0afff68d-5257-877d-2da0-00000000000c 15500 1727096204.63047: variable 'ansible_search_path' from source: unknown 15500 1727096204.63050: calling self._execute() 15500 1727096204.63151: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096204.63172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096204.63194: variable 'omit' from source: magic vars 15500 1727096204.63794: variable 'ansible_distribution_major_version' from source: facts 15500 1727096204.63814: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096204.63825: _execute() done 15500 1727096204.63833: dumping result to json 15500 1727096204.63842: done dumping result, returning 15500 1727096204.63853: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [0afff68d-5257-877d-2da0-00000000000c] 15500 1727096204.63870: sending task result for task 0afff68d-5257-877d-2da0-00000000000c 15500 1727096204.64176: done sending task result for task 0afff68d-5257-877d-2da0-00000000000c 15500 1727096204.64180: WORKER PROCESS EXITING 15500 1727096204.64208: no more pending results, returning what we have 15500 1727096204.64214: in VariableManager get_vars() 15500 1727096204.64251: Calling all_inventory to load vars for managed_node1 15500 1727096204.64254: Calling groups_inventory to load vars for managed_node1 15500 1727096204.64261: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096204.64275: Calling all_plugins_play to load vars for managed_node1 15500 1727096204.64279: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096204.64282: Calling groups_plugins_play to load vars for managed_node1 15500 1727096204.64478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096204.64763: done with get_vars() 15500 1727096204.64774: variable 'ansible_search_path' from source: unknown 15500 1727096204.64788: we have included files to process 15500 1727096204.64789: generating all_blocks data 15500 1727096204.64791: done generating all_blocks data 15500 1727096204.64792: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15500 1727096204.64793: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15500 1727096204.64796: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15500 1727096204.64945: in VariableManager get_vars() 15500 1727096204.64963: done with get_vars() 15500 1727096204.65225: done processing included file 15500 1727096204.65227: iterating over new_blocks loaded from include file 15500 1727096204.65229: in VariableManager get_vars() 15500 1727096204.65246: done with get_vars() 15500 1727096204.65248: filtering new block on tags 15500 1727096204.65272: done filtering new block on tags 15500 1727096204.65274: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 15500 1727096204.65279: extending task lists for all hosts with included blocks 15500 
1727096204.65336: done extending task lists 15500 1727096204.65338: done processing included files 15500 1727096204.65339: results queue empty 15500 1727096204.65339: checking for any_errors_fatal 15500 1727096204.65343: done checking for any_errors_fatal 15500 1727096204.65344: checking for max_fail_percentage 15500 1727096204.65345: done checking for max_fail_percentage 15500 1727096204.65346: checking to see if all hosts have failed and the running result is not ok 15500 1727096204.65347: done checking to see if all hosts have failed 15500 1727096204.65348: getting the remaining hosts for this loop 15500 1727096204.65349: done getting the remaining hosts for this loop 15500 1727096204.65351: getting the next task for host managed_node1 15500 1727096204.65355: done getting next task for host managed_node1 15500 1727096204.65360: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 15500 1727096204.65363: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096204.65365: getting variables 15500 1727096204.65366: in VariableManager get_vars() 15500 1727096204.65593: Calling all_inventory to load vars for managed_node1 15500 1727096204.65596: Calling groups_inventory to load vars for managed_node1 15500 1727096204.65598: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096204.65604: Calling all_plugins_play to load vars for managed_node1 15500 1727096204.65606: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096204.65609: Calling groups_plugins_play to load vars for managed_node1 15500 1727096204.65913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096204.66307: done with get_vars() 15500 1727096204.66317: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Monday 23 September 2024 08:56:44 -0400 (0:00:00.048) 0:00:04.708 ****** 15500 1727096204.66549: entering _queue_task() for managed_node1/include_tasks 15500 1727096204.67035: worker is 1 (out of 1 available) 15500 1727096204.67078: exiting _queue_task() for managed_node1/include_tasks 15500 1727096204.67092: done queuing things up, now waiting for results queue to drain 15500 1727096204.67094: waiting for pending results... 
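Editor's note: the two include steps logged so far form a small chain: tests_bridge.yml:12 pulls in show_interfaces.yml, whose line 3 in turn pulls in get_current_interfaces.yml. A sketch of that chain, reconstructed only from the "task path:" entries above; the relative include paths are assumptions:

    # tests_bridge.yml:12 (relative path is hypothetical)
    - name: Include the task 'show_interfaces.yml'
      ansible.builtin.include_tasks: tasks/show_interfaces.yml

    # show_interfaces.yml:3 (relative path is hypothetical)
    - name: Include the task 'get_current_interfaces.yml'
      ansible.builtin.include_tasks: get_current_interfaces.yml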
15500 1727096204.67325: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 15500 1727096204.67473: in run() - task 0afff68d-5257-877d-2da0-0000000000ee 15500 1727096204.67478: variable 'ansible_search_path' from source: unknown 15500 1727096204.67481: variable 'ansible_search_path' from source: unknown 15500 1727096204.67515: calling self._execute() 15500 1727096204.67617: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096204.67621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096204.67629: variable 'omit' from source: magic vars 15500 1727096204.68053: variable 'ansible_distribution_major_version' from source: facts 15500 1727096204.68059: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096204.68062: _execute() done 15500 1727096204.68065: dumping result to json 15500 1727096204.68069: done dumping result, returning 15500 1727096204.68078: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [0afff68d-5257-877d-2da0-0000000000ee] 15500 1727096204.68088: sending task result for task 0afff68d-5257-877d-2da0-0000000000ee 15500 1727096204.68373: done sending task result for task 0afff68d-5257-877d-2da0-0000000000ee 15500 1727096204.68378: WORKER PROCESS EXITING 15500 1727096204.68410: no more pending results, returning what we have 15500 1727096204.68416: in VariableManager get_vars() 15500 1727096204.68454: Calling all_inventory to load vars for managed_node1 15500 1727096204.68459: Calling groups_inventory to load vars for managed_node1 15500 1727096204.68463: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096204.68480: Calling all_plugins_play to load vars for managed_node1 15500 1727096204.68483: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096204.68486: Calling groups_plugins_play to load vars for managed_node1 15500 1727096204.68802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096204.69012: done with get_vars() 15500 1727096204.69021: variable 'ansible_search_path' from source: unknown 15500 1727096204.69022: variable 'ansible_search_path' from source: unknown 15500 1727096204.69070: we have included files to process 15500 1727096204.69071: generating all_blocks data 15500 1727096204.69072: done generating all_blocks data 15500 1727096204.69072: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15500 1727096204.69073: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15500 1727096204.69075: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15500 1727096204.69304: done processing included file 15500 1727096204.69306: iterating over new_blocks loaded from include file 15500 1727096204.69307: in VariableManager get_vars() 15500 1727096204.69340: done with get_vars() 15500 1727096204.69342: filtering new block on tags 15500 1727096204.69356: done filtering new block on tags 15500 1727096204.69360: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed_node1 15500 1727096204.69363: extending task lists for all hosts with included blocks 15500 1727096204.69424: done extending task lists 15500 1727096204.69425: done processing included files 15500 1727096204.69425: results queue empty 15500 1727096204.69426: checking for any_errors_fatal 15500 1727096204.69428: done checking for any_errors_fatal 15500 1727096204.69428: checking for max_fail_percentage 15500 1727096204.69429: done checking for max_fail_percentage 15500 1727096204.69429: checking to see if all hosts have failed and the running result is not ok 15500 1727096204.69430: done checking to see if all hosts have failed 15500 1727096204.69431: getting the remaining hosts for this loop 15500 1727096204.69433: done getting the remaining hosts for this loop 15500 1727096204.69435: getting the next task for host managed_node1 15500 1727096204.69438: done getting next task for host managed_node1 15500 1727096204.69439: ^ task is: TASK: Gather current interface info 15500 1727096204.69441: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096204.69443: getting variables 15500 1727096204.69443: in VariableManager get_vars() 15500 1727096204.69449: Calling all_inventory to load vars for managed_node1 15500 1727096204.69451: Calling groups_inventory to load vars for managed_node1 15500 1727096204.69452: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096204.69458: Calling all_plugins_play to load vars for managed_node1 15500 1727096204.69460: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096204.69463: Calling groups_plugins_play to load vars for managed_node1 15500 1727096204.69546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096204.69654: done with get_vars() 15500 1727096204.69663: done getting variables 15500 1727096204.69693: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Monday 23 September 2024 08:56:44 -0400 (0:00:00.031) 0:00:04.740 ****** 15500 1727096204.69713: entering _queue_task() for managed_node1/command 15500 1727096204.69951: worker is 1 (out of 1 available) 15500 1727096204.69964: exiting _queue_task() for managed_node1/command 15500 1727096204.69979: done queuing things up, now waiting for results queue to drain 15500 1727096204.69980: waiting for pending results... 
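Editor's note: the "Gather current interface info" task queued here runs the command action plugin. Based on the module arguments echoed in its result further below (chdir /sys/class/net, "ls -1") and the _current_interfaces variable consumed by the later "Set current_interfaces" step, a plausible reconstruction of get_current_interfaces.yml:3 is shown below; whether the output is captured with register or by some other means is an assumption:

    - name: Gather current interface info
      ansible.builtin.command:
        cmd: ls -1
        chdir: /sys/class/net
      register: _current_interfaces   # variable name taken from the later 'Set current_interfaces' step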
15500 1727096204.70128: running TaskExecutor() for managed_node1/TASK: Gather current interface info 15500 1727096204.70188: in run() - task 0afff68d-5257-877d-2da0-0000000000fd 15500 1727096204.70201: variable 'ansible_search_path' from source: unknown 15500 1727096204.70205: variable 'ansible_search_path' from source: unknown 15500 1727096204.70231: calling self._execute() 15500 1727096204.70293: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096204.70296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096204.70306: variable 'omit' from source: magic vars 15500 1727096204.70577: variable 'ansible_distribution_major_version' from source: facts 15500 1727096204.70584: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096204.70590: variable 'omit' from source: magic vars 15500 1727096204.70618: variable 'omit' from source: magic vars 15500 1727096204.70644: variable 'omit' from source: magic vars 15500 1727096204.70679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096204.70708: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096204.70723: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096204.70751: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096204.70756: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096204.70786: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096204.70791: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096204.70794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096204.71095: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096204.71098: Set connection var ansible_pipelining to False 15500 1727096204.71101: Set connection var ansible_timeout to 10 15500 1727096204.71103: Set connection var ansible_shell_type to sh 15500 1727096204.71105: Set connection var ansible_shell_executable to /bin/sh 15500 1727096204.71107: Set connection var ansible_connection to ssh 15500 1727096204.71109: variable 'ansible_shell_executable' from source: unknown 15500 1727096204.71111: variable 'ansible_connection' from source: unknown 15500 1727096204.71113: variable 'ansible_module_compression' from source: unknown 15500 1727096204.71115: variable 'ansible_shell_type' from source: unknown 15500 1727096204.71117: variable 'ansible_shell_executable' from source: unknown 15500 1727096204.71119: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096204.71121: variable 'ansible_pipelining' from source: unknown 15500 1727096204.71123: variable 'ansible_timeout' from source: unknown 15500 1727096204.71125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096204.71225: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096204.71242: variable 'omit' from source: magic vars 15500 
1727096204.71251: starting attempt loop 15500 1727096204.71260: running the handler 15500 1727096204.71282: _low_level_execute_command(): starting 15500 1727096204.71301: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096204.72085: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096204.72107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096204.72146: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096204.72176: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096204.72245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096204.74684: stdout chunk (state=3): >>>/root <<< 15500 1727096204.74890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096204.74895: stdout chunk (state=3): >>><<< 15500 1727096204.74898: stderr chunk (state=3): >>><<< 15500 1727096204.74920: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096204.74996: _low_level_execute_command(): starting 15500 1727096204.75002: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096204.749282-15709-151376287011572 `" && echo ansible-tmp-1727096204.749282-15709-151376287011572="` echo 
/root/.ansible/tmp/ansible-tmp-1727096204.749282-15709-151376287011572 `" ) && sleep 0' 15500 1727096204.75674: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096204.75680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096204.75683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096204.75686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096204.75688: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096204.75690: stderr chunk (state=3): >>>debug2: match not found <<< 15500 1727096204.75773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096204.75789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096204.75889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096204.78578: stdout chunk (state=3): >>>ansible-tmp-1727096204.749282-15709-151376287011572=/root/.ansible/tmp/ansible-tmp-1727096204.749282-15709-151376287011572 <<< 15500 1727096204.78860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096204.78864: stdout chunk (state=3): >>><<< 15500 1727096204.79280: stderr chunk (state=3): >>><<< 15500 1727096204.79284: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096204.749282-15709-151376287011572=/root/.ansible/tmp/ansible-tmp-1727096204.749282-15709-151376287011572 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096204.79287: variable 'ansible_module_compression' from source: unknown 
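Editor's note: the two _low_level_execute_command() calls above (echo ~, then creation of a per-task temp directory under ~/.ansible/tmp) are the normal non-pipelined execution path. For reference, the connection settings this run reports, collected into a hypothetical host_vars file; the values mirror the "Set connection var ..." entries and the module invocation later in this task, but the file itself is not part of this run:

    # host_vars/managed_node1.yml (hypothetical layout; values from the log)
    ansible_connection: ssh
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_timeout: 10
    ansible_pipelining: false          # disabled, hence the remote tmp dir plus file transfer
    ansible_module_compression: ZIP_DEFLATED
    ansible_remote_tmp: ~/.ansible/tmp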
15500 1727096204.79289: ANSIBALLZ: Using generic lock for ansible.legacy.command 15500 1727096204.79291: ANSIBALLZ: Acquiring lock 15500 1727096204.79293: ANSIBALLZ: Lock acquired: 140712178847904 15500 1727096204.79295: ANSIBALLZ: Creating module 15500 1727096204.91506: ANSIBALLZ: Writing module into payload 15500 1727096204.91573: ANSIBALLZ: Writing module 15500 1727096204.91593: ANSIBALLZ: Renaming module 15500 1727096204.91596: ANSIBALLZ: Done creating module 15500 1727096204.91628: variable 'ansible_facts' from source: unknown 15500 1727096204.91682: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096204.749282-15709-151376287011572/AnsiballZ_command.py 15500 1727096204.91822: Sending initial data 15500 1727096204.91825: Sent initial data (155 bytes) 15500 1727096204.92583: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096204.92587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096204.92593: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096204.92595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096204.92633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096204.92717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15500 1727096204.94567: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096204.94625: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15500 1727096204.94699: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpzhkvj3ws /root/.ansible/tmp/ansible-tmp-1727096204.749282-15709-151376287011572/AnsiballZ_command.py <<< 15500 1727096204.94703: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096204.749282-15709-151376287011572/AnsiballZ_command.py" <<< 15500 1727096204.94773: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpzhkvj3ws" to remote "/root/.ansible/tmp/ansible-tmp-1727096204.749282-15709-151376287011572/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096204.749282-15709-151376287011572/AnsiballZ_command.py" <<< 15500 1727096204.95425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096204.95597: stderr chunk (state=3): >>><<< 15500 1727096204.95600: stdout chunk (state=3): >>><<< 15500 1727096204.95602: done transferring module to remote 15500 1727096204.95604: _low_level_execute_command(): starting 15500 1727096204.95607: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096204.749282-15709-151376287011572/ /root/.ansible/tmp/ansible-tmp-1727096204.749282-15709-151376287011572/AnsiballZ_command.py && sleep 0' 15500 1727096204.96327: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096204.96343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096204.96357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096204.96385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096204.96403: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096204.96420: stderr chunk (state=3): >>>debug2: match not found <<< 15500 1727096204.96501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096204.96541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096204.96564: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096204.96610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096204.96717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096204.99025: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096204.99235: stderr chunk (state=3): >>><<< 15500 1727096204.99239: stdout chunk (state=3): >>><<< 15500 1727096204.99336: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096204.99340: _low_level_execute_command(): starting 15500 1727096204.99343: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096204.749282-15709-151376287011572/AnsiballZ_command.py && sleep 0' 15500 1727096205.00122: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096205.00139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096205.00152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096205.00178: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096205.00290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096206.24907: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 08:56:45.243139", "end": "2024-09-23 08:56:46.247648", "delta": "0:00:01.004509", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15500 1727096206.26700: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 15500 1727096206.26719: stdout chunk (state=3): >>><<< 15500 1727096206.26773: stderr chunk (state=3): >>><<< 15500 1727096206.26777: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-23 08:56:45.243139", "end": "2024-09-23 08:56:46.247648", "delta": "0:00:01.004509", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
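Editor's note: the JSON returned by AnsiballZ_command.py above is what ends up in the registered variable. An approximate sketch of its shape, with values copied from the log (stdout_lines is derived by Ansible from stdout). The module itself reports changed=true while the task result further below prints "changed": false, which is consistent with a changed_when: false on the task (see the "Evaluated conditional (False): False" entry that follows the handler run):

    # Approximate shape only; extra keys (start, end, delta, invocation, ...) omitted
    _current_interfaces:
      rc: 0
      cmd: ["ls", "-1"]
      stdout: "bonding_masters\neth0\nlo"
      stdout_lines:
        - bonding_masters
        - eth0
        - lo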
15500 1727096206.26819: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096204.749282-15709-151376287011572/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096206.26837: _low_level_execute_command(): starting 15500 1727096206.26923: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096204.749282-15709-151376287011572/ > /dev/null 2>&1 && sleep 0' 15500 1727096206.27509: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096206.27591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096206.27658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096206.29779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096206.29797: stderr chunk (state=3): >>><<< 15500 1727096206.29808: stdout chunk (state=3): >>><<< 15500 1727096206.29834: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096206.29847: handler run complete 15500 1727096206.29976: Evaluated conditional (False): False 15500 1727096206.29980: attempt loop complete, returning result 15500 1727096206.29983: _execute() done 15500 1727096206.29985: dumping result to json 15500 1727096206.29987: done dumping result, returning 15500 1727096206.29990: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [0afff68d-5257-877d-2da0-0000000000fd] 15500 1727096206.29992: sending task result for task 0afff68d-5257-877d-2da0-0000000000fd 15500 1727096206.30069: done sending task result for task 0afff68d-5257-877d-2da0-0000000000fd 15500 1727096206.30073: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:01.004509", "end": "2024-09-23 08:56:46.247648", "rc": 0, "start": "2024-09-23 08:56:45.243139" } STDOUT: bonding_masters eth0 lo 15500 1727096206.30148: no more pending results, returning what we have 15500 1727096206.30151: results queue empty 15500 1727096206.30152: checking for any_errors_fatal 15500 1727096206.30154: done checking for any_errors_fatal 15500 1727096206.30154: checking for max_fail_percentage 15500 1727096206.30156: done checking for max_fail_percentage 15500 1727096206.30157: checking to see if all hosts have failed and the running result is not ok 15500 1727096206.30158: done checking to see if all hosts have failed 15500 1727096206.30158: getting the remaining hosts for this loop 15500 1727096206.30160: done getting the remaining hosts for this loop 15500 1727096206.30163: getting the next task for host managed_node1 15500 1727096206.30276: done getting next task for host managed_node1 15500 1727096206.30280: ^ task is: TASK: Set current_interfaces 15500 1727096206.30284: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096206.30287: getting variables 15500 1727096206.30289: in VariableManager get_vars() 15500 1727096206.30329: Calling all_inventory to load vars for managed_node1 15500 1727096206.30332: Calling groups_inventory to load vars for managed_node1 15500 1727096206.30336: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096206.30347: Calling all_plugins_play to load vars for managed_node1 15500 1727096206.30350: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096206.30353: Calling groups_plugins_play to load vars for managed_node1 15500 1727096206.31097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096206.31423: done with get_vars() 15500 1727096206.31436: done getting variables 15500 1727096206.31614: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Monday 23 September 2024 08:56:46 -0400 (0:00:01.619) 0:00:06.359 ****** 15500 1727096206.31643: entering _queue_task() for managed_node1/set_fact 15500 1727096206.32323: worker is 1 (out of 1 available) 15500 1727096206.32336: exiting _queue_task() for managed_node1/set_fact 15500 1727096206.32349: done queuing things up, now waiting for results queue to drain 15500 1727096206.32350: waiting for pending results... 
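Editor's note: the "Set current_interfaces" task queued here (get_current_interfaces.yml:9) turns the command output into a list fact. A plausible reconstruction; the exact filter chain is an assumption, but the fact printed in the result below matches the stdout lines of the previous command:

    - name: Set current_interfaces
      ansible.builtin.set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"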
15500 1727096206.33190: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 15500 1727096206.33195: in run() - task 0afff68d-5257-877d-2da0-0000000000fe 15500 1727096206.33198: variable 'ansible_search_path' from source: unknown 15500 1727096206.33200: variable 'ansible_search_path' from source: unknown 15500 1727096206.33202: calling self._execute() 15500 1727096206.33578: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096206.33583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096206.33586: variable 'omit' from source: magic vars 15500 1727096206.34224: variable 'ansible_distribution_major_version' from source: facts 15500 1727096206.34248: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096206.34263: variable 'omit' from source: magic vars 15500 1727096206.34313: variable 'omit' from source: magic vars 15500 1727096206.34429: variable '_current_interfaces' from source: set_fact 15500 1727096206.34505: variable 'omit' from source: magic vars 15500 1727096206.34550: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096206.34600: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096206.34627: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096206.34651: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096206.34677: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096206.34712: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096206.34722: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096206.34733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096206.34845: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096206.34860: Set connection var ansible_pipelining to False 15500 1727096206.34874: Set connection var ansible_timeout to 10 15500 1727096206.34882: Set connection var ansible_shell_type to sh 15500 1727096206.34972: Set connection var ansible_shell_executable to /bin/sh 15500 1727096206.34975: Set connection var ansible_connection to ssh 15500 1727096206.34978: variable 'ansible_shell_executable' from source: unknown 15500 1727096206.34980: variable 'ansible_connection' from source: unknown 15500 1727096206.34983: variable 'ansible_module_compression' from source: unknown 15500 1727096206.34985: variable 'ansible_shell_type' from source: unknown 15500 1727096206.34987: variable 'ansible_shell_executable' from source: unknown 15500 1727096206.34989: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096206.34991: variable 'ansible_pipelining' from source: unknown 15500 1727096206.34994: variable 'ansible_timeout' from source: unknown 15500 1727096206.35001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096206.35154: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 15500 1727096206.35179: variable 'omit' from source: magic vars 15500 1727096206.35197: starting attempt loop 15500 1727096206.35204: running the handler 15500 1727096206.35224: handler run complete 15500 1727096206.35239: attempt loop complete, returning result 15500 1727096206.35246: _execute() done 15500 1727096206.35253: dumping result to json 15500 1727096206.35264: done dumping result, returning 15500 1727096206.35326: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [0afff68d-5257-877d-2da0-0000000000fe] 15500 1727096206.35329: sending task result for task 0afff68d-5257-877d-2da0-0000000000fe 15500 1727096206.35399: done sending task result for task 0afff68d-5257-877d-2da0-0000000000fe 15500 1727096206.35402: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 15500 1727096206.35492: no more pending results, returning what we have 15500 1727096206.35496: results queue empty 15500 1727096206.35496: checking for any_errors_fatal 15500 1727096206.35506: done checking for any_errors_fatal 15500 1727096206.35507: checking for max_fail_percentage 15500 1727096206.35508: done checking for max_fail_percentage 15500 1727096206.35509: checking to see if all hosts have failed and the running result is not ok 15500 1727096206.35510: done checking to see if all hosts have failed 15500 1727096206.35511: getting the remaining hosts for this loop 15500 1727096206.35512: done getting the remaining hosts for this loop 15500 1727096206.35516: getting the next task for host managed_node1 15500 1727096206.35525: done getting next task for host managed_node1 15500 1727096206.35528: ^ task is: TASK: Show current_interfaces 15500 1727096206.35531: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096206.35535: getting variables 15500 1727096206.35537: in VariableManager get_vars() 15500 1727096206.35572: Calling all_inventory to load vars for managed_node1 15500 1727096206.35576: Calling groups_inventory to load vars for managed_node1 15500 1727096206.35580: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096206.35590: Calling all_plugins_play to load vars for managed_node1 15500 1727096206.35593: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096206.35597: Calling groups_plugins_play to load vars for managed_node1 15500 1727096206.35890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096206.36246: done with get_vars() 15500 1727096206.36261: done getting variables 15500 1727096206.36360: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Monday 23 September 2024 08:56:46 -0400 (0:00:00.047) 0:00:06.407 ****** 15500 1727096206.36394: entering _queue_task() for managed_node1/debug 15500 1727096206.36396: Creating lock for debug 15500 1727096206.36804: worker is 1 (out of 1 available) 15500 1727096206.36817: exiting _queue_task() for managed_node1/debug 15500 1727096206.36828: done queuing things up, now waiting for results queue to drain 15500 1727096206.36829: waiting for pending results... 
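Editor's note: the "Show current_interfaces" task queued here (show_interfaces.yml:5) is a plain debug of the fact just set. A minimal sketch; the msg template is inferred from the MSG line printed for this task below:

    - name: Show current_interfaces
      ansible.builtin.debug:
        msg: "current_interfaces: {{ current_interfaces }}"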
15500 1727096206.37085: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 15500 1727096206.37164: in run() - task 0afff68d-5257-877d-2da0-0000000000ef 15500 1727096206.37170: variable 'ansible_search_path' from source: unknown 15500 1727096206.37173: variable 'ansible_search_path' from source: unknown 15500 1727096206.37204: calling self._execute() 15500 1727096206.37294: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096206.37372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096206.37377: variable 'omit' from source: magic vars 15500 1727096206.37764: variable 'ansible_distribution_major_version' from source: facts 15500 1727096206.37781: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096206.37790: variable 'omit' from source: magic vars 15500 1727096206.37831: variable 'omit' from source: magic vars 15500 1727096206.37932: variable 'current_interfaces' from source: set_fact 15500 1727096206.37969: variable 'omit' from source: magic vars 15500 1727096206.38012: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096206.38059: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096206.38098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096206.38144: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096206.38147: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096206.38176: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096206.38185: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096206.38192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096206.38338: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096206.38341: Set connection var ansible_pipelining to False 15500 1727096206.38343: Set connection var ansible_timeout to 10 15500 1727096206.38345: Set connection var ansible_shell_type to sh 15500 1727096206.38346: Set connection var ansible_shell_executable to /bin/sh 15500 1727096206.38348: Set connection var ansible_connection to ssh 15500 1727096206.38373: variable 'ansible_shell_executable' from source: unknown 15500 1727096206.38381: variable 'ansible_connection' from source: unknown 15500 1727096206.38388: variable 'ansible_module_compression' from source: unknown 15500 1727096206.38394: variable 'ansible_shell_type' from source: unknown 15500 1727096206.38401: variable 'ansible_shell_executable' from source: unknown 15500 1727096206.38407: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096206.38415: variable 'ansible_pipelining' from source: unknown 15500 1727096206.38422: variable 'ansible_timeout' from source: unknown 15500 1727096206.38442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096206.38737: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
15500 1727096206.38741: variable 'omit' from source: magic vars 15500 1727096206.38743: starting attempt loop 15500 1727096206.38745: running the handler 15500 1727096206.38747: handler run complete 15500 1727096206.38749: attempt loop complete, returning result 15500 1727096206.38751: _execute() done 15500 1727096206.38753: dumping result to json 15500 1727096206.38755: done dumping result, returning 15500 1727096206.38759: done running TaskExecutor() for managed_node1/TASK: Show current_interfaces [0afff68d-5257-877d-2da0-0000000000ef] 15500 1727096206.38761: sending task result for task 0afff68d-5257-877d-2da0-0000000000ef 15500 1727096206.38831: done sending task result for task 0afff68d-5257-877d-2da0-0000000000ef 15500 1727096206.38835: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 15500 1727096206.38902: no more pending results, returning what we have 15500 1727096206.38906: results queue empty 15500 1727096206.38907: checking for any_errors_fatal 15500 1727096206.38911: done checking for any_errors_fatal 15500 1727096206.38912: checking for max_fail_percentage 15500 1727096206.38913: done checking for max_fail_percentage 15500 1727096206.38914: checking to see if all hosts have failed and the running result is not ok 15500 1727096206.38915: done checking to see if all hosts have failed 15500 1727096206.38916: getting the remaining hosts for this loop 15500 1727096206.38917: done getting the remaining hosts for this loop 15500 1727096206.38921: getting the next task for host managed_node1 15500 1727096206.38930: done getting next task for host managed_node1 15500 1727096206.38933: ^ task is: TASK: Include the task 'assert_device_absent.yml' 15500 1727096206.38935: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096206.38939: getting variables 15500 1727096206.38941: in VariableManager get_vars() 15500 1727096206.38975: Calling all_inventory to load vars for managed_node1 15500 1727096206.38979: Calling groups_inventory to load vars for managed_node1 15500 1727096206.38983: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096206.38994: Calling all_plugins_play to load vars for managed_node1 15500 1727096206.38997: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096206.39000: Calling groups_plugins_play to load vars for managed_node1 15500 1727096206.39534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096206.39729: done with get_vars() 15500 1727096206.39740: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:14 Monday 23 September 2024 08:56:46 -0400 (0:00:00.034) 0:00:06.441 ****** 15500 1727096206.39833: entering _queue_task() for managed_node1/include_tasks 15500 1727096206.40907: worker is 1 (out of 1 available) 15500 1727096206.40919: exiting _queue_task() for managed_node1/include_tasks 15500 1727096206.40930: done queuing things up, now waiting for results queue to drain 15500 1727096206.40931: waiting for pending results... 
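Editor's note: with the interface list shown, the play moves on to tests_bridge.yml:14. Reconstructed from the task path above only; the relative include path is an assumption:

    - name: Include the task 'assert_device_absent.yml'
      ansible.builtin.include_tasks: tasks/assert_device_absent.yml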
15500 1727096206.41587: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' 15500 1727096206.41593: in run() - task 0afff68d-5257-877d-2da0-00000000000d 15500 1727096206.41596: variable 'ansible_search_path' from source: unknown 15500 1727096206.41598: calling self._execute() 15500 1727096206.42174: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096206.42180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096206.42183: variable 'omit' from source: magic vars 15500 1727096206.43276: variable 'ansible_distribution_major_version' from source: facts 15500 1727096206.43281: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096206.43283: _execute() done 15500 1727096206.43286: dumping result to json 15500 1727096206.43288: done dumping result, returning 15500 1727096206.43290: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' [0afff68d-5257-877d-2da0-00000000000d] 15500 1727096206.43292: sending task result for task 0afff68d-5257-877d-2da0-00000000000d 15500 1727096206.43363: done sending task result for task 0afff68d-5257-877d-2da0-00000000000d 15500 1727096206.43366: WORKER PROCESS EXITING 15500 1727096206.43597: no more pending results, returning what we have 15500 1727096206.43603: in VariableManager get_vars() 15500 1727096206.43641: Calling all_inventory to load vars for managed_node1 15500 1727096206.43644: Calling groups_inventory to load vars for managed_node1 15500 1727096206.43648: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096206.43664: Calling all_plugins_play to load vars for managed_node1 15500 1727096206.43669: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096206.43672: Calling groups_plugins_play to load vars for managed_node1 15500 1727096206.43871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096206.44683: done with get_vars() 15500 1727096206.44692: variable 'ansible_search_path' from source: unknown 15500 1727096206.44706: we have included files to process 15500 1727096206.44708: generating all_blocks data 15500 1727096206.44709: done generating all_blocks data 15500 1727096206.44712: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15500 1727096206.44714: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15500 1727096206.44716: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15500 1727096206.45292: in VariableManager get_vars() 15500 1727096206.45310: done with get_vars() 15500 1727096206.45995: done processing included file 15500 1727096206.45997: iterating over new_blocks loaded from include file 15500 1727096206.45999: in VariableManager get_vars() 15500 1727096206.46012: done with get_vars() 15500 1727096206.46014: filtering new block on tags 15500 1727096206.46033: done filtering new block on tags 15500 1727096206.46035: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node1 15500 1727096206.46041: extending task lists for all 
hosts with included blocks 15500 1727096206.46606: done extending task lists 15500 1727096206.46607: done processing included files 15500 1727096206.46608: results queue empty 15500 1727096206.46609: checking for any_errors_fatal 15500 1727096206.46612: done checking for any_errors_fatal 15500 1727096206.46613: checking for max_fail_percentage 15500 1727096206.46614: done checking for max_fail_percentage 15500 1727096206.46615: checking to see if all hosts have failed and the running result is not ok 15500 1727096206.46616: done checking to see if all hosts have failed 15500 1727096206.46616: getting the remaining hosts for this loop 15500 1727096206.46618: done getting the remaining hosts for this loop 15500 1727096206.46621: getting the next task for host managed_node1 15500 1727096206.46625: done getting next task for host managed_node1 15500 1727096206.46627: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15500 1727096206.46631: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096206.46634: getting variables 15500 1727096206.46635: in VariableManager get_vars() 15500 1727096206.46646: Calling all_inventory to load vars for managed_node1 15500 1727096206.46648: Calling groups_inventory to load vars for managed_node1 15500 1727096206.46650: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096206.46659: Calling all_plugins_play to load vars for managed_node1 15500 1727096206.46662: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096206.46665: Calling groups_plugins_play to load vars for managed_node1 15500 1727096206.46929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096206.47121: done with get_vars() 15500 1727096206.47132: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Monday 23 September 2024 08:56:46 -0400 (0:00:00.073) 0:00:06.515 ****** 15500 1727096206.47213: entering _queue_task() for managed_node1/include_tasks 15500 1727096206.47535: worker is 1 (out of 1 available) 15500 1727096206.47548: exiting _queue_task() for managed_node1/include_tasks 15500 1727096206.47563: done queuing things up, now waiting for results queue to drain 15500 1727096206.47564: waiting for pending results... 
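
At this point assert_device_absent.yml has been parsed and its first task queued. Combining the task paths (lines 3 and 5 of that file) with the conditional evaluated later in this log, a hedged reconstruction of the included file looks like the sketch below; any msg text or extra options the real file may carry are unknown.

```yaml
# Hedged reconstruction of tasks/assert_device_absent.yml, based on the task
# names, the task paths (:3 and :5), and the "not interface_stat.stat.exists"
# conditional recorded later in this log.
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml

- name: Assert that the interface is absent - '{{ interface }}'
  assert:
    that:
      - not interface_stat.stat.exists
```
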
15500 1727096206.47819: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 15500 1727096206.47926: in run() - task 0afff68d-5257-877d-2da0-000000000119 15500 1727096206.47947: variable 'ansible_search_path' from source: unknown 15500 1727096206.47956: variable 'ansible_search_path' from source: unknown 15500 1727096206.48103: calling self._execute() 15500 1727096206.48107: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096206.48110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096206.48112: variable 'omit' from source: magic vars 15500 1727096206.48472: variable 'ansible_distribution_major_version' from source: facts 15500 1727096206.48490: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096206.48499: _execute() done 15500 1727096206.48505: dumping result to json 15500 1727096206.48511: done dumping result, returning 15500 1727096206.48519: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-877d-2da0-000000000119] 15500 1727096206.48526: sending task result for task 0afff68d-5257-877d-2da0-000000000119 15500 1727096206.48670: no more pending results, returning what we have 15500 1727096206.48676: in VariableManager get_vars() 15500 1727096206.48711: Calling all_inventory to load vars for managed_node1 15500 1727096206.48715: Calling groups_inventory to load vars for managed_node1 15500 1727096206.48719: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096206.48732: Calling all_plugins_play to load vars for managed_node1 15500 1727096206.48734: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096206.48737: Calling groups_plugins_play to load vars for managed_node1 15500 1727096206.49184: done sending task result for task 0afff68d-5257-877d-2da0-000000000119 15500 1727096206.49188: WORKER PROCESS EXITING 15500 1727096206.49212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096206.49399: done with get_vars() 15500 1727096206.49407: variable 'ansible_search_path' from source: unknown 15500 1727096206.49408: variable 'ansible_search_path' from source: unknown 15500 1727096206.49443: we have included files to process 15500 1727096206.49444: generating all_blocks data 15500 1727096206.49445: done generating all_blocks data 15500 1727096206.49447: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15500 1727096206.49448: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15500 1727096206.49450: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15500 1727096206.49673: done processing included file 15500 1727096206.49675: iterating over new_blocks loaded from include file 15500 1727096206.49676: in VariableManager get_vars() 15500 1727096206.49688: done with get_vars() 15500 1727096206.49690: filtering new block on tags 15500 1727096206.49705: done filtering new block on tags 15500 1727096206.49707: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 15500 
1727096206.49711: extending task lists for all hosts with included blocks 15500 1727096206.49809: done extending task lists 15500 1727096206.49810: done processing included files 15500 1727096206.49811: results queue empty 15500 1727096206.49812: checking for any_errors_fatal 15500 1727096206.49814: done checking for any_errors_fatal 15500 1727096206.49815: checking for max_fail_percentage 15500 1727096206.49816: done checking for max_fail_percentage 15500 1727096206.49816: checking to see if all hosts have failed and the running result is not ok 15500 1727096206.49817: done checking to see if all hosts have failed 15500 1727096206.49818: getting the remaining hosts for this loop 15500 1727096206.49819: done getting the remaining hosts for this loop 15500 1727096206.49822: getting the next task for host managed_node1 15500 1727096206.49826: done getting next task for host managed_node1 15500 1727096206.49828: ^ task is: TASK: Get stat for interface {{ interface }} 15500 1727096206.49831: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096206.49833: getting variables 15500 1727096206.49834: in VariableManager get_vars() 15500 1727096206.49843: Calling all_inventory to load vars for managed_node1 15500 1727096206.49845: Calling groups_inventory to load vars for managed_node1 15500 1727096206.49847: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096206.49852: Calling all_plugins_play to load vars for managed_node1 15500 1727096206.49854: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096206.49859: Calling groups_plugins_play to load vars for managed_node1 15500 1727096206.50002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096206.50187: done with get_vars() 15500 1727096206.50196: done getting variables 15500 1727096206.50346: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 08:56:46 -0400 (0:00:00.031) 0:00:06.546 ****** 15500 1727096206.50380: entering _queue_task() for managed_node1/stat 15500 1727096206.50693: worker is 1 (out of 1 available) 15500 1727096206.50706: exiting _queue_task() for managed_node1/stat 15500 1727096206.50719: done queuing things up, now waiting for results queue to drain 15500 1727096206.50720: waiting for pending results... 
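
The queued task "Get stat for interface LSR-TST-br31" is the templated form of the single task in get_interface_stat.yml. A hedged reconstruction follows: the module arguments mirror the "invocation" block recorded further down, and the register name interface_stat is inferred from the assert that consumes it.

```yaml
# Hedged reconstruction of tasks/get_interface_stat.yml; arguments mirror the
# recorded module invocation, register name inferred from the later assert.
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat
```
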
15500 1727096206.50985: running TaskExecutor() for managed_node1/TASK: Get stat for interface LSR-TST-br31 15500 1727096206.51107: in run() - task 0afff68d-5257-877d-2da0-000000000133 15500 1727096206.51127: variable 'ansible_search_path' from source: unknown 15500 1727096206.51135: variable 'ansible_search_path' from source: unknown 15500 1727096206.51179: calling self._execute() 15500 1727096206.51265: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096206.51280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096206.51296: variable 'omit' from source: magic vars 15500 1727096206.52002: variable 'ansible_distribution_major_version' from source: facts 15500 1727096206.52018: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096206.52028: variable 'omit' from source: magic vars 15500 1727096206.52086: variable 'omit' from source: magic vars 15500 1727096206.52188: variable 'interface' from source: set_fact 15500 1727096206.52213: variable 'omit' from source: magic vars 15500 1727096206.52254: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096206.52297: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096206.52327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096206.52348: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096206.52368: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096206.52403: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096206.52417: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096206.52426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096206.52537: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096206.52548: Set connection var ansible_pipelining to False 15500 1727096206.52559: Set connection var ansible_timeout to 10 15500 1727096206.52566: Set connection var ansible_shell_type to sh 15500 1727096206.52577: Set connection var ansible_shell_executable to /bin/sh 15500 1727096206.52585: Set connection var ansible_connection to ssh 15500 1727096206.52610: variable 'ansible_shell_executable' from source: unknown 15500 1727096206.52618: variable 'ansible_connection' from source: unknown 15500 1727096206.52627: variable 'ansible_module_compression' from source: unknown 15500 1727096206.52772: variable 'ansible_shell_type' from source: unknown 15500 1727096206.52776: variable 'ansible_shell_executable' from source: unknown 15500 1727096206.52778: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096206.52780: variable 'ansible_pipelining' from source: unknown 15500 1727096206.52782: variable 'ansible_timeout' from source: unknown 15500 1727096206.52784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096206.52864: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15500 1727096206.52882: variable 
'omit' from source: magic vars 15500 1727096206.52899: starting attempt loop 15500 1727096206.52908: running the handler 15500 1727096206.52927: _low_level_execute_command(): starting 15500 1727096206.52941: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096206.53689: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096206.53792: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096206.53810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096206.53835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096206.53853: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096206.53965: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096206.55722: stdout chunk (state=3): >>>/root <<< 15500 1727096206.55921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096206.56173: stdout chunk (state=3): >>><<< 15500 1727096206.56177: stderr chunk (state=3): >>><<< 15500 1727096206.56181: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096206.56184: _low_level_execute_command(): starting 15500 1727096206.56187: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096206.5608077-15797-33431487722156 `" && 
echo ansible-tmp-1727096206.5608077-15797-33431487722156="` echo /root/.ansible/tmp/ansible-tmp-1727096206.5608077-15797-33431487722156 `" ) && sleep 0' 15500 1727096206.57595: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096206.57643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096206.57660: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096206.57713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096206.57805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096206.59835: stdout chunk (state=3): >>>ansible-tmp-1727096206.5608077-15797-33431487722156=/root/.ansible/tmp/ansible-tmp-1727096206.5608077-15797-33431487722156 <<< 15500 1727096206.60047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096206.60058: stdout chunk (state=3): >>><<< 15500 1727096206.60274: stderr chunk (state=3): >>><<< 15500 1727096206.60278: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096206.5608077-15797-33431487722156=/root/.ansible/tmp/ansible-tmp-1727096206.5608077-15797-33431487722156 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096206.60281: variable 'ansible_module_compression' from source: unknown 15500 1727096206.60337: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 
15500 1727096206.60445: variable 'ansible_facts' from source: unknown 15500 1727096206.60683: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096206.5608077-15797-33431487722156/AnsiballZ_stat.py 15500 1727096206.61091: Sending initial data 15500 1727096206.61094: Sent initial data (152 bytes) 15500 1727096206.62200: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096206.62215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096206.62388: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096206.62484: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096206.62504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096206.62579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096206.64276: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15500 1727096206.64355: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096206.64386: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15500 1727096206.64461: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp6rhzcmle /root/.ansible/tmp/ansible-tmp-1727096206.5608077-15797-33431487722156/AnsiballZ_stat.py <<< 15500 1727096206.64473: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096206.5608077-15797-33431487722156/AnsiballZ_stat.py" <<< 15500 1727096206.64524: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp6rhzcmle" to remote "/root/.ansible/tmp/ansible-tmp-1727096206.5608077-15797-33431487722156/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096206.5608077-15797-33431487722156/AnsiballZ_stat.py" <<< 15500 1727096206.65940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096206.66004: stderr chunk (state=3): >>><<< 15500 1727096206.66013: stdout chunk (state=3): >>><<< 15500 1727096206.66177: done transferring module to remote 15500 1727096206.66184: _low_level_execute_command(): starting 15500 1727096206.66186: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096206.5608077-15797-33431487722156/ /root/.ansible/tmp/ansible-tmp-1727096206.5608077-15797-33431487722156/AnsiballZ_stat.py && sleep 0' 15500 1727096206.67339: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096206.67388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096206.67403: stderr chunk (state=3): >>>debug2: match not found <<< 15500 1727096206.67419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096206.67439: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15500 1727096206.67478: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096206.67579: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096206.67593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096206.67808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096206.69741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096206.69994: stderr chunk (state=3): >>><<< 15500 1727096206.69999: stdout chunk (state=3): >>><<< 15500 1727096206.70002: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096206.70004: _low_level_execute_command(): starting 15500 1727096206.70006: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096206.5608077-15797-33431487722156/AnsiballZ_stat.py && sleep 0' 15500 1727096206.71101: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096206.71124: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096206.71186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096206.71254: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096206.71279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096206.71322: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096206.71409: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096206.87098: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15500 1727096206.88487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 15500 1727096206.88561: stderr chunk (state=3): >>><<< 15500 1727096206.88565: stdout chunk (state=3): >>><<< 15500 1727096206.88648: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 15500 1727096206.88653: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096206.5608077-15797-33431487722156/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096206.88656: _low_level_execute_command(): starting 15500 1727096206.88658: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096206.5608077-15797-33431487722156/ > /dev/null 2>&1 && sleep 0' 15500 1727096206.89476: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096206.89480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096206.89484: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096206.89507: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096206.89558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096206.89562: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096206.89564: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096206.89642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096206.91557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096206.91597: stderr chunk (state=3): >>><<< 15500 1727096206.91600: stdout chunk (state=3): >>><<< 15500 1727096206.91616: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096206.91621: handler run complete 15500 1727096206.91637: attempt loop complete, returning result 15500 1727096206.91639: _execute() done 15500 1727096206.91642: dumping result to json 15500 1727096206.91644: done dumping result, returning 15500 1727096206.91652: done running TaskExecutor() for managed_node1/TASK: Get stat for interface LSR-TST-br31 [0afff68d-5257-877d-2da0-000000000133] 15500 1727096206.91658: sending task result for task 0afff68d-5257-877d-2da0-000000000133 15500 1727096206.91766: done sending task result for task 0afff68d-5257-877d-2da0-000000000133 15500 1727096206.91771: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 15500 1727096206.92156: no more pending results, returning what we have 15500 1727096206.92158: results queue empty 15500 1727096206.92159: checking for any_errors_fatal 15500 1727096206.92160: done checking for any_errors_fatal 15500 1727096206.92160: checking for max_fail_percentage 15500 1727096206.92161: done checking for max_fail_percentage 15500 1727096206.92162: checking to see if all hosts have failed and the running result is not ok 15500 1727096206.92162: done checking to see if all hosts have failed 15500 1727096206.92163: getting the remaining hosts for this loop 15500 1727096206.92164: done getting the remaining hosts for 
this loop 15500 1727096206.92166: getting the next task for host managed_node1 15500 1727096206.92173: done getting next task for host managed_node1 15500 1727096206.92175: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 15500 1727096206.92177: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096206.92178: getting variables 15500 1727096206.92179: in VariableManager get_vars() 15500 1727096206.92197: Calling all_inventory to load vars for managed_node1 15500 1727096206.92199: Calling groups_inventory to load vars for managed_node1 15500 1727096206.92201: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096206.92209: Calling all_plugins_play to load vars for managed_node1 15500 1727096206.92210: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096206.92212: Calling groups_plugins_play to load vars for managed_node1 15500 1727096206.92307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096206.92416: done with get_vars() 15500 1727096206.92423: done getting variables 15500 1727096206.92495: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 15500 1727096206.92579: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'LSR-TST-br31'] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Monday 23 September 2024 08:56:46 -0400 (0:00:00.422) 0:00:06.969 ****** 15500 1727096206.92600: entering _queue_task() for managed_node1/assert 15500 1727096206.92601: Creating lock for assert 15500 1727096206.92840: worker is 1 (out of 1 available) 15500 1727096206.92855: exiting _queue_task() for managed_node1/assert 15500 1727096206.92868: done queuing things up, now waiting for results queue to drain 15500 1727096206.92870: waiting for pending results... 
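
The stat call above returned {"changed": false, "stat": {"exists": false}}, so the variable the queued assert is about to evaluate has roughly the following shape (a sketch only: the values are copied from the module output, and the variable name is inferred from the later "variable 'interface_stat'" record).

```yaml
# Approximate shape of the registered result consumed by the assert below.
interface_stat:
  changed: false
  stat:
    exists: false
```
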
15500 1727096206.93025: running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'LSR-TST-br31' 15500 1727096206.93103: in run() - task 0afff68d-5257-877d-2da0-00000000011a 15500 1727096206.93113: variable 'ansible_search_path' from source: unknown 15500 1727096206.93116: variable 'ansible_search_path' from source: unknown 15500 1727096206.93147: calling self._execute() 15500 1727096206.93213: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096206.93219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096206.93228: variable 'omit' from source: magic vars 15500 1727096206.93520: variable 'ansible_distribution_major_version' from source: facts 15500 1727096206.93535: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096206.93538: variable 'omit' from source: magic vars 15500 1727096206.93570: variable 'omit' from source: magic vars 15500 1727096206.93639: variable 'interface' from source: set_fact 15500 1727096206.93673: variable 'omit' from source: magic vars 15500 1727096206.93689: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096206.93725: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096206.93742: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096206.93756: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096206.93771: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096206.93795: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096206.93798: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096206.93801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096206.93872: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096206.93879: Set connection var ansible_pipelining to False 15500 1727096206.93885: Set connection var ansible_timeout to 10 15500 1727096206.93887: Set connection var ansible_shell_type to sh 15500 1727096206.93892: Set connection var ansible_shell_executable to /bin/sh 15500 1727096206.93898: Set connection var ansible_connection to ssh 15500 1727096206.93914: variable 'ansible_shell_executable' from source: unknown 15500 1727096206.93917: variable 'ansible_connection' from source: unknown 15500 1727096206.93920: variable 'ansible_module_compression' from source: unknown 15500 1727096206.93922: variable 'ansible_shell_type' from source: unknown 15500 1727096206.93925: variable 'ansible_shell_executable' from source: unknown 15500 1727096206.93927: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096206.93931: variable 'ansible_pipelining' from source: unknown 15500 1727096206.93934: variable 'ansible_timeout' from source: unknown 15500 1727096206.93938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096206.94043: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 15500 1727096206.94051: variable 'omit' from source: magic vars 15500 1727096206.94059: starting attempt loop 15500 1727096206.94062: running the handler 15500 1727096206.94161: variable 'interface_stat' from source: set_fact 15500 1727096206.94176: Evaluated conditional (not interface_stat.stat.exists): True 15500 1727096206.94180: handler run complete 15500 1727096206.94192: attempt loop complete, returning result 15500 1727096206.94194: _execute() done 15500 1727096206.94197: dumping result to json 15500 1727096206.94201: done dumping result, returning 15500 1727096206.94209: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'LSR-TST-br31' [0afff68d-5257-877d-2da0-00000000011a] 15500 1727096206.94214: sending task result for task 0afff68d-5257-877d-2da0-00000000011a 15500 1727096206.94300: done sending task result for task 0afff68d-5257-877d-2da0-00000000011a 15500 1727096206.94303: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15500 1727096206.94370: no more pending results, returning what we have 15500 1727096206.94374: results queue empty 15500 1727096206.94375: checking for any_errors_fatal 15500 1727096206.94383: done checking for any_errors_fatal 15500 1727096206.94384: checking for max_fail_percentage 15500 1727096206.94386: done checking for max_fail_percentage 15500 1727096206.94386: checking to see if all hosts have failed and the running result is not ok 15500 1727096206.94387: done checking to see if all hosts have failed 15500 1727096206.94388: getting the remaining hosts for this loop 15500 1727096206.94390: done getting the remaining hosts for this loop 15500 1727096206.94393: getting the next task for host managed_node1 15500 1727096206.94401: done getting next task for host managed_node1 15500 1727096206.94403: ^ task is: TASK: meta (flush_handlers) 15500 1727096206.94404: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096206.94407: getting variables 15500 1727096206.94409: in VariableManager get_vars() 15500 1727096206.94437: Calling all_inventory to load vars for managed_node1 15500 1727096206.94439: Calling groups_inventory to load vars for managed_node1 15500 1727096206.94441: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096206.94450: Calling all_plugins_play to load vars for managed_node1 15500 1727096206.94452: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096206.94454: Calling groups_plugins_play to load vars for managed_node1 15500 1727096206.94687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096206.94920: done with get_vars() 15500 1727096206.94932: done getting variables 15500 1727096206.95014: in VariableManager get_vars() 15500 1727096206.95024: Calling all_inventory to load vars for managed_node1 15500 1727096206.95026: Calling groups_inventory to load vars for managed_node1 15500 1727096206.95029: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096206.95035: Calling all_plugins_play to load vars for managed_node1 15500 1727096206.95037: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096206.95040: Calling groups_plugins_play to load vars for managed_node1 15500 1727096206.95191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096206.95360: done with get_vars() 15500 1727096206.95376: done queuing things up, now waiting for results queue to drain 15500 1727096206.95378: results queue empty 15500 1727096206.95379: checking for any_errors_fatal 15500 1727096206.95382: done checking for any_errors_fatal 15500 1727096206.95383: checking for max_fail_percentage 15500 1727096206.95384: done checking for max_fail_percentage 15500 1727096206.95384: checking to see if all hosts have failed and the running result is not ok 15500 1727096206.95385: done checking to see if all hosts have failed 15500 1727096206.95391: getting the remaining hosts for this loop 15500 1727096206.95393: done getting the remaining hosts for this loop 15500 1727096206.95395: getting the next task for host managed_node1 15500 1727096206.95398: done getting next task for host managed_node1 15500 1727096206.95399: ^ task is: TASK: meta (flush_handlers) 15500 1727096206.95402: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096206.95405: getting variables 15500 1727096206.95406: in VariableManager get_vars() 15500 1727096206.95418: Calling all_inventory to load vars for managed_node1 15500 1727096206.95420: Calling groups_inventory to load vars for managed_node1 15500 1727096206.95422: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096206.95427: Calling all_plugins_play to load vars for managed_node1 15500 1727096206.95430: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096206.95433: Calling groups_plugins_play to load vars for managed_node1 15500 1727096206.95573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096206.95736: done with get_vars() 15500 1727096206.95742: done getting variables 15500 1727096206.95800: in VariableManager get_vars() 15500 1727096206.95809: Calling all_inventory to load vars for managed_node1 15500 1727096206.95811: Calling groups_inventory to load vars for managed_node1 15500 1727096206.95812: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096206.95816: Calling all_plugins_play to load vars for managed_node1 15500 1727096206.95817: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096206.95818: Calling groups_plugins_play to load vars for managed_node1 15500 1727096206.95933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096206.96122: done with get_vars() 15500 1727096206.96134: done queuing things up, now waiting for results queue to drain 15500 1727096206.96135: results queue empty 15500 1727096206.96135: checking for any_errors_fatal 15500 1727096206.96136: done checking for any_errors_fatal 15500 1727096206.96137: checking for max_fail_percentage 15500 1727096206.96138: done checking for max_fail_percentage 15500 1727096206.96138: checking to see if all hosts have failed and the running result is not ok 15500 1727096206.96139: done checking to see if all hosts have failed 15500 1727096206.96139: getting the remaining hosts for this loop 15500 1727096206.96140: done getting the remaining hosts for this loop 15500 1727096206.96142: getting the next task for host managed_node1 15500 1727096206.96144: done getting next task for host managed_node1 15500 1727096206.96145: ^ task is: None 15500 1727096206.96146: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
15500 1727096206.96147: done queuing things up, now waiting for results queue to drain
15500 1727096206.96147: results queue empty
15500 1727096206.96148: checking for any_errors_fatal
15500 1727096206.96148: done checking for any_errors_fatal
15500 1727096206.96148: checking for max_fail_percentage
15500 1727096206.96149: done checking for max_fail_percentage
15500 1727096206.96149: checking to see if all hosts have failed and the running result is not ok
15500 1727096206.96150: done checking to see if all hosts have failed
15500 1727096206.96151: getting the next task for host managed_node1
15500 1727096206.96153: done getting next task for host managed_node1
15500 1727096206.96153: ^ task is: None
15500 1727096206.96154: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15500 1727096206.96206: in VariableManager get_vars()
15500 1727096206.96222: done with get_vars()
15500 1727096206.96228: in VariableManager get_vars()
15500 1727096206.96246: done with get_vars()
15500 1727096206.96253: variable 'omit' from source: magic vars
15500 1727096206.96289: in VariableManager get_vars()
15500 1727096206.96308: done with get_vars()
15500 1727096206.96331: variable 'omit' from source: magic vars

PLAY [Add test bridge] *********************************************************
15500 1727096206.96861: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False)
15500 1727096206.96894: getting the remaining hosts for this loop
15500 1727096206.96895: done getting the remaining hosts for this loop
15500 1727096206.96898: getting the next task for host managed_node1
15500 1727096206.96901: done getting next task for host managed_node1
15500 1727096206.96903: ^ task is: TASK: Gathering Facts
15500 1727096206.96908: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15500 1727096206.96910: getting variables
15500 1727096206.96911: in VariableManager get_vars()
15500 1727096206.96922: Calling all_inventory to load vars for managed_node1
15500 1727096206.96925: Calling groups_inventory to load vars for managed_node1
15500 1727096206.96927: Calling all_plugins_inventory to load vars for managed_node1
15500 1727096206.96934: Calling all_plugins_play to load vars for managed_node1
15500 1727096206.96936: Calling groups_plugins_inventory to load vars for managed_node1
15500 1727096206.96942: Calling groups_plugins_play to load vars for managed_node1
15500 1727096206.97071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15500 1727096206.97206: done with get_vars()
15500 1727096206.97216: done getting variables
15500 1727096206.97270: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:17
Monday 23 September 2024 08:56:46 -0400 (0:00:00.046) 0:00:07.016 ******
15500 1727096206.97288: entering _queue_task() for managed_node1/gather_facts
15500 1727096206.97529: worker is 1 (out of 1 available)
15500 1727096206.97543: exiting _queue_task() for managed_node1/gather_facts
15500 1727096206.97555: done queuing things up, now waiting for results queue to drain
15500 1727096206.97560: waiting for pending results...
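(Orientation note for the entries that follow: they trace one gather_facts execution end to end over the already-established SSH connection: a home-directory probe ("echo ~"), creation of a per-task remote temp directory, an SFTP upload of the AnsiballZ_setup.py payload, a chmod on the directory and payload, execution with /usr/bin/python3.12, and removal of the temp directory. The minimal Python sketch below only rebuilds the same sequence of /bin/sh command strings visible in the _low_level_execute_command() entries; the sh() helper, running the commands locally, and the "-demo" temp-dir suffix are illustrative assumptions, not Ansible's actual implementation.

    import shlex
    import subprocess
    import time

    def sh(cmd: str) -> subprocess.CompletedProcess:
        # Each step in the log is wrapped as: /bin/sh -c '<command> && sleep 0'
        return subprocess.run(["/bin/sh", "-c", cmd + " && sleep 0"],
                              capture_output=True, text=True)

    # 1. Probe the home directory ("echo ~" returns /root in the log above).
    home = sh("echo ~").stdout.strip()

    # 2. Create the per-task temp dir; the name only loosely follows the
    #    ansible-tmp-<epoch>-<pid>-<random> pattern seen in the log.
    tmpdir = f"{home}/.ansible/tmp/ansible-tmp-{time.time()}-demo"
    sh(f'( umask 77 && mkdir -p "{home}/.ansible/tmp" && mkdir "{tmpdir}" )')

    # 3. Stand-in for the SFTP upload of AnsiballZ_setup.py shown in the log.
    open(f"{tmpdir}/AnsiballZ_setup.py", "w").close()

    # 4. Make the directory and payload executable, then clean up afterwards.
    #    (The real run also executes: /usr/bin/python3.12 <tmpdir>/AnsiballZ_setup.py)
    sh("chmod u+x " + shlex.quote(tmpdir) + " " + shlex.quote(tmpdir + "/AnsiballZ_setup.py"))
    sh("rm -f -r " + shlex.quote(tmpdir) + " > /dev/null 2>&1")

Note how every one of these round trips reuses the existing SSH ControlMaster socket -- the repeated "auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23'" lines -- so no new TCP connection or authentication handshake is paid per command.)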
15500 1727096206.97715: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15500 1727096206.97776: in run() - task 0afff68d-5257-877d-2da0-00000000014c 15500 1727096206.97791: variable 'ansible_search_path' from source: unknown 15500 1727096206.97821: calling self._execute() 15500 1727096206.97888: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096206.97892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096206.97913: variable 'omit' from source: magic vars 15500 1727096206.98194: variable 'ansible_distribution_major_version' from source: facts 15500 1727096206.98203: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096206.98208: variable 'omit' from source: magic vars 15500 1727096206.98227: variable 'omit' from source: magic vars 15500 1727096206.98255: variable 'omit' from source: magic vars 15500 1727096206.98289: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096206.98315: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096206.98333: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096206.98348: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096206.98359: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096206.98382: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096206.98385: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096206.98388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096206.98460: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096206.98464: Set connection var ansible_pipelining to False 15500 1727096206.98468: Set connection var ansible_timeout to 10 15500 1727096206.98471: Set connection var ansible_shell_type to sh 15500 1727096206.98476: Set connection var ansible_shell_executable to /bin/sh 15500 1727096206.98481: Set connection var ansible_connection to ssh 15500 1727096206.98498: variable 'ansible_shell_executable' from source: unknown 15500 1727096206.98501: variable 'ansible_connection' from source: unknown 15500 1727096206.98504: variable 'ansible_module_compression' from source: unknown 15500 1727096206.98506: variable 'ansible_shell_type' from source: unknown 15500 1727096206.98509: variable 'ansible_shell_executable' from source: unknown 15500 1727096206.98511: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096206.98513: variable 'ansible_pipelining' from source: unknown 15500 1727096206.98517: variable 'ansible_timeout' from source: unknown 15500 1727096206.98520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096206.98742: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096206.98746: variable 'omit' from source: magic vars 15500 1727096206.98749: starting attempt loop 15500 1727096206.98751: running the 
handler 15500 1727096206.98754: variable 'ansible_facts' from source: unknown 15500 1727096206.98773: _low_level_execute_command(): starting 15500 1727096206.98776: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096206.99406: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096206.99424: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096206.99462: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096206.99477: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096206.99563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096207.01319: stdout chunk (state=3): >>>/root <<< 15500 1727096207.01437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096207.01464: stderr chunk (state=3): >>><<< 15500 1727096207.01469: stdout chunk (state=3): >>><<< 15500 1727096207.01612: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096207.01616: _low_level_execute_command(): starting 15500 1727096207.01619: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096207.0151172-15838-48527669719315 `" && echo ansible-tmp-1727096207.0151172-15838-48527669719315="` echo 
/root/.ansible/tmp/ansible-tmp-1727096207.0151172-15838-48527669719315 `" ) && sleep 0' 15500 1727096207.02385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096207.02407: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096207.02425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096207.02454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096207.02561: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096207.02590: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096207.02643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096207.02683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096207.02705: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096207.02838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096207.04884: stdout chunk (state=3): >>>ansible-tmp-1727096207.0151172-15838-48527669719315=/root/.ansible/tmp/ansible-tmp-1727096207.0151172-15838-48527669719315 <<< 15500 1727096207.05261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096207.05264: stdout chunk (state=3): >>><<< 15500 1727096207.05266: stderr chunk (state=3): >>><<< 15500 1727096207.05271: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096207.0151172-15838-48527669719315=/root/.ansible/tmp/ansible-tmp-1727096207.0151172-15838-48527669719315 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096207.05274: 
variable 'ansible_module_compression' from source: unknown 15500 1727096207.05592: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15500 1727096207.05595: variable 'ansible_facts' from source: unknown 15500 1727096207.05879: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096207.0151172-15838-48527669719315/AnsiballZ_setup.py 15500 1727096207.06186: Sending initial data 15500 1727096207.06238: Sent initial data (153 bytes) 15500 1727096207.06801: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096207.06817: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096207.06886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096207.06974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096207.07010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096207.07064: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096207.07171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096207.08853: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096207.08993: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15500 1727096207.09059: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpb2vslna9 /root/.ansible/tmp/ansible-tmp-1727096207.0151172-15838-48527669719315/AnsiballZ_setup.py <<< 15500 1727096207.09096: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096207.0151172-15838-48527669719315/AnsiballZ_setup.py" <<< 15500 1727096207.09142: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpb2vslna9" to remote "/root/.ansible/tmp/ansible-tmp-1727096207.0151172-15838-48527669719315/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096207.0151172-15838-48527669719315/AnsiballZ_setup.py" <<< 15500 1727096207.10993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096207.11016: stderr chunk (state=3): >>><<< 15500 1727096207.11160: stdout chunk (state=3): >>><<< 15500 1727096207.11163: done transferring module to remote 15500 1727096207.11166: _low_level_execute_command(): starting 15500 1727096207.11176: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096207.0151172-15838-48527669719315/ /root/.ansible/tmp/ansible-tmp-1727096207.0151172-15838-48527669719315/AnsiballZ_setup.py && sleep 0' 15500 1727096207.11730: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096207.11786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096207.11854: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096207.11875: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096207.11964: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096207.12035: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096207.13961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096207.13985: stderr chunk (state=3): >>><<< 15500 1727096207.13988: stdout chunk (state=3): >>><<< 15500 1727096207.14021: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096207.14024: _low_level_execute_command(): starting 15500 1727096207.14027: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096207.0151172-15838-48527669719315/AnsiballZ_setup.py && sleep 0' 15500 1727096207.14597: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096207.14601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096207.14603: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096207.14605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096207.14661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096207.14672: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096207.14677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096207.14739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096207.80161: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": 
"ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "56", "second": "47", "epoch": "1727096207", "epoch_int": "1727096207", "date": "2024-09-23", "time": "08:56:47", "iso8601_micro": "2024-09-23T12:56:47.430537Z", "iso8601": "2024-09-23T12:56:47Z", "iso8601_basic": "20240923T085647430537", "iso8601_basic_short": "20240923T085647", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": 
true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_loadavg": {"1m": 0.53759765625, "5m": 0.33642578125, "15m": 0.1533203125}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": 
"12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, 
"ansible_memfree_mb": 2954, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 577, "free": 2954}, "nocache": {"free": 3291, "used": 240}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 360, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797658624, "block_size": 4096, "block_total": 65519099, "block_available": 63915444, "block_used": 1603655, "inode_total": 131070960, "inode_available": 131029100, "inode_used": 41860, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15500 1727096207.82077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 15500 1727096207.82109: stderr chunk (state=3): >>><<< 15500 1727096207.82112: stdout chunk (state=3): >>><<< 15500 1727096207.82376: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "56", "second": "47", "epoch": "1727096207", "epoch_int": "1727096207", "date": "2024-09-23", "time": "08:56:47", "iso8601_micro": "2024-09-23T12:56:47.430537Z", "iso8601": "2024-09-23T12:56:47Z", "iso8601_basic": "20240923T085647430537", "iso8601_basic_short": "20240923T085647", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_loadavg": {"1m": 0.53759765625, "5m": 0.33642578125, "15m": 0.1533203125}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", 
"tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off 
[fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2954, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 577, "free": 2954}, "nocache": {"free": 3291, "used": 240}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 360, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797658624, "block_size": 4096, "block_total": 65519099, "block_available": 63915444, 
"block_used": 1603655, "inode_total": 131070960, "inode_available": 131029100, "inode_used": 41860, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 15500 1727096207.82896: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096207.0151172-15838-48527669719315/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096207.82929: _low_level_execute_command(): starting 15500 1727096207.83057: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096207.0151172-15838-48527669719315/ > /dev/null 2>&1 && sleep 0' 15500 1727096207.84170: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096207.84178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096207.84181: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096207.84183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 15500 1727096207.84185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096207.84494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096207.84511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096207.86459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096207.86760: stderr chunk (state=3): >>><<< 15500 1727096207.86764: stdout chunk (state=3): >>><<< 15500 1727096207.86769: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096207.86771: handler run complete 15500 1727096207.86854: variable 'ansible_facts' from source: unknown 15500 1727096207.87105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096207.87713: variable 'ansible_facts' from source: unknown 15500 1727096207.87889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096207.88339: attempt loop complete, returning result 15500 1727096207.88342: _execute() done 15500 1727096207.88344: dumping result to json 15500 1727096207.88346: done dumping result, returning 15500 1727096207.88347: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-877d-2da0-00000000014c] 15500 1727096207.88349: sending task result for task 0afff68d-5257-877d-2da0-00000000014c 15500 1727096207.89211: done sending task result for task 0afff68d-5257-877d-2da0-00000000014c 15500 1727096207.89215: WORKER PROCESS EXITING ok: [managed_node1] 15500 1727096207.89708: no more pending results, returning what we have 15500 1727096207.89712: results queue empty 15500 1727096207.89713: checking for any_errors_fatal 15500 1727096207.89714: done checking for any_errors_fatal 15500 1727096207.89715: checking for max_fail_percentage 15500 1727096207.89717: done checking for max_fail_percentage 15500 1727096207.89717: checking to see if all hosts have failed and the running result is not ok 15500 1727096207.89718: done checking to see if all 
hosts have failed 15500 1727096207.89719: getting the remaining hosts for this loop 15500 1727096207.89720: done getting the remaining hosts for this loop 15500 1727096207.89724: getting the next task for host managed_node1 15500 1727096207.89729: done getting next task for host managed_node1 15500 1727096207.89731: ^ task is: TASK: meta (flush_handlers) 15500 1727096207.89733: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096207.89737: getting variables 15500 1727096207.89738: in VariableManager get_vars() 15500 1727096207.89766: Calling all_inventory to load vars for managed_node1 15500 1727096207.89771: Calling groups_inventory to load vars for managed_node1 15500 1727096207.89774: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096207.89784: Calling all_plugins_play to load vars for managed_node1 15500 1727096207.89786: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096207.89789: Calling groups_plugins_play to load vars for managed_node1 15500 1727096207.90271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096207.90760: done with get_vars() 15500 1727096207.90776: done getting variables 15500 1727096207.90961: in VariableManager get_vars() 15500 1727096207.90977: Calling all_inventory to load vars for managed_node1 15500 1727096207.90980: Calling groups_inventory to load vars for managed_node1 15500 1727096207.90982: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096207.90987: Calling all_plugins_play to load vars for managed_node1 15500 1727096207.90989: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096207.90991: Calling groups_plugins_play to load vars for managed_node1 15500 1727096207.91245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096207.91658: done with get_vars() 15500 1727096207.91726: done queuing things up, now waiting for results queue to drain 15500 1727096207.91729: results queue empty 15500 1727096207.91730: checking for any_errors_fatal 15500 1727096207.91735: done checking for any_errors_fatal 15500 1727096207.91738: checking for max_fail_percentage 15500 1727096207.91739: done checking for max_fail_percentage 15500 1727096207.91744: checking to see if all hosts have failed and the running result is not ok 15500 1727096207.91745: done checking to see if all hosts have failed 15500 1727096207.91745: getting the remaining hosts for this loop 15500 1727096207.91746: done getting the remaining hosts for this loop 15500 1727096207.91750: getting the next task for host managed_node1 15500 1727096207.91754: done getting next task for host managed_node1 15500 1727096207.91757: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15500 1727096207.91759: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096207.91771: getting variables 15500 1727096207.91772: in VariableManager get_vars() 15500 1727096207.91786: Calling all_inventory to load vars for managed_node1 15500 1727096207.91789: Calling groups_inventory to load vars for managed_node1 15500 1727096207.91790: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096207.91796: Calling all_plugins_play to load vars for managed_node1 15500 1727096207.91798: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096207.91800: Calling groups_plugins_play to load vars for managed_node1 15500 1727096207.92166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096207.92513: done with get_vars() 15500 1727096207.92523: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 08:56:47 -0400 (0:00:00.953) 0:00:07.969 ****** 15500 1727096207.92608: entering _queue_task() for managed_node1/include_tasks 15500 1727096207.92983: worker is 1 (out of 1 available) 15500 1727096207.92995: exiting _queue_task() for managed_node1/include_tasks 15500 1727096207.93008: done queuing things up, now waiting for results queue to drain 15500 1727096207.93009: waiting for pending results... 15500 1727096207.93313: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15500 1727096207.93429: in run() - task 0afff68d-5257-877d-2da0-000000000014 15500 1727096207.93450: variable 'ansible_search_path' from source: unknown 15500 1727096207.93465: variable 'ansible_search_path' from source: unknown 15500 1727096207.93514: calling self._execute() 15500 1727096207.93605: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096207.93675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096207.93680: variable 'omit' from source: magic vars 15500 1727096207.94039: variable 'ansible_distribution_major_version' from source: facts 15500 1727096207.94049: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096207.94054: _execute() done 15500 1727096207.94057: dumping result to json 15500 1727096207.94063: done dumping result, returning 15500 1727096207.94072: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-877d-2da0-000000000014] 15500 1727096207.94077: sending task result for task 0afff68d-5257-877d-2da0-000000000014 15500 1727096207.94184: done sending task result for task 0afff68d-5257-877d-2da0-000000000014 15500 1727096207.94188: WORKER PROCESS EXITING 15500 1727096207.94227: no more pending results, returning what we have 15500 1727096207.94232: in VariableManager get_vars() 15500 1727096207.94275: Calling all_inventory to load vars for managed_node1 15500 1727096207.94278: Calling groups_inventory to load vars for managed_node1 15500 1727096207.94280: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096207.94292: Calling all_plugins_play to load vars for managed_node1 15500 1727096207.94294: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096207.94301: Calling groups_plugins_play to load vars for managed_node1 15500 1727096207.94469: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096207.94655: done with get_vars() 15500 1727096207.94662: variable 'ansible_search_path' from source: unknown 15500 1727096207.94663: variable 'ansible_search_path' from source: unknown 15500 1727096207.94690: we have included files to process 15500 1727096207.94691: generating all_blocks data 15500 1727096207.94692: done generating all_blocks data 15500 1727096207.94693: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15500 1727096207.94694: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15500 1727096207.94696: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15500 1727096207.95174: done processing included file 15500 1727096207.95176: iterating over new_blocks loaded from include file 15500 1727096207.95177: in VariableManager get_vars() 15500 1727096207.95190: done with get_vars() 15500 1727096207.95191: filtering new block on tags 15500 1727096207.95201: done filtering new block on tags 15500 1727096207.95203: in VariableManager get_vars() 15500 1727096207.95213: done with get_vars() 15500 1727096207.95214: filtering new block on tags 15500 1727096207.95224: done filtering new block on tags 15500 1727096207.95225: in VariableManager get_vars() 15500 1727096207.95236: done with get_vars() 15500 1727096207.95236: filtering new block on tags 15500 1727096207.95245: done filtering new block on tags 15500 1727096207.95246: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 15500 1727096207.95249: extending task lists for all hosts with included blocks 15500 1727096207.95481: done extending task lists 15500 1727096207.95483: done processing included files 15500 1727096207.95484: results queue empty 15500 1727096207.95484: checking for any_errors_fatal 15500 1727096207.95486: done checking for any_errors_fatal 15500 1727096207.95486: checking for max_fail_percentage 15500 1727096207.95487: done checking for max_fail_percentage 15500 1727096207.95488: checking to see if all hosts have failed and the running result is not ok 15500 1727096207.95489: done checking to see if all hosts have failed 15500 1727096207.95489: getting the remaining hosts for this loop 15500 1727096207.95491: done getting the remaining hosts for this loop 15500 1727096207.95493: getting the next task for host managed_node1 15500 1727096207.95496: done getting next task for host managed_node1 15500 1727096207.95499: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15500 1727096207.95501: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096207.95509: getting variables 15500 1727096207.95510: in VariableManager get_vars() 15500 1727096207.95522: Calling all_inventory to load vars for managed_node1 15500 1727096207.95524: Calling groups_inventory to load vars for managed_node1 15500 1727096207.95525: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096207.95530: Calling all_plugins_play to load vars for managed_node1 15500 1727096207.95532: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096207.95534: Calling groups_plugins_play to load vars for managed_node1 15500 1727096207.95692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096207.95903: done with get_vars() 15500 1727096207.95912: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 08:56:47 -0400 (0:00:00.033) 0:00:08.002 ****** 15500 1727096207.95978: entering _queue_task() for managed_node1/setup 15500 1727096207.96297: worker is 1 (out of 1 available) 15500 1727096207.96311: exiting _queue_task() for managed_node1/setup 15500 1727096207.96323: done queuing things up, now waiting for results queue to drain 15500 1727096207.96324: waiting for pending results... 15500 1727096207.96659: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15500 1727096207.96692: in run() - task 0afff68d-5257-877d-2da0-00000000018d 15500 1727096207.96720: variable 'ansible_search_path' from source: unknown 15500 1727096207.96729: variable 'ansible_search_path' from source: unknown 15500 1727096207.96751: calling self._execute() 15500 1727096207.96901: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096207.96907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096207.96910: variable 'omit' from source: magic vars 15500 1727096207.97373: variable 'ansible_distribution_major_version' from source: facts 15500 1727096207.97376: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096207.97485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096207.99136: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096207.99187: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096207.99215: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096207.99239: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096207.99263: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096207.99323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096207.99343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 15500 1727096207.99369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096207.99393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096207.99404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096207.99442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096207.99459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096207.99481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096207.99506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096207.99516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096207.99625: variable '__network_required_facts' from source: role '' defaults 15500 1727096207.99633: variable 'ansible_facts' from source: unknown 15500 1727096207.99700: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15500 1727096207.99704: when evaluation is False, skipping this task 15500 1727096207.99707: _execute() done 15500 1727096207.99710: dumping result to json 15500 1727096207.99712: done dumping result, returning 15500 1727096207.99719: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-877d-2da0-00000000018d] 15500 1727096207.99723: sending task result for task 0afff68d-5257-877d-2da0-00000000018d 15500 1727096207.99812: done sending task result for task 0afff68d-5257-877d-2da0-00000000018d 15500 1727096207.99815: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15500 1727096207.99886: no more pending results, returning what we have 15500 1727096207.99889: results queue empty 15500 1727096207.99890: checking for any_errors_fatal 15500 1727096207.99891: done checking for any_errors_fatal 15500 1727096207.99892: checking for max_fail_percentage 15500 1727096207.99893: done checking for max_fail_percentage 15500 1727096207.99894: checking to see if all hosts have failed and the running result is not ok 15500 1727096207.99895: done checking to see if all hosts have failed 15500 1727096207.99895: getting the remaining hosts for this loop 15500 1727096207.99897: done getting the remaining hosts for 
this loop 15500 1727096207.99900: getting the next task for host managed_node1 15500 1727096207.99910: done getting next task for host managed_node1 15500 1727096207.99913: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15500 1727096207.99916: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096207.99928: getting variables 15500 1727096207.99930: in VariableManager get_vars() 15500 1727096207.99967: Calling all_inventory to load vars for managed_node1 15500 1727096207.99971: Calling groups_inventory to load vars for managed_node1 15500 1727096207.99973: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096207.99982: Calling all_plugins_play to load vars for managed_node1 15500 1727096207.99985: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096207.99987: Calling groups_plugins_play to load vars for managed_node1 15500 1727096208.00153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096208.00382: done with get_vars() 15500 1727096208.00394: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 08:56:48 -0400 (0:00:00.045) 0:00:08.048 ****** 15500 1727096208.00493: entering _queue_task() for managed_node1/stat 15500 1727096208.00778: worker is 1 (out of 1 available) 15500 1727096208.00791: exiting _queue_task() for managed_node1/stat 15500 1727096208.00806: done queuing things up, now waiting for results queue to drain 15500 1727096208.00807: waiting for pending results... 
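Editor's note: the skip of "Ensure ansible_facts used by role are present" above comes from the quoted conditional __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 evaluating to False. A minimal Python sketch of the same set-difference check, under the assumption of hypothetical fact names (the real __network_required_facts list lives in the role defaults and is not shown in this log):

# Hypothetical list of facts the role requires; not the role's actual defaults.
required_facts = ["distribution", "distribution_major_version", "os_family"]

# Hypothetical subset of the facts already gathered by the earlier setup task.
gathered_facts = {
    "distribution": "ExampleOS",
    "distribution_major_version": "10",
    "os_family": "ExampleFamily",
}

# Jinja2's difference filter is a set difference: required facts that are missing.
missing = set(required_facts) - set(gathered_facts.keys())

# The setup task only re-runs when something is missing; an empty difference
# makes the conditional False, which is why the log records a skip here.
run_setup = len(missing) > 0
print(run_setup)  # False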
15500 1727096208.01097: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 15500 1727096208.01165: in run() - task 0afff68d-5257-877d-2da0-00000000018f 15500 1727096208.01183: variable 'ansible_search_path' from source: unknown 15500 1727096208.01187: variable 'ansible_search_path' from source: unknown 15500 1727096208.01211: calling self._execute() 15500 1727096208.01283: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096208.01288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096208.01297: variable 'omit' from source: magic vars 15500 1727096208.01556: variable 'ansible_distribution_major_version' from source: facts 15500 1727096208.01569: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096208.01696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096208.01946: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096208.01982: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096208.02006: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096208.02029: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096208.02097: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096208.02114: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096208.02133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096208.02151: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096208.02217: variable '__network_is_ostree' from source: set_fact 15500 1727096208.02223: Evaluated conditional (not __network_is_ostree is defined): False 15500 1727096208.02226: when evaluation is False, skipping this task 15500 1727096208.02228: _execute() done 15500 1727096208.02231: dumping result to json 15500 1727096208.02234: done dumping result, returning 15500 1727096208.02243: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-877d-2da0-00000000018f] 15500 1727096208.02246: sending task result for task 0afff68d-5257-877d-2da0-00000000018f 15500 1727096208.02329: done sending task result for task 0afff68d-5257-877d-2da0-00000000018f 15500 1727096208.02332: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15500 1727096208.02401: no more pending results, returning what we have 15500 1727096208.02405: results queue empty 15500 1727096208.02406: checking for any_errors_fatal 15500 1727096208.02412: done checking for any_errors_fatal 15500 1727096208.02413: checking for 
max_fail_percentage 15500 1727096208.02414: done checking for max_fail_percentage 15500 1727096208.02415: checking to see if all hosts have failed and the running result is not ok 15500 1727096208.02416: done checking to see if all hosts have failed 15500 1727096208.02417: getting the remaining hosts for this loop 15500 1727096208.02418: done getting the remaining hosts for this loop 15500 1727096208.02421: getting the next task for host managed_node1 15500 1727096208.02427: done getting next task for host managed_node1 15500 1727096208.02430: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15500 1727096208.02433: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096208.02447: getting variables 15500 1727096208.02448: in VariableManager get_vars() 15500 1727096208.02480: Calling all_inventory to load vars for managed_node1 15500 1727096208.02482: Calling groups_inventory to load vars for managed_node1 15500 1727096208.02484: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096208.02492: Calling all_plugins_play to load vars for managed_node1 15500 1727096208.02494: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096208.02496: Calling groups_plugins_play to load vars for managed_node1 15500 1727096208.02651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096208.02772: done with get_vars() 15500 1727096208.02779: done getting variables 15500 1727096208.02819: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 08:56:48 -0400 (0:00:00.023) 0:00:08.071 ****** 15500 1727096208.02842: entering _queue_task() for managed_node1/set_fact 15500 1727096208.03102: worker is 1 (out of 1 available) 15500 1727096208.03113: exiting _queue_task() for managed_node1/set_fact 15500 1727096208.03125: done queuing things up, now waiting for results queue to drain 15500 1727096208.03127: waiting for pending results... 
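Editor's note: both ostree-related tasks in this stretch skip because __network_is_ostree is already defined from an earlier set_fact, so "not __network_is_ostree is defined" evaluates to False. A small Python sketch of that run-once / cached-flag pattern; the marker path used below is an assumption for illustration, not taken from this log:

import os

fact_cache = {}  # stands in for the host's fact store

def is_ostree():
    # Probe only if the flag has never been set; afterwards reuse the cached
    # value, just as the role skips its stat/set_fact tasks once the fact exists.
    if "__network_is_ostree" not in fact_cache:
        # Hypothetical marker file for an ostree-based system.
        fact_cache["__network_is_ostree"] = os.path.exists("/run/ostree-booted")
    return fact_cache["__network_is_ostree"]

print(is_ostree())  # first call probes the filesystem
print(is_ostree())  # second call returns the cached flag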
15500 1727096208.03508: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15500 1727096208.03530: in run() - task 0afff68d-5257-877d-2da0-000000000190 15500 1727096208.03555: variable 'ansible_search_path' from source: unknown 15500 1727096208.03567: variable 'ansible_search_path' from source: unknown 15500 1727096208.03613: calling self._execute() 15500 1727096208.03707: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096208.03722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096208.03739: variable 'omit' from source: magic vars 15500 1727096208.04140: variable 'ansible_distribution_major_version' from source: facts 15500 1727096208.04164: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096208.04334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096208.04624: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096208.04683: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096208.04774: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096208.04777: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096208.04866: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096208.04908: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096208.04945: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096208.05005: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096208.05080: variable '__network_is_ostree' from source: set_fact 15500 1727096208.05097: Evaluated conditional (not __network_is_ostree is defined): False 15500 1727096208.05169: when evaluation is False, skipping this task 15500 1727096208.05174: _execute() done 15500 1727096208.05177: dumping result to json 15500 1727096208.05180: done dumping result, returning 15500 1727096208.05184: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-877d-2da0-000000000190] 15500 1727096208.05187: sending task result for task 0afff68d-5257-877d-2da0-000000000190 15500 1727096208.05259: done sending task result for task 0afff68d-5257-877d-2da0-000000000190 15500 1727096208.05262: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15500 1727096208.05308: no more pending results, returning what we have 15500 1727096208.05312: results queue empty 15500 1727096208.05312: checking for any_errors_fatal 15500 1727096208.05318: done checking for any_errors_fatal 15500 
1727096208.05318: checking for max_fail_percentage 15500 1727096208.05320: done checking for max_fail_percentage 15500 1727096208.05321: checking to see if all hosts have failed and the running result is not ok 15500 1727096208.05322: done checking to see if all hosts have failed 15500 1727096208.05323: getting the remaining hosts for this loop 15500 1727096208.05324: done getting the remaining hosts for this loop 15500 1727096208.05329: getting the next task for host managed_node1 15500 1727096208.05339: done getting next task for host managed_node1 15500 1727096208.05344: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15500 1727096208.05348: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096208.05365: getting variables 15500 1727096208.05370: in VariableManager get_vars() 15500 1727096208.05417: Calling all_inventory to load vars for managed_node1 15500 1727096208.05420: Calling groups_inventory to load vars for managed_node1 15500 1727096208.05423: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096208.05435: Calling all_plugins_play to load vars for managed_node1 15500 1727096208.05438: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096208.05443: Calling groups_plugins_play to load vars for managed_node1 15500 1727096208.05845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096208.06139: done with get_vars() 15500 1727096208.06149: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 08:56:48 -0400 (0:00:00.034) 0:00:08.105 ****** 15500 1727096208.06257: entering _queue_task() for managed_node1/service_facts 15500 1727096208.06259: Creating lock for service_facts 15500 1727096208.06805: worker is 1 (out of 1 available) 15500 1727096208.06813: exiting _queue_task() for managed_node1/service_facts 15500 1727096208.06828: done queuing things up, now waiting for results queue to drain 15500 1727096208.06829: waiting for pending results... 
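Editor's note: the service_facts task queued here returns the large "ansible_facts.services" mapping shown further below (one entry per systemd unit, each with name/state/status/source), which the role presumably consults when picking a usable network backend. A minimal Python sketch of consuming that structure, using a small subset copied from the output later in this log:

# Subset of the services mapping reported further down in this run.
services = {
    "NetworkManager.service": {"name": "NetworkManager.service", "state": "running",
                               "status": "enabled", "source": "systemd"},
    "nfs-server.service": {"name": "nfs-server.service", "state": "stopped",
                           "status": "disabled", "source": "systemd"},
    "network.service": {"name": "network.service", "state": "stopped",
                        "status": "not-found", "source": "systemd"},
}

# A comparable check is simple membership plus state filtering.
running = sorted(name for name, svc in services.items() if svc["state"] == "running")
print(running)  # ['NetworkManager.service']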
15500 1727096208.07007: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 15500 1727096208.07054: in run() - task 0afff68d-5257-877d-2da0-000000000192 15500 1727096208.07062: variable 'ansible_search_path' from source: unknown 15500 1727096208.07066: variable 'ansible_search_path' from source: unknown 15500 1727096208.07122: calling self._execute() 15500 1727096208.07127: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096208.07130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096208.07144: variable 'omit' from source: magic vars 15500 1727096208.07512: variable 'ansible_distribution_major_version' from source: facts 15500 1727096208.07516: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096208.07518: variable 'omit' from source: magic vars 15500 1727096208.07571: variable 'omit' from source: magic vars 15500 1727096208.07595: variable 'omit' from source: magic vars 15500 1727096208.07629: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096208.07664: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096208.07816: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096208.07819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096208.07822: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096208.07825: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096208.07827: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096208.07830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096208.07942: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096208.07946: Set connection var ansible_pipelining to False 15500 1727096208.07948: Set connection var ansible_timeout to 10 15500 1727096208.07950: Set connection var ansible_shell_type to sh 15500 1727096208.07952: Set connection var ansible_shell_executable to /bin/sh 15500 1727096208.07955: Set connection var ansible_connection to ssh 15500 1727096208.07957: variable 'ansible_shell_executable' from source: unknown 15500 1727096208.07959: variable 'ansible_connection' from source: unknown 15500 1727096208.07962: variable 'ansible_module_compression' from source: unknown 15500 1727096208.07965: variable 'ansible_shell_type' from source: unknown 15500 1727096208.07971: variable 'ansible_shell_executable' from source: unknown 15500 1727096208.07973: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096208.07977: variable 'ansible_pipelining' from source: unknown 15500 1727096208.07979: variable 'ansible_timeout' from source: unknown 15500 1727096208.07985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096208.08135: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15500 1727096208.08146: variable 'omit' from source: magic vars 15500 
1727096208.08150: starting attempt loop 15500 1727096208.08153: running the handler 15500 1727096208.08169: _low_level_execute_command(): starting 15500 1727096208.08176: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096208.08801: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096208.08806: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096208.08809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096208.08850: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096208.08861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096208.08875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096208.08955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096208.10702: stdout chunk (state=3): >>>/root <<< 15500 1727096208.10801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096208.10840: stderr chunk (state=3): >>><<< 15500 1727096208.10843: stdout chunk (state=3): >>><<< 15500 1727096208.10861: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096208.10938: _low_level_execute_command(): starting 15500 1727096208.10942: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727096208.108692-15896-16036544942707 `" && echo ansible-tmp-1727096208.108692-15896-16036544942707="` echo /root/.ansible/tmp/ansible-tmp-1727096208.108692-15896-16036544942707 `" ) && sleep 0' 15500 1727096208.11373: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096208.11379: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096208.11437: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096208.11514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096208.11518: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096208.11577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096208.11652: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096208.13675: stdout chunk (state=3): >>>ansible-tmp-1727096208.108692-15896-16036544942707=/root/.ansible/tmp/ansible-tmp-1727096208.108692-15896-16036544942707 <<< 15500 1727096208.13771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096208.13820: stderr chunk (state=3): >>><<< 15500 1727096208.13824: stdout chunk (state=3): >>><<< 15500 1727096208.13839: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096208.108692-15896-16036544942707=/root/.ansible/tmp/ansible-tmp-1727096208.108692-15896-16036544942707 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096208.13889: variable 
'ansible_module_compression' from source: unknown 15500 1727096208.13986: ANSIBALLZ: Using lock for service_facts 15500 1727096208.13989: ANSIBALLZ: Acquiring lock 15500 1727096208.13991: ANSIBALLZ: Lock acquired: 140712178943808 15500 1727096208.13993: ANSIBALLZ: Creating module 15500 1727096208.25034: ANSIBALLZ: Writing module into payload 15500 1727096208.25100: ANSIBALLZ: Writing module 15500 1727096208.25119: ANSIBALLZ: Renaming module 15500 1727096208.25130: ANSIBALLZ: Done creating module 15500 1727096208.25146: variable 'ansible_facts' from source: unknown 15500 1727096208.25196: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096208.108692-15896-16036544942707/AnsiballZ_service_facts.py 15500 1727096208.25298: Sending initial data 15500 1727096208.25301: Sent initial data (160 bytes) 15500 1727096208.26311: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096208.26326: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096208.26341: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096208.26448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096208.28162: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15500 1727096208.28196: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096208.28296: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15500 1727096208.28377: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp5mxqosxy /root/.ansible/tmp/ansible-tmp-1727096208.108692-15896-16036544942707/AnsiballZ_service_facts.py <<< 15500 1727096208.28397: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096208.108692-15896-16036544942707/AnsiballZ_service_facts.py" <<< 15500 1727096208.28447: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp5mxqosxy" to remote "/root/.ansible/tmp/ansible-tmp-1727096208.108692-15896-16036544942707/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096208.108692-15896-16036544942707/AnsiballZ_service_facts.py" <<< 15500 1727096208.29688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096208.29722: stderr chunk (state=3): >>><<< 15500 1727096208.29736: stdout chunk (state=3): >>><<< 15500 1727096208.29772: done transferring module to remote 15500 1727096208.29865: _low_level_execute_command(): starting 15500 1727096208.29870: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096208.108692-15896-16036544942707/ /root/.ansible/tmp/ansible-tmp-1727096208.108692-15896-16036544942707/AnsiballZ_service_facts.py && sleep 0' 15500 1727096208.30655: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096208.30673: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096208.30748: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096208.30814: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096208.32833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096208.32837: stdout chunk (state=3): >>><<< 15500 1727096208.32840: stderr chunk (state=3): >>><<< 15500 1727096208.32856: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096208.32950: _low_level_execute_command(): starting 15500 1727096208.32954: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096208.108692-15896-16036544942707/AnsiballZ_service_facts.py && sleep 0' 15500 1727096208.33546: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096208.33571: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096208.33590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096208.33700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096209.96754: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 15500 1727096209.96784: stdout chunk (state=3): >>>rk.service": {"name": 
"selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", 
"status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", 
"state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state":<<< 15500 1727096209.96794: stdout chunk (state=3): >>> "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": 
"enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15500 1727096209.98409: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 15500 1727096209.98435: stderr chunk (state=3): >>><<< 15500 1727096209.98438: stdout chunk (state=3): >>><<< 15500 1727096209.98464: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": 
"systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, 
"wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": 
"systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
15500 1727096209.98837: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096208.108692-15896-16036544942707/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096209.98845: _low_level_execute_command(): starting 15500 1727096209.98850: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096208.108692-15896-16036544942707/ > /dev/null 2>&1 && sleep 0' 15500 1727096209.99327: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096209.99331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096209.99334: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096209.99336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096209.99393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096209.99397: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096209.99399: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096209.99476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096210.01364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096210.01392: stderr chunk (state=3): >>><<< 15500 1727096210.01396: stdout chunk (state=3): >>><<< 15500 1727096210.01416: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096210.01419: handler run complete 15500 1727096210.01536: variable 'ansible_facts' from source: unknown 15500 1727096210.01631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096210.01890: variable 'ansible_facts' from source: unknown 15500 1727096210.01975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096210.02089: attempt loop complete, returning result 15500 1727096210.02093: _execute() done 15500 1727096210.02095: dumping result to json 15500 1727096210.02129: done dumping result, returning 15500 1727096210.02138: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-877d-2da0-000000000192] 15500 1727096210.02140: sending task result for task 0afff68d-5257-877d-2da0-000000000192 15500 1727096210.02853: done sending task result for task 0afff68d-5257-877d-2da0-000000000192 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15500 1727096210.02905: no more pending results, returning what we have 15500 1727096210.02909: results queue empty 15500 1727096210.02910: checking for any_errors_fatal 15500 1727096210.02913: done checking for any_errors_fatal 15500 1727096210.02914: checking for max_fail_percentage 15500 1727096210.02915: done checking for max_fail_percentage 15500 1727096210.02916: checking to see if all hosts have failed and the running result is not ok 15500 1727096210.02917: done checking to see if all hosts have failed 15500 1727096210.02917: getting the remaining hosts for this loop 15500 1727096210.02919: done getting the remaining hosts for this loop 15500 1727096210.02927: getting the next task for host managed_node1 15500 1727096210.02932: done getting next task for host managed_node1 15500 1727096210.02935: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15500 1727096210.02938: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096210.02949: getting variables 15500 1727096210.02950: in VariableManager get_vars() 15500 1727096210.02986: Calling all_inventory to load vars for managed_node1 15500 1727096210.02989: Calling groups_inventory to load vars for managed_node1 15500 1727096210.02993: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096210.03002: Calling all_plugins_play to load vars for managed_node1 15500 1727096210.03005: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096210.03013: WORKER PROCESS EXITING 15500 1727096210.03033: Calling groups_plugins_play to load vars for managed_node1 15500 1727096210.03413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096210.03850: done with get_vars() 15500 1727096210.03869: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 08:56:50 -0400 (0:00:01.977) 0:00:10.082 ****** 15500 1727096210.03965: entering _queue_task() for managed_node1/package_facts 15500 1727096210.03969: Creating lock for package_facts 15500 1727096210.04309: worker is 1 (out of 1 available) 15500 1727096210.04320: exiting _queue_task() for managed_node1/package_facts 15500 1727096210.04335: done queuing things up, now waiting for results queue to drain 15500 1727096210.04336: waiting for pending results... 15500 1727096210.04614: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 15500 1727096210.04774: in run() - task 0afff68d-5257-877d-2da0-000000000193 15500 1727096210.04778: variable 'ansible_search_path' from source: unknown 15500 1727096210.04781: variable 'ansible_search_path' from source: unknown 15500 1727096210.04813: calling self._execute() 15500 1727096210.04896: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096210.04904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096210.04912: variable 'omit' from source: magic vars 15500 1727096210.05209: variable 'ansible_distribution_major_version' from source: facts 15500 1727096210.05219: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096210.05227: variable 'omit' from source: magic vars 15500 1727096210.05269: variable 'omit' from source: magic vars 15500 1727096210.05294: variable 'omit' from source: magic vars 15500 1727096210.05327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096210.05357: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096210.05378: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096210.05391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096210.05400: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096210.05426: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096210.05429: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096210.05432: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 15500 1727096210.05510: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096210.05514: Set connection var ansible_pipelining to False 15500 1727096210.05519: Set connection var ansible_timeout to 10 15500 1727096210.05523: Set connection var ansible_shell_type to sh 15500 1727096210.05526: Set connection var ansible_shell_executable to /bin/sh 15500 1727096210.05534: Set connection var ansible_connection to ssh 15500 1727096210.05550: variable 'ansible_shell_executable' from source: unknown 15500 1727096210.05555: variable 'ansible_connection' from source: unknown 15500 1727096210.05558: variable 'ansible_module_compression' from source: unknown 15500 1727096210.05560: variable 'ansible_shell_type' from source: unknown 15500 1727096210.05562: variable 'ansible_shell_executable' from source: unknown 15500 1727096210.05565: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096210.05570: variable 'ansible_pipelining' from source: unknown 15500 1727096210.05573: variable 'ansible_timeout' from source: unknown 15500 1727096210.05578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096210.05725: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15500 1727096210.05734: variable 'omit' from source: magic vars 15500 1727096210.05739: starting attempt loop 15500 1727096210.05742: running the handler 15500 1727096210.05756: _low_level_execute_command(): starting 15500 1727096210.05766: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096210.06282: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096210.06289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096210.06311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096210.06359: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096210.06366: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096210.06371: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096210.06436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096210.08155: stdout chunk (state=3): >>>/root <<< 15500 1727096210.08248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096210.08287: stderr chunk (state=3): >>><<< 15500 1727096210.08291: stdout chunk 
(state=3): >>><<< 15500 1727096210.08312: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096210.08324: _low_level_execute_command(): starting 15500 1727096210.08331: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096210.0831227-15972-139864340529132 `" && echo ansible-tmp-1727096210.0831227-15972-139864340529132="` echo /root/.ansible/tmp/ansible-tmp-1727096210.0831227-15972-139864340529132 `" ) && sleep 0' 15500 1727096210.08922: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096210.08934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096210.08971: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096210.08997: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096210.09031: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096210.09130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096210.11111: stdout chunk (state=3): >>>ansible-tmp-1727096210.0831227-15972-139864340529132=/root/.ansible/tmp/ansible-tmp-1727096210.0831227-15972-139864340529132 <<< 15500 1727096210.11226: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096210.11281: stderr chunk (state=3): >>><<< 15500 1727096210.11308: stdout chunk 
(state=3): >>><<< 15500 1727096210.11474: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096210.0831227-15972-139864340529132=/root/.ansible/tmp/ansible-tmp-1727096210.0831227-15972-139864340529132 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096210.11481: variable 'ansible_module_compression' from source: unknown 15500 1727096210.11484: ANSIBALLZ: Using lock for package_facts 15500 1727096210.11511: ANSIBALLZ: Acquiring lock 15500 1727096210.11522: ANSIBALLZ: Lock acquired: 140712176845360 15500 1727096210.11529: ANSIBALLZ: Creating module 15500 1727096210.44130: ANSIBALLZ: Writing module into payload 15500 1727096210.44242: ANSIBALLZ: Writing module 15500 1727096210.44281: ANSIBALLZ: Renaming module 15500 1727096210.44286: ANSIBALLZ: Done creating module 15500 1727096210.44319: variable 'ansible_facts' from source: unknown 15500 1727096210.44443: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096210.0831227-15972-139864340529132/AnsiballZ_package_facts.py 15500 1727096210.44555: Sending initial data 15500 1727096210.44563: Sent initial data (162 bytes) 15500 1727096210.45260: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096210.45326: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096210.45404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 
1727096210.47100: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096210.47162: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15500 1727096210.47222: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpymi782qz /root/.ansible/tmp/ansible-tmp-1727096210.0831227-15972-139864340529132/AnsiballZ_package_facts.py <<< 15500 1727096210.47228: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096210.0831227-15972-139864340529132/AnsiballZ_package_facts.py" <<< 15500 1727096210.47288: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpymi782qz" to remote "/root/.ansible/tmp/ansible-tmp-1727096210.0831227-15972-139864340529132/AnsiballZ_package_facts.py" <<< 15500 1727096210.47295: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096210.0831227-15972-139864340529132/AnsiballZ_package_facts.py" <<< 15500 1727096210.48825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096210.48959: stderr chunk (state=3): >>><<< 15500 1727096210.48963: stdout chunk (state=3): >>><<< 15500 1727096210.48965: done transferring module to remote 15500 1727096210.48974: _low_level_execute_command(): starting 15500 1727096210.48977: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096210.0831227-15972-139864340529132/ /root/.ansible/tmp/ansible-tmp-1727096210.0831227-15972-139864340529132/AnsiballZ_package_facts.py && sleep 0' 15500 1727096210.49544: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096210.49548: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096210.49550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096210.49553: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 15500 1727096210.49555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096210.49560: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096210.49562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 15500 1727096210.49573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096210.49618: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096210.49622: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096210.49624: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096210.49705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096210.51676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096210.51680: stdout chunk (state=3): >>><<< 15500 1727096210.51683: stderr chunk (state=3): >>><<< 15500 1727096210.51685: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096210.51688: _low_level_execute_command(): starting 15500 1727096210.51690: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096210.0831227-15972-139864340529132/AnsiballZ_package_facts.py && sleep 0' 15500 1727096210.52310: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096210.52327: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096210.52331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096210.52370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096210.52375: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096210.52378: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096210.52380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 
10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096210.52429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096210.52439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096210.52452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096210.52522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096210.97266: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": 
"nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": 
"11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 15500 1727096210.97296: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 15500 1727096210.97310: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": 
"256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 15500 1727096210.97398: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": 
[{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, 
"arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 15500 1727096210.97405: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": 
"5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15500 1727096210.99260: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 15500 1727096210.99264: stderr chunk (state=3): >>><<< 15500 1727096210.99271: stdout chunk (state=3): >>><<< 15500 1727096210.99397: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 15500 1727096211.01896: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096210.0831227-15972-139864340529132/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096211.01900: _low_level_execute_command(): starting 15500 1727096211.01905: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096210.0831227-15972-139864340529132/ > /dev/null 2>&1 && sleep 0' 15500 1727096211.02472: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096211.02477: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 15500 1727096211.02480: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 15500 1727096211.02482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096211.02521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096211.02525: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096211.02530: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096211.02613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096211.04524: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096211.04552: stderr chunk (state=3): >>><<< 15500 1727096211.04555: stdout chunk (state=3): >>><<< 15500 1727096211.04573: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096211.04579: handler run complete 15500 1727096211.05045: variable 'ansible_facts' from source: unknown 15500 1727096211.05303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096211.07208: variable 'ansible_facts' from source: unknown 15500 1727096211.07596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096211.08117: attempt loop complete, returning result 15500 1727096211.08129: _execute() done 15500 1727096211.08132: dumping result to json 15500 1727096211.08295: done dumping result, returning 15500 1727096211.08305: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-877d-2da0-000000000193] 15500 1727096211.08314: sending task result for task 0afff68d-5257-877d-2da0-000000000193 15500 1727096211.09882: done sending task result for task 0afff68d-5257-877d-2da0-000000000193 15500 1727096211.09886: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15500 1727096211.09933: no more pending results, returning what we have 15500 1727096211.09937: results queue empty 15500 1727096211.09937: checking for any_errors_fatal 15500 1727096211.09941: done checking for any_errors_fatal 15500 1727096211.09942: checking for max_fail_percentage 15500 1727096211.09944: done checking for max_fail_percentage 15500 1727096211.09944: checking to see if all hosts have failed and the running result is not ok 15500 1727096211.09945: done checking to see if all hosts have failed 15500 1727096211.09945: getting the remaining hosts for this loop 15500 1727096211.09946: done getting the remaining hosts for this loop 15500 1727096211.09949: getting the next task for host managed_node1 15500 1727096211.09954: done getting next task for host managed_node1 15500 1727096211.09958: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15500 1727096211.09960: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096211.09969: getting variables 15500 1727096211.09970: in VariableManager get_vars() 15500 1727096211.09994: Calling all_inventory to load vars for managed_node1 15500 1727096211.09996: Calling groups_inventory to load vars for managed_node1 15500 1727096211.09997: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096211.10003: Calling all_plugins_play to load vars for managed_node1 15500 1727096211.10005: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096211.10006: Calling groups_plugins_play to load vars for managed_node1 15500 1727096211.10777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096211.11835: done with get_vars() 15500 1727096211.11857: done getting variables 15500 1727096211.11905: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:56:51 -0400 (0:00:01.079) 0:00:11.162 ****** 15500 1727096211.11927: entering _queue_task() for managed_node1/debug 15500 1727096211.12220: worker is 1 (out of 1 available) 15500 1727096211.12235: exiting _queue_task() for managed_node1/debug 15500 1727096211.12254: done queuing things up, now waiting for results queue to drain 15500 1727096211.12257: waiting for pending results... 15500 1727096211.12517: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 15500 1727096211.12585: in run() - task 0afff68d-5257-877d-2da0-000000000015 15500 1727096211.12621: variable 'ansible_search_path' from source: unknown 15500 1727096211.12625: variable 'ansible_search_path' from source: unknown 15500 1727096211.12639: calling self._execute() 15500 1727096211.12733: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096211.12740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096211.12743: variable 'omit' from source: magic vars 15500 1727096211.13134: variable 'ansible_distribution_major_version' from source: facts 15500 1727096211.13138: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096211.13143: variable 'omit' from source: magic vars 15500 1727096211.13203: variable 'omit' from source: magic vars 15500 1727096211.13306: variable 'network_provider' from source: set_fact 15500 1727096211.13309: variable 'omit' from source: magic vars 15500 1727096211.13351: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096211.13401: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096211.13404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096211.13417: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096211.13427: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 
1727096211.13483: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096211.13489: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096211.13492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096211.13586: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096211.13589: Set connection var ansible_pipelining to False 15500 1727096211.13603: Set connection var ansible_timeout to 10 15500 1727096211.13610: Set connection var ansible_shell_type to sh 15500 1727096211.13613: Set connection var ansible_shell_executable to /bin/sh 15500 1727096211.13615: Set connection var ansible_connection to ssh 15500 1727096211.13673: variable 'ansible_shell_executable' from source: unknown 15500 1727096211.13677: variable 'ansible_connection' from source: unknown 15500 1727096211.13679: variable 'ansible_module_compression' from source: unknown 15500 1727096211.13681: variable 'ansible_shell_type' from source: unknown 15500 1727096211.13683: variable 'ansible_shell_executable' from source: unknown 15500 1727096211.13685: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096211.13687: variable 'ansible_pipelining' from source: unknown 15500 1727096211.13689: variable 'ansible_timeout' from source: unknown 15500 1727096211.13690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096211.13801: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096211.13809: variable 'omit' from source: magic vars 15500 1727096211.13814: starting attempt loop 15500 1727096211.13818: running the handler 15500 1727096211.13858: handler run complete 15500 1727096211.13872: attempt loop complete, returning result 15500 1727096211.13879: _execute() done 15500 1727096211.13883: dumping result to json 15500 1727096211.13886: done dumping result, returning 15500 1727096211.13889: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-877d-2da0-000000000015] 15500 1727096211.13937: sending task result for task 0afff68d-5257-877d-2da0-000000000015 15500 1727096211.14006: done sending task result for task 0afff68d-5257-877d-2da0-000000000015 15500 1727096211.14009: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 15500 1727096211.14065: no more pending results, returning what we have 15500 1727096211.14071: results queue empty 15500 1727096211.14072: checking for any_errors_fatal 15500 1727096211.14081: done checking for any_errors_fatal 15500 1727096211.14082: checking for max_fail_percentage 15500 1727096211.14083: done checking for max_fail_percentage 15500 1727096211.14084: checking to see if all hosts have failed and the running result is not ok 15500 1727096211.14085: done checking to see if all hosts have failed 15500 1727096211.14085: getting the remaining hosts for this loop 15500 1727096211.14087: done getting the remaining hosts for this loop 15500 1727096211.14091: getting the next task for host managed_node1 15500 1727096211.14097: done getting next task for host managed_node1 15500 1727096211.14100: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 15500 1727096211.14102: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096211.14111: getting variables 15500 1727096211.14113: in VariableManager get_vars() 15500 1727096211.14150: Calling all_inventory to load vars for managed_node1 15500 1727096211.14153: Calling groups_inventory to load vars for managed_node1 15500 1727096211.14155: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096211.14164: Calling all_plugins_play to load vars for managed_node1 15500 1727096211.14166: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096211.14198: Calling groups_plugins_play to load vars for managed_node1 15500 1727096211.15010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096211.16353: done with get_vars() 15500 1727096211.16403: done getting variables 15500 1727096211.16497: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 08:56:51 -0400 (0:00:00.046) 0:00:11.208 ****** 15500 1727096211.16544: entering _queue_task() for managed_node1/fail 15500 1727096211.16546: Creating lock for fail 15500 1727096211.16850: worker is 1 (out of 1 available) 15500 1727096211.16867: exiting _queue_task() for managed_node1/fail 15500 1727096211.16882: done queuing things up, now waiting for results queue to drain 15500 1727096211.16883: waiting for pending results... 
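The package facts gathered above (module args {"manager": ["auto"], "strategy": "first"}, result censored because no_log was set) and the "Using network provider: nm" message just printed would come from tasks roughly like the sketch below; the task bodies are reconstructed from this log only and are assumptions, not the role's actual tasks/main.yml:

    - name: Check which packages are installed
      package_facts:
        manager: auto
        strategy: first        # matches the module_args recorded in the result above
      no_log: true             # why the task result is shown as censored

    - name: Print network provider
      debug:
        msg: "Using network provider: {{ network_provider }}"   # network_provider comes from set_fact per this log
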
15500 1727096211.17209: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15500 1727096211.17214: in run() - task 0afff68d-5257-877d-2da0-000000000016 15500 1727096211.17218: variable 'ansible_search_path' from source: unknown 15500 1727096211.17220: variable 'ansible_search_path' from source: unknown 15500 1727096211.17575: calling self._execute() 15500 1727096211.17579: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096211.17583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096211.17585: variable 'omit' from source: magic vars 15500 1727096211.17863: variable 'ansible_distribution_major_version' from source: facts 15500 1727096211.17889: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096211.18024: variable 'network_state' from source: role '' defaults 15500 1727096211.18146: Evaluated conditional (network_state != {}): False 15500 1727096211.18150: when evaluation is False, skipping this task 15500 1727096211.18152: _execute() done 15500 1727096211.18154: dumping result to json 15500 1727096211.18159: done dumping result, returning 15500 1727096211.18162: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-877d-2da0-000000000016] 15500 1727096211.18165: sending task result for task 0afff68d-5257-877d-2da0-000000000016 15500 1727096211.18238: done sending task result for task 0afff68d-5257-877d-2da0-000000000016 15500 1727096211.18241: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15500 1727096211.18300: no more pending results, returning what we have 15500 1727096211.18304: results queue empty 15500 1727096211.18305: checking for any_errors_fatal 15500 1727096211.18312: done checking for any_errors_fatal 15500 1727096211.18313: checking for max_fail_percentage 15500 1727096211.18315: done checking for max_fail_percentage 15500 1727096211.18315: checking to see if all hosts have failed and the running result is not ok 15500 1727096211.18316: done checking to see if all hosts have failed 15500 1727096211.18317: getting the remaining hosts for this loop 15500 1727096211.18319: done getting the remaining hosts for this loop 15500 1727096211.18322: getting the next task for host managed_node1 15500 1727096211.18329: done getting next task for host managed_node1 15500 1727096211.18333: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15500 1727096211.18336: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096211.18351: getting variables 15500 1727096211.18353: in VariableManager get_vars() 15500 1727096211.18399: Calling all_inventory to load vars for managed_node1 15500 1727096211.18402: Calling groups_inventory to load vars for managed_node1 15500 1727096211.18405: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096211.18418: Calling all_plugins_play to load vars for managed_node1 15500 1727096211.18421: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096211.18424: Calling groups_plugins_play to load vars for managed_node1 15500 1727096211.19436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096211.20298: done with get_vars() 15500 1727096211.20328: done getting variables 15500 1727096211.20397: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:56:51 -0400 (0:00:00.038) 0:00:11.247 ****** 15500 1727096211.20428: entering _queue_task() for managed_node1/fail 15500 1727096211.20775: worker is 1 (out of 1 available) 15500 1727096211.20788: exiting _queue_task() for managed_node1/fail 15500 1727096211.20800: done queuing things up, now waiting for results queue to drain 15500 1727096211.20802: waiting for pending results... 
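The banner above introduces the next guard, which ties the same network_state check to the managed host's distribution major version. A hedged task-level sketch of that gate (it would slot into a play like the one sketched earlier; the exact combination of clauses is an assumption, not the role's file):

- name: Abort when network_state is requested on a release older than EL8
  ansible.builtin.fail:
    msg: Applying network_state requires distribution major version 8 or later.
  when:
    - network_state != {}
    - ansible_distribution_major_version | int < 8
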
15500 1727096211.21187: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15500 1727096211.21191: in run() - task 0afff68d-5257-877d-2da0-000000000017 15500 1727096211.21200: variable 'ansible_search_path' from source: unknown 15500 1727096211.21206: variable 'ansible_search_path' from source: unknown 15500 1727096211.21245: calling self._execute() 15500 1727096211.21348: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096211.21364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096211.21382: variable 'omit' from source: magic vars 15500 1727096211.21755: variable 'ansible_distribution_major_version' from source: facts 15500 1727096211.21777: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096211.21905: variable 'network_state' from source: role '' defaults 15500 1727096211.21920: Evaluated conditional (network_state != {}): False 15500 1727096211.21927: when evaluation is False, skipping this task 15500 1727096211.21942: _execute() done 15500 1727096211.21949: dumping result to json 15500 1727096211.21960: done dumping result, returning 15500 1727096211.21977: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-877d-2da0-000000000017] 15500 1727096211.21987: sending task result for task 0afff68d-5257-877d-2da0-000000000017 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15500 1727096211.22256: no more pending results, returning what we have 15500 1727096211.22263: results queue empty 15500 1727096211.22263: checking for any_errors_fatal 15500 1727096211.22273: done checking for any_errors_fatal 15500 1727096211.22274: checking for max_fail_percentage 15500 1727096211.22276: done checking for max_fail_percentage 15500 1727096211.22277: checking to see if all hosts have failed and the running result is not ok 15500 1727096211.22277: done checking to see if all hosts have failed 15500 1727096211.22278: getting the remaining hosts for this loop 15500 1727096211.22280: done getting the remaining hosts for this loop 15500 1727096211.22284: getting the next task for host managed_node1 15500 1727096211.22292: done getting next task for host managed_node1 15500 1727096211.22296: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15500 1727096211.22299: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096211.22316: getting variables 15500 1727096211.22318: in VariableManager get_vars() 15500 1727096211.22365: Calling all_inventory to load vars for managed_node1 15500 1727096211.22562: Calling groups_inventory to load vars for managed_node1 15500 1727096211.22568: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096211.22576: done sending task result for task 0afff68d-5257-877d-2da0-000000000017 15500 1727096211.22579: WORKER PROCESS EXITING 15500 1727096211.22588: Calling all_plugins_play to load vars for managed_node1 15500 1727096211.22591: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096211.22594: Calling groups_plugins_play to load vars for managed_node1 15500 1727096211.23935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096211.25407: done with get_vars() 15500 1727096211.25439: done getting variables 15500 1727096211.25490: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 08:56:51 -0400 (0:00:00.050) 0:00:11.298 ****** 15500 1727096211.25515: entering _queue_task() for managed_node1/fail 15500 1727096211.25775: worker is 1 (out of 1 available) 15500 1727096211.25790: exiting _queue_task() for managed_node1/fail 15500 1727096211.25803: done queuing things up, now waiting for results queue to drain 15500 1727096211.25804: waiting for pending results... 
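The teaming guard named in the banner above ("EL10 or later") is driven by the facts the log evaluates next: the distribution major version compared as an integer, membership of the distribution in the role's __network_rh_distros list, and whether any team-type connection is requested. A hedged reconstruction of how those clauses could combine into a single fail task (my sketch, not the role's tasks/main.yml; __network_rh_distros is the role default named in the log):

- name: Abort applying teaming configuration on EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on this distribution release.
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
    - >-
      network_connections | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0
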
15500 1727096211.25973: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15500 1727096211.26027: in run() - task 0afff68d-5257-877d-2da0-000000000018 15500 1727096211.26041: variable 'ansible_search_path' from source: unknown 15500 1727096211.26044: variable 'ansible_search_path' from source: unknown 15500 1727096211.26072: calling self._execute() 15500 1727096211.26140: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096211.26144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096211.26156: variable 'omit' from source: magic vars 15500 1727096211.26426: variable 'ansible_distribution_major_version' from source: facts 15500 1727096211.26435: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096211.26560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096211.28726: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096211.28785: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096211.28813: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096211.28838: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096211.28857: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096211.28922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.28943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.28963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.28995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.29006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.29080: variable 'ansible_distribution_major_version' from source: facts 15500 1727096211.29095: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15500 1727096211.29174: variable 'ansible_distribution' from source: facts 15500 1727096211.29178: variable '__network_rh_distros' from source: role '' defaults 15500 1727096211.29186: Evaluated conditional (ansible_distribution in __network_rh_distros): True 15500 1727096211.29343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.29361: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.29381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.29406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.29418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.29454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.29473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.29490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.29514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.29528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.29560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.29578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.29594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.29617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.29629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.29819: variable 'network_connections' from source: play vars 15500 1727096211.29828: variable 'interface' from source: set_fact 15500 1727096211.29884: variable 'interface' from source: set_fact 15500 1727096211.29891: variable 'interface' from source: set_fact 15500 1727096211.29934: variable 'interface' from source: set_fact 15500 1727096211.29942: variable 'network_state' from source: role '' defaults 15500 
1727096211.29994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096211.30132: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096211.30372: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096211.30376: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096211.30378: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096211.30380: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096211.30391: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096211.30422: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.30454: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096211.30511: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 15500 1727096211.30520: when evaluation is False, skipping this task 15500 1727096211.30528: _execute() done 15500 1727096211.30535: dumping result to json 15500 1727096211.30543: done dumping result, returning 15500 1727096211.30559: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-877d-2da0-000000000018] 15500 1727096211.30574: sending task result for task 0afff68d-5257-877d-2da0-000000000018 skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 15500 1727096211.30744: no more pending results, returning what we have 15500 1727096211.30748: results queue empty 15500 1727096211.30748: checking for any_errors_fatal 15500 1727096211.30753: done checking for any_errors_fatal 15500 1727096211.30754: checking for max_fail_percentage 15500 1727096211.30756: done checking for max_fail_percentage 15500 1727096211.30760: checking to see if all hosts have failed and the running result is not ok 15500 1727096211.30761: done checking to see if all hosts have failed 15500 1727096211.30761: getting the remaining hosts for this loop 15500 1727096211.30763: done getting the remaining hosts for this loop 15500 1727096211.30770: getting the next task for host managed_node1 15500 1727096211.30777: done getting next task for host managed_node1 15500 1727096211.30782: ^ task is: TASK: 
fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15500 1727096211.30784: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096211.30798: getting variables 15500 1727096211.30800: in VariableManager get_vars() 15500 1727096211.30843: Calling all_inventory to load vars for managed_node1 15500 1727096211.30846: Calling groups_inventory to load vars for managed_node1 15500 1727096211.30849: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096211.30864: Calling all_plugins_play to load vars for managed_node1 15500 1727096211.31080: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096211.31087: done sending task result for task 0afff68d-5257-877d-2da0-000000000018 15500 1727096211.31090: WORKER PROCESS EXITING 15500 1727096211.31095: Calling groups_plugins_play to load vars for managed_node1 15500 1727096211.32203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096211.33065: done with get_vars() 15500 1727096211.33087: done getting variables 15500 1727096211.33194: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 08:56:51 -0400 (0:00:00.077) 0:00:11.375 ****** 15500 1727096211.33229: entering _queue_task() for managed_node1/dnf 15500 1727096211.33596: worker is 1 (out of 1 available) 15500 1727096211.33608: exiting _queue_task() for managed_node1/dnf 15500 1727096211.33620: done queuing things up, now waiting for results queue to drain 15500 1727096211.33622: waiting for pending results... 
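The false_condition quoted in the result above chains selectattr filters with Ansible's match test to ask whether any entry in network_connections (or in network_state["interfaces"]) declares type team. A minimal standalone playbook that exercises the same expression against invented sample data (the connection names team0 and eth0 are hypothetical):

- hosts: localhost
  gather_facts: false
  vars:
    network_connections:
      - name: team0            # hypothetical sample entry
        type: team
      - name: eth0             # hypothetical sample entry
        type: ethernet
  tasks:
    - name: Report whether any team-type connection is defined
      ansible.builtin.debug:
        msg: >-
          {{ network_connections
             | selectattr("type", "defined")
             | selectattr("type", "match", "^team$")
             | list | length > 0 }}

With no team-typed entries in the data the expression evaluates to False, which is why the guard above was skipped in this run.
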
15500 1727096211.33910: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15500 1727096211.34075: in run() - task 0afff68d-5257-877d-2da0-000000000019 15500 1727096211.34079: variable 'ansible_search_path' from source: unknown 15500 1727096211.34082: variable 'ansible_search_path' from source: unknown 15500 1727096211.34085: calling self._execute() 15500 1727096211.34160: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096211.34173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096211.34186: variable 'omit' from source: magic vars 15500 1727096211.34547: variable 'ansible_distribution_major_version' from source: facts 15500 1727096211.34565: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096211.34752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096211.36412: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096211.36463: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096211.36491: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096211.36517: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096211.36540: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096211.36602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.36621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.36644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.36670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.36681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.36770: variable 'ansible_distribution' from source: facts 15500 1727096211.36774: variable 'ansible_distribution_major_version' from source: facts 15500 1727096211.36786: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15500 1727096211.36865: variable '__network_wireless_connections_defined' from source: role '' defaults 15500 1727096211.36948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.36966: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.36991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.37016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.37026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.37172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.37175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.37177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.37179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.37181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.37213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.37240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.37269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.37311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.37330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.37508: variable 'network_connections' from source: play vars 15500 1727096211.37525: variable 'interface' from source: set_fact 15500 1727096211.37597: variable 'interface' from source: set_fact 15500 1727096211.37612: variable 'interface' from source: set_fact 15500 1727096211.37674: variable 'interface' from source: set_fact 15500 1727096211.37745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 15500 1727096211.37937: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096211.37984: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096211.38018: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096211.38051: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096211.38103: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096211.38138: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096211.38185: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.38203: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096211.38249: variable '__network_team_connections_defined' from source: role '' defaults 15500 1727096211.38420: variable 'network_connections' from source: play vars 15500 1727096211.38424: variable 'interface' from source: set_fact 15500 1727096211.38471: variable 'interface' from source: set_fact 15500 1727096211.38479: variable 'interface' from source: set_fact 15500 1727096211.38522: variable 'interface' from source: set_fact 15500 1727096211.38547: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15500 1727096211.38551: when evaluation is False, skipping this task 15500 1727096211.38553: _execute() done 15500 1727096211.38555: dumping result to json 15500 1727096211.38560: done dumping result, returning 15500 1727096211.38567: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-877d-2da0-000000000019] 15500 1727096211.38573: sending task result for task 0afff68d-5257-877d-2da0-000000000019 15500 1727096211.38669: done sending task result for task 0afff68d-5257-877d-2da0-000000000019 15500 1727096211.38672: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15500 1727096211.38747: no more pending results, returning what we have 15500 1727096211.38751: results queue empty 15500 1727096211.38751: checking for any_errors_fatal 15500 1727096211.38759: done checking for any_errors_fatal 15500 1727096211.38759: checking for max_fail_percentage 15500 1727096211.38761: done checking for max_fail_percentage 15500 1727096211.38762: checking to see if all hosts have failed and the running result is not ok 15500 1727096211.38763: done checking to see if all hosts have failed 15500 1727096211.38763: getting the remaining hosts for this loop 15500 1727096211.38765: done getting the remaining hosts for this loop 15500 
1727096211.38770: getting the next task for host managed_node1 15500 1727096211.38776: done getting next task for host managed_node1 15500 1727096211.38781: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15500 1727096211.38783: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096211.38796: getting variables 15500 1727096211.38798: in VariableManager get_vars() 15500 1727096211.38832: Calling all_inventory to load vars for managed_node1 15500 1727096211.38835: Calling groups_inventory to load vars for managed_node1 15500 1727096211.38837: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096211.38844: Calling all_plugins_play to load vars for managed_node1 15500 1727096211.38846: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096211.38849: Calling groups_plugins_play to load vars for managed_node1 15500 1727096211.39639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096211.40696: done with get_vars() 15500 1727096211.40727: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15500 1727096211.40812: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 08:56:51 -0400 (0:00:00.076) 0:00:11.451 ****** 15500 1727096211.40847: entering _queue_task() for managed_node1/yum 15500 1727096211.40849: Creating lock for yum 15500 1727096211.41214: worker is 1 (out of 1 available) 15500 1727096211.41229: exiting _queue_task() for managed_node1/yum 15500 1727096211.41243: done queuing things up, now waiting for results queue to drain 15500 1727096211.41245: waiting for pending results... 
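The skip above hinges on two convenience flags, __network_wireless_connections_defined and __network_team_connections_defined, which the log shows coming from the role's defaults. A hedged approximation of how such flags could be declared in a defaults/main.yml-style vars file (the variable names are taken from the log; the expressions, including the ^wireless$ pattern, are assumptions, and the real role may compute them differently, for example by also consulting network_state):

__network_team_connections_defined: >-
  {{ network_connections | selectattr('type', 'defined')
     | selectattr('type', 'match', '^team$') | list | length > 0 }}
__network_wireless_connections_defined: >-
  {{ network_connections | selectattr('type', 'defined')
     | selectattr('type', 'match', '^wireless$') | list | length > 0 }}
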
15500 1727096211.41520: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15500 1727096211.41588: in run() - task 0afff68d-5257-877d-2da0-00000000001a 15500 1727096211.41600: variable 'ansible_search_path' from source: unknown 15500 1727096211.41604: variable 'ansible_search_path' from source: unknown 15500 1727096211.41633: calling self._execute() 15500 1727096211.41776: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096211.41780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096211.41783: variable 'omit' from source: magic vars 15500 1727096211.42152: variable 'ansible_distribution_major_version' from source: facts 15500 1727096211.42186: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096211.42342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096211.44574: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096211.44626: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096211.44663: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096211.44787: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096211.44791: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096211.44836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.44869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.44891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.44921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.44950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.45036: variable 'ansible_distribution_major_version' from source: facts 15500 1727096211.45040: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15500 1727096211.45052: when evaluation is False, skipping this task 15500 1727096211.45055: _execute() done 15500 1727096211.45061: dumping result to json 15500 1727096211.45063: done dumping result, returning 15500 1727096211.45066: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-877d-2da0-00000000001a] 15500 
1727096211.45076: sending task result for task 0afff68d-5257-877d-2da0-00000000001a skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15500 1727096211.45293: no more pending results, returning what we have 15500 1727096211.45296: results queue empty 15500 1727096211.45297: checking for any_errors_fatal 15500 1727096211.45303: done checking for any_errors_fatal 15500 1727096211.45304: checking for max_fail_percentage 15500 1727096211.45305: done checking for max_fail_percentage 15500 1727096211.45306: checking to see if all hosts have failed and the running result is not ok 15500 1727096211.45307: done checking to see if all hosts have failed 15500 1727096211.45307: getting the remaining hosts for this loop 15500 1727096211.45309: done getting the remaining hosts for this loop 15500 1727096211.45312: getting the next task for host managed_node1 15500 1727096211.45318: done getting next task for host managed_node1 15500 1727096211.45321: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15500 1727096211.45323: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096211.45338: getting variables 15500 1727096211.45340: in VariableManager get_vars() 15500 1727096211.45376: Calling all_inventory to load vars for managed_node1 15500 1727096211.45379: Calling groups_inventory to load vars for managed_node1 15500 1727096211.45381: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096211.45389: Calling all_plugins_play to load vars for managed_node1 15500 1727096211.45391: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096211.45393: Calling groups_plugins_play to load vars for managed_node1 15500 1727096211.45962: done sending task result for task 0afff68d-5257-877d-2da0-00000000001a 15500 1727096211.45966: WORKER PROCESS EXITING 15500 1727096211.46364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096211.47219: done with get_vars() 15500 1727096211.47241: done getting variables 15500 1727096211.47290: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 08:56:51 -0400 (0:00:00.064) 0:00:11.516 ****** 15500 1727096211.47316: entering _queue_task() for managed_node1/fail 15500 1727096211.47582: worker is 1 (out of 1 available) 15500 1727096211.47595: exiting _queue_task() for managed_node1/fail 15500 1727096211.47608: done queuing things up, now waiting for results queue to drain 15500 1727096211.47609: waiting for pending results... 
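Two details from the block above are worth pinning down: the ansible.builtin.yum action is transparently redirected to ansible.builtin.dnf on this host, and the guard ansible_distribution_major_version | int < 8 casts the fact to an integer because distribution version facts are strings. A hedged sketch of that version split, using debug stand-ins instead of real package-manager calls (assumes facts have been gathered so the fact is available):

- name: Take the YUM code path only on EL7 and older (stand-in)
  ansible.builtin.debug:
    msg: legacy yum handling would run here
  when: ansible_distribution_major_version | int < 8

- name: Take the DNF code path on EL8 and later (stand-in)
  ansible.builtin.debug:
    msg: dnf-based handling would run here
  when: ansible_distribution_major_version | int >= 8
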
15500 1727096211.47801: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15500 1727096211.47867: in run() - task 0afff68d-5257-877d-2da0-00000000001b 15500 1727096211.47881: variable 'ansible_search_path' from source: unknown 15500 1727096211.47884: variable 'ansible_search_path' from source: unknown 15500 1727096211.47914: calling self._execute() 15500 1727096211.47988: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096211.47992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096211.48001: variable 'omit' from source: magic vars 15500 1727096211.48280: variable 'ansible_distribution_major_version' from source: facts 15500 1727096211.48290: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096211.48372: variable '__network_wireless_connections_defined' from source: role '' defaults 15500 1727096211.48507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096211.50806: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096211.50848: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096211.50880: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096211.50916: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096211.50937: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096211.51031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.51173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.51176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.51179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.51181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.51221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.51238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.51255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.51289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.51297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.51327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.51349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.51471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.51474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.51477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.51588: variable 'network_connections' from source: play vars 15500 1727096211.51599: variable 'interface' from source: set_fact 15500 1727096211.51710: variable 'interface' from source: set_fact 15500 1727096211.51714: variable 'interface' from source: set_fact 15500 1727096211.51742: variable 'interface' from source: set_fact 15500 1727096211.51827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096211.52224: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096211.52254: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096211.52282: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096211.52304: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096211.52339: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096211.52354: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096211.52378: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.52397: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096211.52446: 
variable '__network_team_connections_defined' from source: role '' defaults 15500 1727096211.52629: variable 'network_connections' from source: play vars 15500 1727096211.52633: variable 'interface' from source: set_fact 15500 1727096211.52681: variable 'interface' from source: set_fact 15500 1727096211.52687: variable 'interface' from source: set_fact 15500 1727096211.52729: variable 'interface' from source: set_fact 15500 1727096211.52756: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15500 1727096211.52759: when evaluation is False, skipping this task 15500 1727096211.52763: _execute() done 15500 1727096211.52766: dumping result to json 15500 1727096211.52771: done dumping result, returning 15500 1727096211.52779: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-877d-2da0-00000000001b] 15500 1727096211.52789: sending task result for task 0afff68d-5257-877d-2da0-00000000001b 15500 1727096211.52875: done sending task result for task 0afff68d-5257-877d-2da0-00000000001b 15500 1727096211.52877: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15500 1727096211.52923: no more pending results, returning what we have 15500 1727096211.52927: results queue empty 15500 1727096211.52928: checking for any_errors_fatal 15500 1727096211.52933: done checking for any_errors_fatal 15500 1727096211.52933: checking for max_fail_percentage 15500 1727096211.52935: done checking for max_fail_percentage 15500 1727096211.52936: checking to see if all hosts have failed and the running result is not ok 15500 1727096211.52937: done checking to see if all hosts have failed 15500 1727096211.52937: getting the remaining hosts for this loop 15500 1727096211.52939: done getting the remaining hosts for this loop 15500 1727096211.52943: getting the next task for host managed_node1 15500 1727096211.52948: done getting next task for host managed_node1 15500 1727096211.52951: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15500 1727096211.52953: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096211.52966: getting variables 15500 1727096211.52975: in VariableManager get_vars() 15500 1727096211.53012: Calling all_inventory to load vars for managed_node1 15500 1727096211.53014: Calling groups_inventory to load vars for managed_node1 15500 1727096211.53016: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096211.53026: Calling all_plugins_play to load vars for managed_node1 15500 1727096211.53031: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096211.53034: Calling groups_plugins_play to load vars for managed_node1 15500 1727096211.54132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096211.55149: done with get_vars() 15500 1727096211.55190: done getting variables 15500 1727096211.55242: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 08:56:51 -0400 (0:00:00.079) 0:00:11.595 ****** 15500 1727096211.55268: entering _queue_task() for managed_node1/package 15500 1727096211.55549: worker is 1 (out of 1 available) 15500 1727096211.55562: exiting _queue_task() for managed_node1/package 15500 1727096211.55577: done queuing things up, now waiting for results queue to drain 15500 1727096211.55579: waiting for pending results... 15500 1727096211.55793: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 15500 1727096211.55874: in run() - task 0afff68d-5257-877d-2da0-00000000001c 15500 1727096211.55886: variable 'ansible_search_path' from source: unknown 15500 1727096211.55890: variable 'ansible_search_path' from source: unknown 15500 1727096211.55945: calling self._execute() 15500 1727096211.56037: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096211.56052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096211.56063: variable 'omit' from source: magic vars 15500 1727096211.56436: variable 'ansible_distribution_major_version' from source: facts 15500 1727096211.56445: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096211.56659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096211.56971: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096211.57011: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096211.57041: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096211.57082: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096211.57221: variable 'network_packages' from source: role '' defaults 15500 1727096211.57351: variable '__network_provider_setup' from source: role '' defaults 15500 1727096211.57363: variable '__network_service_name_default_nm' from source: role '' defaults 15500 1727096211.57434: variable 
'__network_service_name_default_nm' from source: role '' defaults 15500 1727096211.57437: variable '__network_packages_default_nm' from source: role '' defaults 15500 1727096211.57486: variable '__network_packages_default_nm' from source: role '' defaults 15500 1727096211.57648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096211.59078: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096211.59131: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096211.59162: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096211.59188: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096211.59208: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096211.59275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.59295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.59312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.59338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.59348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.59388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.59404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.59420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.59445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.59455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.59610: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15500 1727096211.59693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.59709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.59777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.59780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.59828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.59877: variable 'ansible_python' from source: facts 15500 1727096211.59901: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15500 1727096211.59962: variable '__network_wpa_supplicant_required' from source: role '' defaults 15500 1727096211.60024: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15500 1727096211.60109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.60132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.60147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.60178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.60188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.60222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.60244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.60264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.60291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.60301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.60581: variable 'network_connections' from source: play vars 15500 1727096211.60584: variable 'interface' from source: set_fact 15500 1727096211.60644: variable 'interface' from source: set_fact 15500 1727096211.60652: variable 'interface' from source: set_fact 15500 1727096211.60726: variable 'interface' from source: set_fact 15500 1727096211.60785: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096211.60808: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096211.61072: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.61076: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096211.61086: variable '__network_wireless_connections_defined' from source: role '' defaults 15500 1727096211.61446: variable 'network_connections' from source: play vars 15500 1727096211.61450: variable 'interface' from source: set_fact 15500 1727096211.61672: variable 'interface' from source: set_fact 15500 1727096211.61676: variable 'interface' from source: set_fact 15500 1727096211.61722: variable 'interface' from source: set_fact 15500 1727096211.61908: variable '__network_packages_default_wireless' from source: role '' defaults 15500 1727096211.61911: variable '__network_wireless_connections_defined' from source: role '' defaults 15500 1727096211.62107: variable 'network_connections' from source: play vars 15500 1727096211.62111: variable 'interface' from source: set_fact 15500 1727096211.62163: variable 'interface' from source: set_fact 15500 1727096211.62170: variable 'interface' from source: set_fact 15500 1727096211.62213: variable 'interface' from source: set_fact 15500 1727096211.62244: variable '__network_packages_default_team' from source: role '' defaults 15500 1727096211.62472: variable '__network_team_connections_defined' from source: role '' defaults 15500 1727096211.62632: variable 'network_connections' from source: play vars 15500 1727096211.62643: variable 'interface' from source: set_fact 15500 1727096211.62710: variable 'interface' from source: set_fact 15500 1727096211.62722: variable 'interface' from source: set_fact 15500 1727096211.62788: variable 'interface' from source: set_fact 15500 1727096211.62851: variable '__network_service_name_default_initscripts' from source: role '' defaults 15500 1727096211.62915: variable '__network_service_name_default_initscripts' from source: role '' defaults 15500 1727096211.62927: variable '__network_packages_default_initscripts' from source: role '' defaults 15500 1727096211.62993: variable '__network_packages_default_initscripts' from source: role '' defaults 15500 1727096211.63225: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15500 1727096211.63679: variable 'network_connections' from source: play vars 15500 1727096211.63689: variable 'interface' from source: set_fact 15500 
1727096211.63749: variable 'interface' from source: set_fact 15500 1727096211.63761: variable 'interface' from source: set_fact 15500 1727096211.63827: variable 'interface' from source: set_fact 15500 1727096211.63844: variable 'ansible_distribution' from source: facts 15500 1727096211.63854: variable '__network_rh_distros' from source: role '' defaults 15500 1727096211.63864: variable 'ansible_distribution_major_version' from source: facts 15500 1727096211.63902: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15500 1727096211.64070: variable 'ansible_distribution' from source: facts 15500 1727096211.64084: variable '__network_rh_distros' from source: role '' defaults 15500 1727096211.64095: variable 'ansible_distribution_major_version' from source: facts 15500 1727096211.64109: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15500 1727096211.64263: variable 'ansible_distribution' from source: facts 15500 1727096211.64272: variable '__network_rh_distros' from source: role '' defaults 15500 1727096211.64275: variable 'ansible_distribution_major_version' from source: facts 15500 1727096211.64302: variable 'network_provider' from source: set_fact 15500 1727096211.64313: variable 'ansible_facts' from source: unknown 15500 1727096211.64744: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 15500 1727096211.64748: when evaluation is False, skipping this task 15500 1727096211.64750: _execute() done 15500 1727096211.64753: dumping result to json 15500 1727096211.64755: done dumping result, returning 15500 1727096211.64763: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-877d-2da0-00000000001c] 15500 1727096211.64769: sending task result for task 0afff68d-5257-877d-2da0-00000000001c 15500 1727096211.64862: done sending task result for task 0afff68d-5257-877d-2da0-00000000001c 15500 1727096211.64864: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15500 1727096211.64915: no more pending results, returning what we have 15500 1727096211.64919: results queue empty 15500 1727096211.64919: checking for any_errors_fatal 15500 1727096211.64925: done checking for any_errors_fatal 15500 1727096211.64926: checking for max_fail_percentage 15500 1727096211.64927: done checking for max_fail_percentage 15500 1727096211.64928: checking to see if all hosts have failed and the running result is not ok 15500 1727096211.64929: done checking to see if all hosts have failed 15500 1727096211.64929: getting the remaining hosts for this loop 15500 1727096211.64931: done getting the remaining hosts for this loop 15500 1727096211.64935: getting the next task for host managed_node1 15500 1727096211.64941: done getting next task for host managed_node1 15500 1727096211.64944: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15500 1727096211.64947: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096211.64961: getting variables 15500 1727096211.64963: in VariableManager get_vars() 15500 1727096211.65008: Calling all_inventory to load vars for managed_node1 15500 1727096211.65011: Calling groups_inventory to load vars for managed_node1 15500 1727096211.65013: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096211.65030: Calling all_plugins_play to load vars for managed_node1 15500 1727096211.65033: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096211.65036: Calling groups_plugins_play to load vars for managed_node1 15500 1727096211.65844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096211.67206: done with get_vars() 15500 1727096211.67236: done getting variables 15500 1727096211.67302: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 08:56:51 -0400 (0:00:00.120) 0:00:11.716 ****** 15500 1727096211.67335: entering _queue_task() for managed_node1/package 15500 1727096211.67696: worker is 1 (out of 1 available) 15500 1727096211.67708: exiting _queue_task() for managed_node1/package 15500 1727096211.67721: done queuing things up, now waiting for results queue to drain 15500 1727096211.67722: waiting for pending results... 
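
The "Install packages" skip above hinges on the ansible.builtin.subset test: installation only happens when at least one entry in network_packages is missing from the gathered package facts. Below is a minimal sketch of a task gated the same way, assuming the package module and reusing the network_packages variable and the false_condition quoted in the log; it is not the role's actual tasks/main.yml.

- name: Install packages
  ansible.builtin.package:   # sketch only, module choice assumed from the task name
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())
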
15500 1727096211.68097: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15500 1727096211.68175: in run() - task 0afff68d-5257-877d-2da0-00000000001d 15500 1727096211.68180: variable 'ansible_search_path' from source: unknown 15500 1727096211.68183: variable 'ansible_search_path' from source: unknown 15500 1727096211.68373: calling self._execute() 15500 1727096211.68377: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096211.68379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096211.68382: variable 'omit' from source: magic vars 15500 1727096211.68741: variable 'ansible_distribution_major_version' from source: facts 15500 1727096211.68761: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096211.68891: variable 'network_state' from source: role '' defaults 15500 1727096211.68910: Evaluated conditional (network_state != {}): False 15500 1727096211.68918: when evaluation is False, skipping this task 15500 1727096211.68929: _execute() done 15500 1727096211.68939: dumping result to json 15500 1727096211.68947: done dumping result, returning 15500 1727096211.68964: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-877d-2da0-00000000001d] 15500 1727096211.68976: sending task result for task 0afff68d-5257-877d-2da0-00000000001d 15500 1727096211.69277: done sending task result for task 0afff68d-5257-877d-2da0-00000000001d 15500 1727096211.69281: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15500 1727096211.69331: no more pending results, returning what we have 15500 1727096211.69336: results queue empty 15500 1727096211.69337: checking for any_errors_fatal 15500 1727096211.69344: done checking for any_errors_fatal 15500 1727096211.69345: checking for max_fail_percentage 15500 1727096211.69347: done checking for max_fail_percentage 15500 1727096211.69347: checking to see if all hosts have failed and the running result is not ok 15500 1727096211.69348: done checking to see if all hosts have failed 15500 1727096211.69349: getting the remaining hosts for this loop 15500 1727096211.69351: done getting the remaining hosts for this loop 15500 1727096211.69355: getting the next task for host managed_node1 15500 1727096211.69366: done getting next task for host managed_node1 15500 1727096211.69373: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15500 1727096211.69375: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096211.69389: getting variables 15500 1727096211.69391: in VariableManager get_vars() 15500 1727096211.69427: Calling all_inventory to load vars for managed_node1 15500 1727096211.69429: Calling groups_inventory to load vars for managed_node1 15500 1727096211.69431: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096211.69442: Calling all_plugins_play to load vars for managed_node1 15500 1727096211.69444: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096211.69447: Calling groups_plugins_play to load vars for managed_node1 15500 1727096211.75188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096211.76723: done with get_vars() 15500 1727096211.76763: done getting variables 15500 1727096211.76818: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 08:56:51 -0400 (0:00:00.095) 0:00:11.811 ****** 15500 1727096211.76846: entering _queue_task() for managed_node1/package 15500 1727096211.77212: worker is 1 (out of 1 available) 15500 1727096211.77225: exiting _queue_task() for managed_node1/package 15500 1727096211.77240: done queuing things up, now waiting for results queue to drain 15500 1727096211.77241: waiting for pending results... 
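
Both network_state-gated installs in this run short-circuit on the same check: the role default for network_state is an empty mapping, so network_state != {} evaluates False and the package action never reaches the host. A hedged sketch of that shape, using the package action the log loads for this task, with the package names assumed from the task title rather than taken from the role source:

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:   # sketch only; package names assumed from the task title
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}
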
15500 1727096211.77519: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15500 1727096211.77640: in run() - task 0afff68d-5257-877d-2da0-00000000001e 15500 1727096211.77663: variable 'ansible_search_path' from source: unknown 15500 1727096211.77674: variable 'ansible_search_path' from source: unknown 15500 1727096211.77716: calling self._execute() 15500 1727096211.77824: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096211.77838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096211.77852: variable 'omit' from source: magic vars 15500 1727096211.78243: variable 'ansible_distribution_major_version' from source: facts 15500 1727096211.78265: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096211.78454: variable 'network_state' from source: role '' defaults 15500 1727096211.78459: Evaluated conditional (network_state != {}): False 15500 1727096211.78462: when evaluation is False, skipping this task 15500 1727096211.78464: _execute() done 15500 1727096211.78466: dumping result to json 15500 1727096211.78470: done dumping result, returning 15500 1727096211.78473: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-877d-2da0-00000000001e] 15500 1727096211.78476: sending task result for task 0afff68d-5257-877d-2da0-00000000001e 15500 1727096211.78553: done sending task result for task 0afff68d-5257-877d-2da0-00000000001e 15500 1727096211.78772: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15500 1727096211.78820: no more pending results, returning what we have 15500 1727096211.78824: results queue empty 15500 1727096211.78825: checking for any_errors_fatal 15500 1727096211.78832: done checking for any_errors_fatal 15500 1727096211.78832: checking for max_fail_percentage 15500 1727096211.78834: done checking for max_fail_percentage 15500 1727096211.78835: checking to see if all hosts have failed and the running result is not ok 15500 1727096211.78836: done checking to see if all hosts have failed 15500 1727096211.78837: getting the remaining hosts for this loop 15500 1727096211.78839: done getting the remaining hosts for this loop 15500 1727096211.78842: getting the next task for host managed_node1 15500 1727096211.78848: done getting next task for host managed_node1 15500 1727096211.78852: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15500 1727096211.78855: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096211.78873: getting variables 15500 1727096211.78875: in VariableManager get_vars() 15500 1727096211.78917: Calling all_inventory to load vars for managed_node1 15500 1727096211.78920: Calling groups_inventory to load vars for managed_node1 15500 1727096211.78923: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096211.78934: Calling all_plugins_play to load vars for managed_node1 15500 1727096211.78937: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096211.78940: Calling groups_plugins_play to load vars for managed_node1 15500 1727096211.80539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096211.82306: done with get_vars() 15500 1727096211.82335: done getting variables 15500 1727096211.82441: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:56:51 -0400 (0:00:00.056) 0:00:11.867 ****** 15500 1727096211.82477: entering _queue_task() for managed_node1/service 15500 1727096211.82479: Creating lock for service 15500 1727096211.82835: worker is 1 (out of 1 available) 15500 1727096211.82847: exiting _queue_task() for managed_node1/service 15500 1727096211.82863: done queuing things up, now waiting for results queue to drain 15500 1727096211.82864: waiting for pending results... 
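
For contrast, the two tasks skipped above would run if the play supplied a non-empty network_state. A minimal, hypothetical invocation is sketched below; the host, interface name, and state settings are placeholders, and only the role name comes from this run.

- hosts: managed_node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:            # any non-empty mapping makes network_state != {} true
          interfaces:
            - name: eth0          # placeholder interface, not taken from this run
              type: ethernet
              state: up
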
15500 1727096211.83146: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15500 1727096211.83253: in run() - task 0afff68d-5257-877d-2da0-00000000001f 15500 1727096211.83277: variable 'ansible_search_path' from source: unknown 15500 1727096211.83287: variable 'ansible_search_path' from source: unknown 15500 1727096211.83330: calling self._execute() 15500 1727096211.83431: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096211.83445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096211.83464: variable 'omit' from source: magic vars 15500 1727096211.83846: variable 'ansible_distribution_major_version' from source: facts 15500 1727096211.83869: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096211.83999: variable '__network_wireless_connections_defined' from source: role '' defaults 15500 1727096211.84241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096211.86936: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096211.87028: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096211.87078: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096211.87116: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096211.87148: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096211.87240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.87291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.87315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.87473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.87476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.87479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.87481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.87499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 15500 1727096211.87542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.87566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.87619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.87648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.87683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.87730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.87750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.87941: variable 'network_connections' from source: play vars 15500 1727096211.87962: variable 'interface' from source: set_fact 15500 1727096211.88142: variable 'interface' from source: set_fact 15500 1727096211.88145: variable 'interface' from source: set_fact 15500 1727096211.88148: variable 'interface' from source: set_fact 15500 1727096211.88209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096211.88398: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096211.88439: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096211.88501: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096211.88534: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096211.88595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096211.88621: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096211.88652: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.88692: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096211.88759: variable '__network_team_connections_defined' from source: role '' defaults 
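
The filter and test plugins loaded here feed the two role booleans that decide whether NetworkManager must be restarted for wireless or team profiles. A sketch of a service task carrying that condition, with the restart shape and the service name assumed from the task title, the when expression taken from the log:

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:   # sketch only; service name assumed from the task title
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
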
15500 1727096211.89016: variable 'network_connections' from source: play vars 15500 1727096211.89027: variable 'interface' from source: set_fact 15500 1727096211.89097: variable 'interface' from source: set_fact 15500 1727096211.89170: variable 'interface' from source: set_fact 15500 1727096211.89181: variable 'interface' from source: set_fact 15500 1727096211.89228: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15500 1727096211.89238: when evaluation is False, skipping this task 15500 1727096211.89247: _execute() done 15500 1727096211.89259: dumping result to json 15500 1727096211.89262: done dumping result, returning 15500 1727096211.89273: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-877d-2da0-00000000001f] 15500 1727096211.89287: sending task result for task 0afff68d-5257-877d-2da0-00000000001f 15500 1727096211.89370: done sending task result for task 0afff68d-5257-877d-2da0-00000000001f 15500 1727096211.89379: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15500 1727096211.89432: no more pending results, returning what we have 15500 1727096211.89436: results queue empty 15500 1727096211.89441: checking for any_errors_fatal 15500 1727096211.89447: done checking for any_errors_fatal 15500 1727096211.89448: checking for max_fail_percentage 15500 1727096211.89450: done checking for max_fail_percentage 15500 1727096211.89451: checking to see if all hosts have failed and the running result is not ok 15500 1727096211.89452: done checking to see if all hosts have failed 15500 1727096211.89452: getting the remaining hosts for this loop 15500 1727096211.89454: done getting the remaining hosts for this loop 15500 1727096211.89457: getting the next task for host managed_node1 15500 1727096211.89464: done getting next task for host managed_node1 15500 1727096211.89469: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15500 1727096211.89471: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096211.89483: getting variables 15500 1727096211.89485: in VariableManager get_vars() 15500 1727096211.89523: Calling all_inventory to load vars for managed_node1 15500 1727096211.89525: Calling groups_inventory to load vars for managed_node1 15500 1727096211.89528: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096211.89536: Calling all_plugins_play to load vars for managed_node1 15500 1727096211.89539: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096211.89541: Calling groups_plugins_play to load vars for managed_node1 15500 1727096211.90457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096211.91636: done with get_vars() 15500 1727096211.91671: done getting variables 15500 1727096211.91741: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:56:51 -0400 (0:00:00.092) 0:00:11.960 ****** 15500 1727096211.91777: entering _queue_task() for managed_node1/service 15500 1727096211.92177: worker is 1 (out of 1 available) 15500 1727096211.92192: exiting _queue_task() for managed_node1/service 15500 1727096211.92207: done queuing things up, now waiting for results queue to drain 15500 1727096211.92208: waiting for pending results... 15500 1727096211.92409: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15500 1727096211.92473: in run() - task 0afff68d-5257-877d-2da0-000000000020 15500 1727096211.92487: variable 'ansible_search_path' from source: unknown 15500 1727096211.92490: variable 'ansible_search_path' from source: unknown 15500 1727096211.92519: calling self._execute() 15500 1727096211.92595: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096211.92599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096211.92609: variable 'omit' from source: magic vars 15500 1727096211.92889: variable 'ansible_distribution_major_version' from source: facts 15500 1727096211.92898: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096211.93014: variable 'network_provider' from source: set_fact 15500 1727096211.93017: variable 'network_state' from source: role '' defaults 15500 1727096211.93030: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15500 1727096211.93037: variable 'omit' from source: magic vars 15500 1727096211.93065: variable 'omit' from source: magic vars 15500 1727096211.93090: variable 'network_service_name' from source: role '' defaults 15500 1727096211.93140: variable 'network_service_name' from source: role '' defaults 15500 1727096211.93273: variable '__network_provider_setup' from source: role '' defaults 15500 1727096211.93277: variable '__network_service_name_default_nm' from source: role '' defaults 15500 1727096211.93292: variable '__network_service_name_default_nm' from source: role '' defaults 15500 1727096211.93306: variable '__network_packages_default_nm' from source: role '' defaults 
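
Unlike the earlier skips, the conditional for "Enable and start NetworkManager" passes (network_state is empty, so network_provider must be "nm"), and the role goes on to resolve network_service_name and dispatch the service action, which the following lines show being packaged as an AnsiballZ systemd module and copied over SSH. A sketch of an enable-and-start task consistent with what the log records; the desired state and enabled flag are assumptions:

- name: Enable and start NetworkManager
  ansible.builtin.service:   # sketch; the run loads the service action plugin for this task
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
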
15500 1727096211.93371: variable '__network_packages_default_nm' from source: role '' defaults 15500 1727096211.93687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096211.97279: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096211.97365: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096211.97414: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096211.97574: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096211.97578: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096211.97583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.97621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.97654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.97703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.97724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.97780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.97812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.97844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.97891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.97915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.98162: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15500 1727096211.98293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.98325: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.98358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.98407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.98479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.98707: variable 'ansible_python' from source: facts 15500 1727096211.98734: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15500 1727096211.98818: variable '__network_wpa_supplicant_required' from source: role '' defaults 15500 1727096211.98905: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15500 1727096211.99030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.99063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.99101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.99144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.99164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.99215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096211.99297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096211.99305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096211.99338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096211.99357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096211.99506: variable 'network_connections' from 
source: play vars 15500 1727096211.99530: variable 'interface' from source: set_fact 15500 1727096211.99599: variable 'interface' from source: set_fact 15500 1727096211.99616: variable 'interface' from source: set_fact 15500 1727096211.99751: variable 'interface' from source: set_fact 15500 1727096211.99807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096212.00040: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096212.00096: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096212.00150: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096212.00196: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096212.00273: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096212.00301: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096212.00342: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096212.00381: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096212.00435: variable '__network_wireless_connections_defined' from source: role '' defaults 15500 1727096212.00775: variable 'network_connections' from source: play vars 15500 1727096212.00782: variable 'interface' from source: set_fact 15500 1727096212.00837: variable 'interface' from source: set_fact 15500 1727096212.00846: variable 'interface' from source: set_fact 15500 1727096212.00910: variable 'interface' from source: set_fact 15500 1727096212.00947: variable '__network_packages_default_wireless' from source: role '' defaults 15500 1727096212.01010: variable '__network_wireless_connections_defined' from source: role '' defaults 15500 1727096212.01196: variable 'network_connections' from source: play vars 15500 1727096212.01199: variable 'interface' from source: set_fact 15500 1727096212.01247: variable 'interface' from source: set_fact 15500 1727096212.01253: variable 'interface' from source: set_fact 15500 1727096212.01305: variable 'interface' from source: set_fact 15500 1727096212.01324: variable '__network_packages_default_team' from source: role '' defaults 15500 1727096212.01379: variable '__network_team_connections_defined' from source: role '' defaults 15500 1727096212.01563: variable 'network_connections' from source: play vars 15500 1727096212.01566: variable 'interface' from source: set_fact 15500 1727096212.01615: variable 'interface' from source: set_fact 15500 1727096212.01620: variable 'interface' from source: set_fact 15500 1727096212.01673: variable 'interface' from source: set_fact 15500 1727096212.01717: variable '__network_service_name_default_initscripts' from source: role '' defaults 15500 1727096212.01762: variable '__network_service_name_default_initscripts' from source: role '' defaults 15500 
1727096212.01766: variable '__network_packages_default_initscripts' from source: role '' defaults 15500 1727096212.01810: variable '__network_packages_default_initscripts' from source: role '' defaults 15500 1727096212.01947: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15500 1727096212.02403: variable 'network_connections' from source: play vars 15500 1727096212.02406: variable 'interface' from source: set_fact 15500 1727096212.02450: variable 'interface' from source: set_fact 15500 1727096212.02455: variable 'interface' from source: set_fact 15500 1727096212.02504: variable 'interface' from source: set_fact 15500 1727096212.02507: variable 'ansible_distribution' from source: facts 15500 1727096212.02510: variable '__network_rh_distros' from source: role '' defaults 15500 1727096212.02516: variable 'ansible_distribution_major_version' from source: facts 15500 1727096212.02541: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15500 1727096212.02718: variable 'ansible_distribution' from source: facts 15500 1727096212.02721: variable '__network_rh_distros' from source: role '' defaults 15500 1727096212.02724: variable 'ansible_distribution_major_version' from source: facts 15500 1727096212.02726: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15500 1727096212.02973: variable 'ansible_distribution' from source: facts 15500 1727096212.02977: variable '__network_rh_distros' from source: role '' defaults 15500 1727096212.02979: variable 'ansible_distribution_major_version' from source: facts 15500 1727096212.02981: variable 'network_provider' from source: set_fact 15500 1727096212.02983: variable 'omit' from source: magic vars 15500 1727096212.02985: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096212.03010: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096212.03020: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096212.03039: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096212.03049: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096212.03080: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096212.03083: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096212.03086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096212.03187: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096212.03191: Set connection var ansible_pipelining to False 15500 1727096212.03197: Set connection var ansible_timeout to 10 15500 1727096212.03200: Set connection var ansible_shell_type to sh 15500 1727096212.03205: Set connection var ansible_shell_executable to /bin/sh 15500 1727096212.03209: Set connection var ansible_connection to ssh 15500 1727096212.03234: variable 'ansible_shell_executable' from source: unknown 15500 1727096212.03238: variable 'ansible_connection' from source: unknown 15500 1727096212.03240: variable 'ansible_module_compression' from source: unknown 15500 1727096212.03242: variable 'ansible_shell_type' from source: unknown 15500 1727096212.03245: variable 
'ansible_shell_executable' from source: unknown 15500 1727096212.03247: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096212.03254: variable 'ansible_pipelining' from source: unknown 15500 1727096212.03259: variable 'ansible_timeout' from source: unknown 15500 1727096212.03261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096212.03443: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096212.03447: variable 'omit' from source: magic vars 15500 1727096212.03449: starting attempt loop 15500 1727096212.03452: running the handler 15500 1727096212.03454: variable 'ansible_facts' from source: unknown 15500 1727096212.04095: _low_level_execute_command(): starting 15500 1727096212.04099: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096212.04623: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096212.04628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 15500 1727096212.04631: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 15500 1727096212.04634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096212.04693: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096212.04696: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096212.04702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096212.04778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096212.06514: stdout chunk (state=3): >>>/root <<< 15500 1727096212.06674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096212.06689: stderr chunk (state=3): >>><<< 15500 1727096212.06719: stdout chunk (state=3): >>><<< 15500 1727096212.06737: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096212.06754: _low_level_execute_command(): starting 15500 1727096212.06773: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096212.0674255-16047-165540260249876 `" && echo ansible-tmp-1727096212.0674255-16047-165540260249876="` echo /root/.ansible/tmp/ansible-tmp-1727096212.0674255-16047-165540260249876 `" ) && sleep 0' 15500 1727096212.07445: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096212.07455: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096212.07565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096212.09541: stdout chunk (state=3): >>>ansible-tmp-1727096212.0674255-16047-165540260249876=/root/.ansible/tmp/ansible-tmp-1727096212.0674255-16047-165540260249876 <<< 15500 1727096212.09691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096212.09729: stderr chunk (state=3): >>><<< 15500 1727096212.09732: stdout chunk (state=3): >>><<< 15500 1727096212.09766: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096212.0674255-16047-165540260249876=/root/.ansible/tmp/ansible-tmp-1727096212.0674255-16047-165540260249876 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096212.09797: variable 'ansible_module_compression' from source: unknown 15500 1727096212.09915: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 15500 1727096212.09923: ANSIBALLZ: Acquiring lock 15500 1727096212.09926: ANSIBALLZ: Lock acquired: 140712178847904 15500 1727096212.09928: ANSIBALLZ: Creating module 15500 1727096212.36875: ANSIBALLZ: Writing module into payload 15500 1727096212.36945: ANSIBALLZ: Writing module 15500 1727096212.36983: ANSIBALLZ: Renaming module 15500 1727096212.36996: ANSIBALLZ: Done creating module 15500 1727096212.37026: variable 'ansible_facts' from source: unknown 15500 1727096212.37230: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096212.0674255-16047-165540260249876/AnsiballZ_systemd.py 15500 1727096212.37390: Sending initial data 15500 1727096212.37402: Sent initial data (156 bytes) 15500 1727096212.37974: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096212.38081: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096212.38156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096212.38221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096212.39912: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server 
supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096212.40048: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15500 1727096212.40375: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp_f0hyx3c /root/.ansible/tmp/ansible-tmp-1727096212.0674255-16047-165540260249876/AnsiballZ_systemd.py <<< 15500 1727096212.40378: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096212.0674255-16047-165540260249876/AnsiballZ_systemd.py" <<< 15500 1727096212.40518: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp_f0hyx3c" to remote "/root/.ansible/tmp/ansible-tmp-1727096212.0674255-16047-165540260249876/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096212.0674255-16047-165540260249876/AnsiballZ_systemd.py" <<< 15500 1727096212.43832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096212.43880: stderr chunk (state=3): >>><<< 15500 1727096212.43890: stdout chunk (state=3): >>><<< 15500 1727096212.43991: done transferring module to remote 15500 1727096212.44014: _low_level_execute_command(): starting 15500 1727096212.44024: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096212.0674255-16047-165540260249876/ /root/.ansible/tmp/ansible-tmp-1727096212.0674255-16047-165540260249876/AnsiballZ_systemd.py && sleep 0' 15500 1727096212.44721: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096212.44734: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096212.44801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096212.44829: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096212.44845: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096212.44917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096212.46851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096212.46970: stdout chunk (state=3): >>><<< 15500 1727096212.46978: stderr chunk (state=3): >>><<< 15500 1727096212.46982: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096212.46985: _low_level_execute_command(): starting 15500 1727096212.46988: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096212.0674255-16047-165540260249876/AnsiballZ_systemd.py && sleep 0' 15500 1727096212.47798: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096212.47802: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096212.47804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096212.47806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096212.47808: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096212.47810: stderr chunk (state=3): >>>debug2: match not found <<< 15500 1727096212.47876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096212.47985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096212.48216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096212.77845: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", 
"ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainStartTimestampMonotonic": "14125756", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainHandoffTimestampMonotonic": "14143412", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10596352", "MemoryPeak": "14716928", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3303964672", "EffectiveMemoryMax": "3702857728", "EffectiveMemoryHigh": "3702857728", "CPUUsageNSec": "729286000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 15500 1727096212.77936: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": 
"infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", 
"SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target multi-user.target cloud-init.service NetworkManager-wait-online.service network.target", "After": "system.slice dbus-broker.service systemd-journald.socket sysinit.target network-pre.target dbus.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:06 EDT", "StateChangeTimestampMonotonic": "260104767", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:00 EDT", "InactiveExitTimestampMonotonic": "14126240", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:01 EDT", "ActiveEnterTimestampMonotonic": "14391210", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ConditionTimestampMonotonic": "14124859", "AssertTimestamp": "Mon 2024-09-23 08:51:00 EDT", "AssertTimestampMonotonic": "14124861", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "96e31adf3b0143aea7f2b03db689d56d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15500 1727096212.79965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 15500 1727096212.80063: stderr chunk (state=3): >>><<< 15500 1727096212.80083: stdout chunk (state=3): >>><<< 15500 1727096212.80213: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainStartTimestampMonotonic": "14125756", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainHandoffTimestampMonotonic": "14143412", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10596352", "MemoryPeak": "14716928", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3303964672", "EffectiveMemoryMax": "3702857728", "EffectiveMemoryHigh": "3702857728", "CPUUsageNSec": "729286000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target multi-user.target cloud-init.service NetworkManager-wait-online.service network.target", "After": "system.slice dbus-broker.service systemd-journald.socket sysinit.target network-pre.target dbus.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:06 EDT", "StateChangeTimestampMonotonic": "260104767", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:00 EDT", "InactiveExitTimestampMonotonic": "14126240", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:01 EDT", "ActiveEnterTimestampMonotonic": "14391210", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ConditionTimestampMonotonic": "14124859", "AssertTimestamp": "Mon 2024-09-23 08:51:00 EDT", "AssertTimestampMonotonic": "14124861", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "96e31adf3b0143aea7f2b03db689d56d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 15500 1727096212.80346: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096212.0674255-16047-165540260249876/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096212.80350: _low_level_execute_command(): starting 15500 1727096212.80353: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096212.0674255-16047-165540260249876/ > /dev/null 2>&1 && sleep 0' 15500 1727096212.81049: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096212.81052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096212.81055: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 15500 1727096212.81064: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096212.81069: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096212.81123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096212.81126: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096212.81132: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096212.81206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096212.83160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096212.83169: stdout chunk (state=3): >>><<< 15500 1727096212.83172: stderr chunk (state=3): >>><<< 15500 1727096212.83212: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096212.83216: handler run complete 15500 1727096212.83275: attempt loop complete, returning result 15500 1727096212.83278: _execute() done 15500 1727096212.83280: dumping result to json 15500 1727096212.83282: done dumping result, returning 15500 1727096212.83284: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-877d-2da0-000000000020] 15500 1727096212.83286: sending task result for task 0afff68d-5257-877d-2da0-000000000020 15500 1727096212.83509: done sending task result for task 0afff68d-5257-877d-2da0-000000000020 15500 1727096212.83512: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15500 1727096212.83559: no more pending results, returning what we have 15500 1727096212.83563: results queue empty 15500 1727096212.83563: checking for any_errors_fatal 15500 1727096212.83571: done checking for any_errors_fatal 15500 1727096212.83572: checking for max_fail_percentage 15500 1727096212.83574: done checking for max_fail_percentage 15500 1727096212.83574: checking to see if all hosts have failed and the running result is not ok 15500 1727096212.83575: done checking to see if all hosts have failed 15500 1727096212.83576: getting the remaining hosts for this loop 15500 1727096212.83577: done getting the remaining hosts for this loop 15500 1727096212.83581: getting the next task for host managed_node1 15500 1727096212.83586: done getting next task for host managed_node1 15500 1727096212.83590: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15500 1727096212.83591: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096212.83600: getting variables 15500 1727096212.83601: in VariableManager get_vars() 15500 1727096212.83638: Calling all_inventory to load vars for managed_node1 15500 1727096212.83641: Calling groups_inventory to load vars for managed_node1 15500 1727096212.83643: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096212.83652: Calling all_plugins_play to load vars for managed_node1 15500 1727096212.83654: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096212.83657: Calling groups_plugins_play to load vars for managed_node1 15500 1727096212.84541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096212.86937: done with get_vars() 15500 1727096212.86987: done getting variables 15500 1727096212.87059: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:56:52 -0400 (0:00:00.953) 0:00:12.914 ****** 15500 1727096212.87133: entering _queue_task() for managed_node1/service 15500 1727096212.88194: worker is 1 (out of 1 available) 15500 1727096212.88207: exiting _queue_task() for managed_node1/service 15500 1727096212.88229: done queuing things up, now waiting for results queue to drain 15500 1727096212.88231: waiting for pending results... 
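The censored "ok" result above belongs to the "Enable and start NetworkManager" task: the module_args recorded in the log show the systemd module being asked for name=NetworkManager, state=started, enabled=true, and the output is hidden because the task ran with no_log. A minimal sketch of a task that would produce the same invocation is given below; it is reconstructed from those module_args and is not the role's verbatim task (the role goes through the generic service action, which dispatched to ansible.legacy.systemd on this host).

    # Sketch reconstructed from the module_args logged above; not the
    # role's verbatim task from roles/network/tasks/main.yml.
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:      # the service action selected the systemd module here
        name: NetworkManager
        state: started
        enabled: true
        scope: system
      no_log: true                  # matches '_ansible_no_log': True and the "censored" result
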
15500 1727096212.88584: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15500 1727096212.88590: in run() - task 0afff68d-5257-877d-2da0-000000000021 15500 1727096212.88593: variable 'ansible_search_path' from source: unknown 15500 1727096212.88595: variable 'ansible_search_path' from source: unknown 15500 1727096212.88597: calling self._execute() 15500 1727096212.88661: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096212.88683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096212.88700: variable 'omit' from source: magic vars 15500 1727096212.89082: variable 'ansible_distribution_major_version' from source: facts 15500 1727096212.89100: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096212.89215: variable 'network_provider' from source: set_fact 15500 1727096212.89226: Evaluated conditional (network_provider == "nm"): True 15500 1727096212.89319: variable '__network_wpa_supplicant_required' from source: role '' defaults 15500 1727096212.89472: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15500 1727096212.89613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096212.91592: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096212.91665: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096212.91710: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096212.91873: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096212.91879: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096212.91886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096212.91921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096212.91951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096212.92001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096212.92021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096212.92077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096212.92109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 15500 1727096212.92139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096212.92187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096212.92208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096212.92254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096212.92284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096212.92311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096212.92352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096212.92373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096212.92526: variable 'network_connections' from source: play vars 15500 1727096212.92546: variable 'interface' from source: set_fact 15500 1727096212.92626: variable 'interface' from source: set_fact 15500 1727096212.92642: variable 'interface' from source: set_fact 15500 1727096212.92708: variable 'interface' from source: set_fact 15500 1727096212.92872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096212.92982: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096212.93030: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096212.93066: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096212.93108: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096212.93162: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096212.93193: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096212.93224: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096212.93260: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096212.93312: variable '__network_wireless_connections_defined' from source: role '' defaults 15500 1727096212.93583: variable 'network_connections' from source: play vars 15500 1727096212.93593: variable 'interface' from source: set_fact 15500 1727096212.93664: variable 'interface' from source: set_fact 15500 1727096212.93678: variable 'interface' from source: set_fact 15500 1727096212.93773: variable 'interface' from source: set_fact 15500 1727096212.93797: Evaluated conditional (__network_wpa_supplicant_required): False 15500 1727096212.93803: when evaluation is False, skipping this task 15500 1727096212.93810: _execute() done 15500 1727096212.93824: dumping result to json 15500 1727096212.93830: done dumping result, returning 15500 1727096212.93842: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-877d-2da0-000000000021] 15500 1727096212.93872: sending task result for task 0afff68d-5257-877d-2da0-000000000021 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15500 1727096212.94035: no more pending results, returning what we have 15500 1727096212.94039: results queue empty 15500 1727096212.94039: checking for any_errors_fatal 15500 1727096212.94063: done checking for any_errors_fatal 15500 1727096212.94064: checking for max_fail_percentage 15500 1727096212.94066: done checking for max_fail_percentage 15500 1727096212.94068: checking to see if all hosts have failed and the running result is not ok 15500 1727096212.94069: done checking to see if all hosts have failed 15500 1727096212.94070: getting the remaining hosts for this loop 15500 1727096212.94072: done getting the remaining hosts for this loop 15500 1727096212.94077: getting the next task for host managed_node1 15500 1727096212.94082: done getting next task for host managed_node1 15500 1727096212.94089: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15500 1727096212.94094: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096212.94109: getting variables 15500 1727096212.94111: in VariableManager get_vars() 15500 1727096212.94151: Calling all_inventory to load vars for managed_node1 15500 1727096212.94154: Calling groups_inventory to load vars for managed_node1 15500 1727096212.94156: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096212.94338: Calling all_plugins_play to load vars for managed_node1 15500 1727096212.94342: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096212.94347: done sending task result for task 0afff68d-5257-877d-2da0-000000000021 15500 1727096212.94350: WORKER PROCESS EXITING 15500 1727096212.94353: Calling groups_plugins_play to load vars for managed_node1 15500 1727096212.95697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096212.97274: done with get_vars() 15500 1727096212.97307: done getting variables 15500 1727096212.97376: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:56:52 -0400 (0:00:00.102) 0:00:13.017 ****** 15500 1727096212.97410: entering _queue_task() for managed_node1/service 15500 1727096212.97759: worker is 1 (out of 1 available) 15500 1727096212.97973: exiting _queue_task() for managed_node1/service 15500 1727096212.97985: done queuing things up, now waiting for results queue to drain 15500 1727096212.97986: waiting for pending results... 
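The "Enable and start wpa_supplicant" task above is skipped because the executor's conditional chain ends at a False result: the distribution check passes, network_provider == "nm" passes, and __network_wpa_supplicant_required (which the role's defaults compute from the 802.1X and wireless connection checks also visible in the log) evaluates to False. A hedged sketch of a task guarded the same way follows; the when expressions mirror the "Evaluated conditional" lines in the log, while the exact wording of the role's task may differ.

    # Sketch of the guard pattern behind the skip logged above; the when
    # expressions are copied from the conditionals the executor evaluated.
    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant
        state: started
        enabled: true
      when:
        - ansible_distribution_major_version != '6'
        - network_provider == "nm"
        - __network_wpa_supplicant_required   # False in this run, hence "skipping"
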
15500 1727096212.98059: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 15500 1727096212.98171: in run() - task 0afff68d-5257-877d-2da0-000000000022 15500 1727096212.98194: variable 'ansible_search_path' from source: unknown 15500 1727096212.98203: variable 'ansible_search_path' from source: unknown 15500 1727096212.98251: calling self._execute() 15500 1727096212.98360: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096212.98377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096212.98393: variable 'omit' from source: magic vars 15500 1727096212.98870: variable 'ansible_distribution_major_version' from source: facts 15500 1727096212.98875: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096212.98953: variable 'network_provider' from source: set_fact 15500 1727096212.98970: Evaluated conditional (network_provider == "initscripts"): False 15500 1727096212.98980: when evaluation is False, skipping this task 15500 1727096212.98988: _execute() done 15500 1727096212.98995: dumping result to json 15500 1727096212.99002: done dumping result, returning 15500 1727096212.99012: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-877d-2da0-000000000022] 15500 1727096212.99021: sending task result for task 0afff68d-5257-877d-2da0-000000000022 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15500 1727096212.99261: no more pending results, returning what we have 15500 1727096212.99266: results queue empty 15500 1727096212.99267: checking for any_errors_fatal 15500 1727096212.99276: done checking for any_errors_fatal 15500 1727096212.99277: checking for max_fail_percentage 15500 1727096212.99279: done checking for max_fail_percentage 15500 1727096212.99280: checking to see if all hosts have failed and the running result is not ok 15500 1727096212.99281: done checking to see if all hosts have failed 15500 1727096212.99282: getting the remaining hosts for this loop 15500 1727096212.99283: done getting the remaining hosts for this loop 15500 1727096212.99287: getting the next task for host managed_node1 15500 1727096212.99294: done getting next task for host managed_node1 15500 1727096212.99297: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15500 1727096212.99301: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096212.99316: getting variables 15500 1727096212.99318: in VariableManager get_vars() 15500 1727096212.99362: Calling all_inventory to load vars for managed_node1 15500 1727096212.99366: Calling groups_inventory to load vars for managed_node1 15500 1727096212.99572: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096212.99584: Calling all_plugins_play to load vars for managed_node1 15500 1727096212.99587: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096212.99590: Calling groups_plugins_play to load vars for managed_node1 15500 1727096213.00183: done sending task result for task 0afff68d-5257-877d-2da0-000000000022 15500 1727096213.00187: WORKER PROCESS EXITING 15500 1727096213.01130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096213.02697: done with get_vars() 15500 1727096213.02729: done getting variables 15500 1727096213.02804: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 08:56:53 -0400 (0:00:00.054) 0:00:13.071 ****** 15500 1727096213.02839: entering _queue_task() for managed_node1/copy 15500 1727096213.03384: worker is 1 (out of 1 available) 15500 1727096213.03395: exiting _queue_task() for managed_node1/copy 15500 1727096213.03406: done queuing things up, now waiting for results queue to drain 15500 1727096213.03407: waiting for pending results... 
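Both this skip ("Enable network service") and the copy-based task queued next ("Ensure initscripts network file dependency is present") hang on the same guard: with the provider set to "nm" earlier via set_fact, network_provider == "initscripts" evaluates to False, so the initscripts-only branch of the role is bypassed. A small illustration of that pattern follows; the file destination and content are assumptions made for the example, not values taken from the role.

    # Illustration of the initscripts-only guard seen in this part of the log.
    # dest/content are hypothetical; the role's actual file task may differ.
    - name: Ensure initscripts network file dependency is present
      ansible.builtin.copy:
        dest: /etc/sysconfig/network            # assumed path for the example
        content: "# Managed by the network role\n"
        mode: "0644"
      when: network_provider == "initscripts"   # False under the nm provider, so it skips
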
15500 1727096213.03655: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15500 1727096213.03760: in run() - task 0afff68d-5257-877d-2da0-000000000023 15500 1727096213.03788: variable 'ansible_search_path' from source: unknown 15500 1727096213.03803: variable 'ansible_search_path' from source: unknown 15500 1727096213.03848: calling self._execute() 15500 1727096213.03951: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096213.03965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096213.03988: variable 'omit' from source: magic vars 15500 1727096213.04406: variable 'ansible_distribution_major_version' from source: facts 15500 1727096213.04424: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096213.04771: variable 'network_provider' from source: set_fact 15500 1727096213.04776: Evaluated conditional (network_provider == "initscripts"): False 15500 1727096213.04780: when evaluation is False, skipping this task 15500 1727096213.04783: _execute() done 15500 1727096213.04785: dumping result to json 15500 1727096213.04787: done dumping result, returning 15500 1727096213.04790: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-877d-2da0-000000000023] 15500 1727096213.04793: sending task result for task 0afff68d-5257-877d-2da0-000000000023 15500 1727096213.04874: done sending task result for task 0afff68d-5257-877d-2da0-000000000023 15500 1727096213.04877: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15500 1727096213.04925: no more pending results, returning what we have 15500 1727096213.04930: results queue empty 15500 1727096213.04931: checking for any_errors_fatal 15500 1727096213.04937: done checking for any_errors_fatal 15500 1727096213.04938: checking for max_fail_percentage 15500 1727096213.04940: done checking for max_fail_percentage 15500 1727096213.04941: checking to see if all hosts have failed and the running result is not ok 15500 1727096213.04942: done checking to see if all hosts have failed 15500 1727096213.04943: getting the remaining hosts for this loop 15500 1727096213.04944: done getting the remaining hosts for this loop 15500 1727096213.04948: getting the next task for host managed_node1 15500 1727096213.04955: done getting next task for host managed_node1 15500 1727096213.04959: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15500 1727096213.04962: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096213.04978: getting variables 15500 1727096213.04980: in VariableManager get_vars() 15500 1727096213.05019: Calling all_inventory to load vars for managed_node1 15500 1727096213.05022: Calling groups_inventory to load vars for managed_node1 15500 1727096213.05025: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096213.05039: Calling all_plugins_play to load vars for managed_node1 15500 1727096213.05042: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096213.05046: Calling groups_plugins_play to load vars for managed_node1 15500 1727096213.06551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096213.08380: done with get_vars() 15500 1727096213.08404: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 08:56:53 -0400 (0:00:00.056) 0:00:13.128 ****** 15500 1727096213.08510: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 15500 1727096213.08512: Creating lock for fedora.linux_system_roles.network_connections 15500 1727096213.08912: worker is 1 (out of 1 available) 15500 1727096213.08927: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 15500 1727096213.08961: done queuing things up, now waiting for results queue to drain 15500 1727096213.08963: waiting for pending results... 15500 1727096213.09137: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15500 1727096213.09247: in run() - task 0afff68d-5257-877d-2da0-000000000024 15500 1727096213.09470: variable 'ansible_search_path' from source: unknown 15500 1727096213.09480: variable 'ansible_search_path' from source: unknown 15500 1727096213.09483: calling self._execute() 15500 1727096213.09486: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096213.09489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096213.09492: variable 'omit' from source: magic vars 15500 1727096213.09806: variable 'ansible_distribution_major_version' from source: facts 15500 1727096213.09823: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096213.09835: variable 'omit' from source: magic vars 15500 1727096213.09877: variable 'omit' from source: magic vars 15500 1727096213.10035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096213.12026: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096213.12098: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096213.12139: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096213.12181: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096213.12212: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096213.12297: variable 'network_provider' from source: set_fact 15500 1727096213.12427: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096213.12477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096213.12507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096213.12545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096213.12560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096213.12637: variable 'omit' from source: magic vars 15500 1727096213.12751: variable 'omit' from source: magic vars 15500 1727096213.12853: variable 'network_connections' from source: play vars 15500 1727096213.12871: variable 'interface' from source: set_fact 15500 1727096213.12937: variable 'interface' from source: set_fact 15500 1727096213.12949: variable 'interface' from source: set_fact 15500 1727096213.13010: variable 'interface' from source: set_fact 15500 1727096213.13153: variable 'omit' from source: magic vars 15500 1727096213.13169: variable '__lsr_ansible_managed' from source: task vars 15500 1727096213.13231: variable '__lsr_ansible_managed' from source: task vars 15500 1727096213.13403: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15500 1727096213.13615: Loaded config def from plugin (lookup/template) 15500 1727096213.13627: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15500 1727096213.13660: File lookup term: get_ansible_managed.j2 15500 1727096213.13670: variable 'ansible_search_path' from source: unknown 15500 1727096213.13682: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15500 1727096213.13874: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15500 1727096213.13877: variable 'ansible_search_path' from source: 
unknown 15500 1727096213.19469: variable 'ansible_managed' from source: unknown 15500 1727096213.19607: variable 'omit' from source: magic vars 15500 1727096213.19639: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096213.19672: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096213.19697: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096213.19720: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096213.19735: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096213.19771: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096213.19780: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096213.19788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096213.19890: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096213.19901: Set connection var ansible_pipelining to False 15500 1727096213.19911: Set connection var ansible_timeout to 10 15500 1727096213.19918: Set connection var ansible_shell_type to sh 15500 1727096213.19928: Set connection var ansible_shell_executable to /bin/sh 15500 1727096213.19938: Set connection var ansible_connection to ssh 15500 1727096213.19964: variable 'ansible_shell_executable' from source: unknown 15500 1727096213.19976: variable 'ansible_connection' from source: unknown 15500 1727096213.19984: variable 'ansible_module_compression' from source: unknown 15500 1727096213.19992: variable 'ansible_shell_type' from source: unknown 15500 1727096213.19999: variable 'ansible_shell_executable' from source: unknown 15500 1727096213.20006: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096213.20014: variable 'ansible_pipelining' from source: unknown 15500 1727096213.20022: variable 'ansible_timeout' from source: unknown 15500 1727096213.20029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096213.20160: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15500 1727096213.20188: variable 'omit' from source: magic vars 15500 1727096213.20201: starting attempt loop 15500 1727096213.20208: running the handler 15500 1727096213.20224: _low_level_execute_command(): starting 15500 1727096213.20238: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096213.20992: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096213.21025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096213.21044: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096213.21071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096213.21190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096213.22920: stdout chunk (state=3): >>>/root <<< 15500 1727096213.23074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096213.23078: stdout chunk (state=3): >>><<< 15500 1727096213.23080: stderr chunk (state=3): >>><<< 15500 1727096213.23174: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096213.23177: _low_level_execute_command(): starting 15500 1727096213.23180: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096213.231046-16114-252194658040952 `" && echo ansible-tmp-1727096213.231046-16114-252194658040952="` echo /root/.ansible/tmp/ansible-tmp-1727096213.231046-16114-252194658040952 `" ) && sleep 0' 15500 1727096213.23743: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096213.23755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096213.23769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096213.23833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096213.23886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096213.23902: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096213.23931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096213.24043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096213.25976: stdout chunk (state=3): >>>ansible-tmp-1727096213.231046-16114-252194658040952=/root/.ansible/tmp/ansible-tmp-1727096213.231046-16114-252194658040952 <<< 15500 1727096213.26133: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096213.26137: stdout chunk (state=3): >>><<< 15500 1727096213.26140: stderr chunk (state=3): >>><<< 15500 1727096213.26162: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096213.231046-16114-252194658040952=/root/.ansible/tmp/ansible-tmp-1727096213.231046-16114-252194658040952 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096213.26218: variable 'ansible_module_compression' from source: unknown 15500 1727096213.26281: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 15500 1727096213.26362: ANSIBALLZ: Acquiring lock 15500 1727096213.26366: ANSIBALLZ: Lock acquired: 140712174719808 15500 1727096213.26371: ANSIBALLZ: Creating module 15500 1727096213.53733: ANSIBALLZ: Writing module into payload 15500 1727096213.54089: ANSIBALLZ: Writing module 15500 1727096213.54120: ANSIBALLZ: Renaming module 15500 1727096213.54131: ANSIBALLZ: Done creating module 15500 1727096213.54169: variable 'ansible_facts' from source: unknown 15500 1727096213.54292: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096213.231046-16114-252194658040952/AnsiballZ_network_connections.py 15500 1727096213.54541: Sending initial data 15500 1727096213.54544: Sent initial data (167 bytes) 15500 1727096213.55185: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096213.55255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096213.55277: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096213.55318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096213.55405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096213.57116: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096213.57196: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15500 1727096213.57265: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpqfp92w1j /root/.ansible/tmp/ansible-tmp-1727096213.231046-16114-252194658040952/AnsiballZ_network_connections.py <<< 15500 1727096213.57272: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096213.231046-16114-252194658040952/AnsiballZ_network_connections.py" <<< 15500 1727096213.57364: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpqfp92w1j" to remote "/root/.ansible/tmp/ansible-tmp-1727096213.231046-16114-252194658040952/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096213.231046-16114-252194658040952/AnsiballZ_network_connections.py" <<< 15500 1727096213.58506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096213.58537: stderr chunk (state=3): >>><<< 15500 1727096213.58545: stdout chunk (state=3): >>><<< 15500 1727096213.58663: done transferring module to remote 15500 1727096213.58669: _low_level_execute_command(): starting 15500 1727096213.58672: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096213.231046-16114-252194658040952/ /root/.ansible/tmp/ansible-tmp-1727096213.231046-16114-252194658040952/AnsiballZ_network_connections.py && sleep 0' 15500 1727096213.59315: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096213.59319: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096213.59401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096213.61422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096213.61427: stdout chunk (state=3): >>><<< 15500 1727096213.61430: stderr chunk (state=3): >>><<< 15500 1727096213.61460: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096213.61549: _low_level_execute_command(): starting 15500 1727096213.61553: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096213.231046-16114-252194658040952/AnsiballZ_network_connections.py && sleep 0' 15500 1727096213.62413: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096213.62584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096213.62609: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096213.62625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096213.62782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096213.94325: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 915a44a0-07cf-46e3-a30a-d66b6a6317e3\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 915a44a0-07cf-46e3-a30a-d66b6a6317e3 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15500 1727096213.96318: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 15500 1727096213.96322: stdout chunk (state=3): >>><<< 15500 1727096213.96324: stderr chunk (state=3): >>><<< 15500 1727096213.96345: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 915a44a0-07cf-46e3-a30a-d66b6a6317e3\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 915a44a0-07cf-46e3-a30a-d66b6a6317e3 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
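The module result above was produced by the fedora.linux_system_roles.network_connections module with provider "nm" and a single bridge profile. A minimal sketch of the role input that would yield the logged module_args, reconstructed from the log rather than taken from the literal test playbook (the variable names network_provider and network_connections follow the role interface referenced earlier in this run):

    # Sketch of the play vars behind the logged invocation (reconstructed, not verbatim)
    network_provider: nm
    network_connections:
      - name: LSR-TST-br31
        interface_name: LSR-TST-br31
        type: bridge
        state: up
        ip:
          dhcp4: false
          auto6: true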
15500 1727096213.96396: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'interface_name': 'LSR-TST-br31', 'state': 'up', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': True}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096213.231046-16114-252194658040952/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096213.96424: _low_level_execute_command(): starting 15500 1727096213.96474: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096213.231046-16114-252194658040952/ > /dev/null 2>&1 && sleep 0' 15500 1727096213.97181: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096213.97196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096213.97212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096213.97312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096213.99297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096213.99362: stderr chunk (state=3): >>><<< 15500 1727096213.99381: stdout chunk (state=3): >>><<< 15500 1727096213.99403: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096213.99416: handler run complete 15500 1727096213.99457: attempt loop complete, returning result 15500 1727096213.99466: _execute() done 15500 1727096213.99480: dumping result to json 15500 1727096213.99492: done dumping result, returning 15500 1727096213.99512: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-877d-2da0-000000000024] 15500 1727096213.99522: sending task result for task 0afff68d-5257-877d-2da0-000000000024 changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "LSR-TST-br31", "ip": { "auto6": true, "dhcp4": false }, "name": "LSR-TST-br31", "state": "up", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 915a44a0-07cf-46e3-a30a-d66b6a6317e3 [004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 915a44a0-07cf-46e3-a30a-d66b6a6317e3 (not-active) 15500 1727096213.99765: no more pending results, returning what we have 15500 1727096213.99975: results queue empty 15500 1727096213.99976: checking for any_errors_fatal 15500 1727096213.99985: done checking for any_errors_fatal 15500 1727096213.99985: checking for max_fail_percentage 15500 1727096213.99987: done checking for max_fail_percentage 15500 1727096213.99988: checking to see if all hosts have failed and the running result is not ok 15500 1727096213.99989: done checking to see if all hosts have failed 15500 1727096213.99989: getting the remaining hosts for this loop 15500 1727096213.99991: done getting the remaining hosts for this loop 15500 1727096213.99995: getting the next task for host managed_node1 15500 1727096214.00000: done getting next task for host managed_node1 15500 1727096214.00004: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15500 1727096214.00006: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096214.00016: getting variables 15500 1727096214.00018: in VariableManager get_vars() 15500 1727096214.00052: Calling all_inventory to load vars for managed_node1 15500 1727096214.00055: Calling groups_inventory to load vars for managed_node1 15500 1727096214.00057: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096214.00066: Calling all_plugins_play to load vars for managed_node1 15500 1727096214.00077: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096214.00083: done sending task result for task 0afff68d-5257-877d-2da0-000000000024 15500 1727096214.00085: WORKER PROCESS EXITING 15500 1727096214.00089: Calling groups_plugins_play to load vars for managed_node1 15500 1727096214.01732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096214.03252: done with get_vars() 15500 1727096214.03279: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 08:56:54 -0400 (0:00:00.948) 0:00:14.076 ****** 15500 1727096214.03341: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 15500 1727096214.03343: Creating lock for fedora.linux_system_roles.network_state 15500 1727096214.03614: worker is 1 (out of 1 available) 15500 1727096214.03628: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 15500 1727096214.03642: done queuing things up, now waiting for results queue to drain 15500 1727096214.03644: waiting for pending results... 15500 1727096214.03825: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 15500 1727096214.03896: in run() - task 0afff68d-5257-877d-2da0-000000000025 15500 1727096214.03909: variable 'ansible_search_path' from source: unknown 15500 1727096214.03912: variable 'ansible_search_path' from source: unknown 15500 1727096214.03940: calling self._execute() 15500 1727096214.04015: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096214.04020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096214.04029: variable 'omit' from source: magic vars 15500 1727096214.04322: variable 'ansible_distribution_major_version' from source: facts 15500 1727096214.04331: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096214.04417: variable 'network_state' from source: role '' defaults 15500 1727096214.04425: Evaluated conditional (network_state != {}): False 15500 1727096214.04428: when evaluation is False, skipping this task 15500 1727096214.04431: _execute() done 15500 1727096214.04438: dumping result to json 15500 1727096214.04441: done dumping result, returning 15500 1727096214.04444: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-877d-2da0-000000000025] 15500 1727096214.04449: sending task result for task 0afff68d-5257-877d-2da0-000000000025 15500 1727096214.04537: done sending task result for task 0afff68d-5257-877d-2da0-000000000025 15500 1727096214.04540: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15500 1727096214.04594: no more pending results, returning what we have 
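The skip recorded above follows from the role default network_state: {} combined with a when: guard on the task (every task in this run also carries the ansible_distribution_major_version != '6' guard seen in the conditional evaluations). A minimal sketch of that pattern, assuming the task shape below; the exact YAML at tasks/main.yml:171 may differ:

    # defaults/main.yml (role defaults, sketch)
    network_state: {}

    # tasks/main.yml (sketch; module arguments omitted here)
    - name: Configure networking state
      fedora.linux_system_roles.network_state:
        # arguments not visible in this log because the task is skipped
      when: network_state != {}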
15500 1727096214.04598: results queue empty 15500 1727096214.04599: checking for any_errors_fatal 15500 1727096214.04613: done checking for any_errors_fatal 15500 1727096214.04613: checking for max_fail_percentage 15500 1727096214.04616: done checking for max_fail_percentage 15500 1727096214.04616: checking to see if all hosts have failed and the running result is not ok 15500 1727096214.04617: done checking to see if all hosts have failed 15500 1727096214.04618: getting the remaining hosts for this loop 15500 1727096214.04619: done getting the remaining hosts for this loop 15500 1727096214.04623: getting the next task for host managed_node1 15500 1727096214.04629: done getting next task for host managed_node1 15500 1727096214.04633: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15500 1727096214.04637: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096214.04650: getting variables 15500 1727096214.04652: in VariableManager get_vars() 15500 1727096214.04690: Calling all_inventory to load vars for managed_node1 15500 1727096214.04694: Calling groups_inventory to load vars for managed_node1 15500 1727096214.04696: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096214.04704: Calling all_plugins_play to load vars for managed_node1 15500 1727096214.04706: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096214.04709: Calling groups_plugins_play to load vars for managed_node1 15500 1727096214.06331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096214.07186: done with get_vars() 15500 1727096214.07209: done getting variables 15500 1727096214.07255: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 08:56:54 -0400 (0:00:00.039) 0:00:14.115 ****** 15500 1727096214.07280: entering _queue_task() for managed_node1/debug 15500 1727096214.07539: worker is 1 (out of 1 available) 15500 1727096214.07554: exiting _queue_task() for managed_node1/debug 15500 1727096214.07569: done queuing things up, now waiting for results queue to drain 15500 1727096214.07571: waiting for pending results... 
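The debug action queued above prints the stderr lines recorded by the earlier network_connections run. A sketch of such a task, assuming it simply dumps the __network_connections_result fact seen in the log; the real task at tasks/main.yml:177 may carry additional conditions:

    - name: Show stderr messages for the network_connections
      debug:
        var: __network_connections_result.stderr_lines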
15500 1727096214.07741: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15500 1727096214.07813: in run() - task 0afff68d-5257-877d-2da0-000000000026 15500 1727096214.07825: variable 'ansible_search_path' from source: unknown 15500 1727096214.07828: variable 'ansible_search_path' from source: unknown 15500 1727096214.07858: calling self._execute() 15500 1727096214.07935: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096214.07939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096214.07949: variable 'omit' from source: magic vars 15500 1727096214.08235: variable 'ansible_distribution_major_version' from source: facts 15500 1727096214.08242: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096214.08249: variable 'omit' from source: magic vars 15500 1727096214.08280: variable 'omit' from source: magic vars 15500 1727096214.08306: variable 'omit' from source: magic vars 15500 1727096214.08341: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096214.08372: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096214.08389: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096214.08402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096214.08412: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096214.08435: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096214.08438: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096214.08441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096214.08517: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096214.08520: Set connection var ansible_pipelining to False 15500 1727096214.08525: Set connection var ansible_timeout to 10 15500 1727096214.08528: Set connection var ansible_shell_type to sh 15500 1727096214.08533: Set connection var ansible_shell_executable to /bin/sh 15500 1727096214.08538: Set connection var ansible_connection to ssh 15500 1727096214.08556: variable 'ansible_shell_executable' from source: unknown 15500 1727096214.08559: variable 'ansible_connection' from source: unknown 15500 1727096214.08570: variable 'ansible_module_compression' from source: unknown 15500 1727096214.08573: variable 'ansible_shell_type' from source: unknown 15500 1727096214.08575: variable 'ansible_shell_executable' from source: unknown 15500 1727096214.08578: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096214.08580: variable 'ansible_pipelining' from source: unknown 15500 1727096214.08582: variable 'ansible_timeout' from source: unknown 15500 1727096214.08584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096214.08690: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 
1727096214.08699: variable 'omit' from source: magic vars 15500 1727096214.08704: starting attempt loop 15500 1727096214.08707: running the handler 15500 1727096214.08804: variable '__network_connections_result' from source: set_fact 15500 1727096214.08844: handler run complete 15500 1727096214.08857: attempt loop complete, returning result 15500 1727096214.08862: _execute() done 15500 1727096214.08865: dumping result to json 15500 1727096214.08870: done dumping result, returning 15500 1727096214.08882: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-877d-2da0-000000000026] 15500 1727096214.08886: sending task result for task 0afff68d-5257-877d-2da0-000000000026 15500 1727096214.08972: done sending task result for task 0afff68d-5257-877d-2da0-000000000026 15500 1727096214.08975: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 915a44a0-07cf-46e3-a30a-d66b6a6317e3", "[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 915a44a0-07cf-46e3-a30a-d66b6a6317e3 (not-active)" ] } 15500 1727096214.09042: no more pending results, returning what we have 15500 1727096214.09045: results queue empty 15500 1727096214.09046: checking for any_errors_fatal 15500 1727096214.09053: done checking for any_errors_fatal 15500 1727096214.09054: checking for max_fail_percentage 15500 1727096214.09055: done checking for max_fail_percentage 15500 1727096214.09056: checking to see if all hosts have failed and the running result is not ok 15500 1727096214.09057: done checking to see if all hosts have failed 15500 1727096214.09058: getting the remaining hosts for this loop 15500 1727096214.09059: done getting the remaining hosts for this loop 15500 1727096214.09063: getting the next task for host managed_node1 15500 1727096214.09070: done getting next task for host managed_node1 15500 1727096214.09074: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15500 1727096214.09076: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096214.09086: getting variables 15500 1727096214.09087: in VariableManager get_vars() 15500 1727096214.09122: Calling all_inventory to load vars for managed_node1 15500 1727096214.09124: Calling groups_inventory to load vars for managed_node1 15500 1727096214.09127: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096214.09136: Calling all_plugins_play to load vars for managed_node1 15500 1727096214.09139: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096214.09141: Calling groups_plugins_play to load vars for managed_node1 15500 1727096214.09943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096214.11441: done with get_vars() 15500 1727096214.11477: done getting variables 15500 1727096214.11559: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:56:54 -0400 (0:00:00.043) 0:00:14.159 ****** 15500 1727096214.11595: entering _queue_task() for managed_node1/debug 15500 1727096214.11894: worker is 1 (out of 1 available) 15500 1727096214.11911: exiting _queue_task() for managed_node1/debug 15500 1727096214.11924: done queuing things up, now waiting for results queue to drain 15500 1727096214.11925: waiting for pending results... 15500 1727096214.12097: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15500 1727096214.12164: in run() - task 0afff68d-5257-877d-2da0-000000000027 15500 1727096214.12179: variable 'ansible_search_path' from source: unknown 15500 1727096214.12182: variable 'ansible_search_path' from source: unknown 15500 1727096214.12211: calling self._execute() 15500 1727096214.12288: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096214.12292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096214.12302: variable 'omit' from source: magic vars 15500 1727096214.12576: variable 'ansible_distribution_major_version' from source: facts 15500 1727096214.12586: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096214.12594: variable 'omit' from source: magic vars 15500 1727096214.12620: variable 'omit' from source: magic vars 15500 1727096214.12647: variable 'omit' from source: magic vars 15500 1727096214.12682: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096214.12712: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096214.12729: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096214.12742: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096214.12751: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096214.12777: variable 
'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096214.12780: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096214.12782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096214.12855: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096214.12860: Set connection var ansible_pipelining to False 15500 1727096214.12863: Set connection var ansible_timeout to 10 15500 1727096214.12865: Set connection var ansible_shell_type to sh 15500 1727096214.12872: Set connection var ansible_shell_executable to /bin/sh 15500 1727096214.12877: Set connection var ansible_connection to ssh 15500 1727096214.12894: variable 'ansible_shell_executable' from source: unknown 15500 1727096214.12897: variable 'ansible_connection' from source: unknown 15500 1727096214.12900: variable 'ansible_module_compression' from source: unknown 15500 1727096214.12902: variable 'ansible_shell_type' from source: unknown 15500 1727096214.12904: variable 'ansible_shell_executable' from source: unknown 15500 1727096214.12906: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096214.12910: variable 'ansible_pipelining' from source: unknown 15500 1727096214.12912: variable 'ansible_timeout' from source: unknown 15500 1727096214.12916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096214.13018: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096214.13028: variable 'omit' from source: magic vars 15500 1727096214.13031: starting attempt loop 15500 1727096214.13035: running the handler 15500 1727096214.13079: variable '__network_connections_result' from source: set_fact 15500 1727096214.13148: variable '__network_connections_result' from source: set_fact 15500 1727096214.13229: handler run complete 15500 1727096214.13247: attempt loop complete, returning result 15500 1727096214.13250: _execute() done 15500 1727096214.13253: dumping result to json 15500 1727096214.13255: done dumping result, returning 15500 1727096214.13265: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-877d-2da0-000000000027] 15500 1727096214.13276: sending task result for task 0afff68d-5257-877d-2da0-000000000027 15500 1727096214.13391: done sending task result for task 0afff68d-5257-877d-2da0-000000000027 15500 1727096214.13394: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "LSR-TST-br31", "ip": { "auto6": true, "dhcp4": false }, "name": "LSR-TST-br31", "state": "up", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 915a44a0-07cf-46e3-a30a-d66b6a6317e3\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 915a44a0-07cf-46e3-a30a-d66b6a6317e3 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 
'LSR-TST-br31': add connection LSR-TST-br31, 915a44a0-07cf-46e3-a30a-d66b6a6317e3", "[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 915a44a0-07cf-46e3-a30a-d66b6a6317e3 (not-active)" ] } } 15500 1727096214.13514: no more pending results, returning what we have 15500 1727096214.13517: results queue empty 15500 1727096214.13517: checking for any_errors_fatal 15500 1727096214.13523: done checking for any_errors_fatal 15500 1727096214.13524: checking for max_fail_percentage 15500 1727096214.13525: done checking for max_fail_percentage 15500 1727096214.13526: checking to see if all hosts have failed and the running result is not ok 15500 1727096214.13527: done checking to see if all hosts have failed 15500 1727096214.13528: getting the remaining hosts for this loop 15500 1727096214.13529: done getting the remaining hosts for this loop 15500 1727096214.13532: getting the next task for host managed_node1 15500 1727096214.13537: done getting next task for host managed_node1 15500 1727096214.13540: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15500 1727096214.13544: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096214.13552: getting variables 15500 1727096214.13553: in VariableManager get_vars() 15500 1727096214.13661: Calling all_inventory to load vars for managed_node1 15500 1727096214.13664: Calling groups_inventory to load vars for managed_node1 15500 1727096214.13670: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096214.13680: Calling all_plugins_play to load vars for managed_node1 15500 1727096214.13682: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096214.13689: Calling groups_plugins_play to load vars for managed_node1 15500 1727096214.15048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096214.16415: done with get_vars() 15500 1727096214.16439: done getting variables 15500 1727096214.16489: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:56:54 -0400 (0:00:00.049) 0:00:14.208 ****** 15500 1727096214.16514: entering _queue_task() for managed_node1/debug 15500 1727096214.16775: worker is 1 (out of 1 available) 15500 1727096214.16790: exiting _queue_task() for managed_node1/debug 15500 1727096214.16803: done queuing things up, now waiting for results queue to drain 15500 1727096214.16805: waiting for pending results... 
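The ok: result just above came from a companion debug task that dumps the entire __network_connections_result fact; the task queued here does the same for the network_state path and, as the following entries show, is skipped because network_state is still {}. A sketch of the first of the pair, assuming a plain variable dump (the real tasks at main.yml:181 and :186 may differ):

    - name: Show debug messages for the network_connections
      debug:
        var: __network_connections_result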
15500 1727096214.16980: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15500 1727096214.17052: in run() - task 0afff68d-5257-877d-2da0-000000000028 15500 1727096214.17066: variable 'ansible_search_path' from source: unknown 15500 1727096214.17073: variable 'ansible_search_path' from source: unknown 15500 1727096214.17102: calling self._execute() 15500 1727096214.17179: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096214.17184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096214.17192: variable 'omit' from source: magic vars 15500 1727096214.17479: variable 'ansible_distribution_major_version' from source: facts 15500 1727096214.17489: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096214.17578: variable 'network_state' from source: role '' defaults 15500 1727096214.17586: Evaluated conditional (network_state != {}): False 15500 1727096214.17590: when evaluation is False, skipping this task 15500 1727096214.17593: _execute() done 15500 1727096214.17595: dumping result to json 15500 1727096214.17598: done dumping result, returning 15500 1727096214.17606: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-877d-2da0-000000000028] 15500 1727096214.17611: sending task result for task 0afff68d-5257-877d-2da0-000000000028 15500 1727096214.17694: done sending task result for task 0afff68d-5257-877d-2da0-000000000028 15500 1727096214.17697: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 15500 1727096214.17739: no more pending results, returning what we have 15500 1727096214.17743: results queue empty 15500 1727096214.17743: checking for any_errors_fatal 15500 1727096214.17754: done checking for any_errors_fatal 15500 1727096214.17755: checking for max_fail_percentage 15500 1727096214.17756: done checking for max_fail_percentage 15500 1727096214.17757: checking to see if all hosts have failed and the running result is not ok 15500 1727096214.17758: done checking to see if all hosts have failed 15500 1727096214.17759: getting the remaining hosts for this loop 15500 1727096214.17760: done getting the remaining hosts for this loop 15500 1727096214.17763: getting the next task for host managed_node1 15500 1727096214.17772: done getting next task for host managed_node1 15500 1727096214.17775: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15500 1727096214.17778: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096214.17791: getting variables 15500 1727096214.17792: in VariableManager get_vars() 15500 1727096214.17828: Calling all_inventory to load vars for managed_node1 15500 1727096214.17830: Calling groups_inventory to load vars for managed_node1 15500 1727096214.17833: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096214.17843: Calling all_plugins_play to load vars for managed_node1 15500 1727096214.17846: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096214.17848: Calling groups_plugins_play to load vars for managed_node1 15500 1727096214.19037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096214.19976: done with get_vars() 15500 1727096214.20000: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:56:54 -0400 (0:00:00.035) 0:00:14.243 ****** 15500 1727096214.20072: entering _queue_task() for managed_node1/ping 15500 1727096214.20073: Creating lock for ping 15500 1727096214.20340: worker is 1 (out of 1 available) 15500 1727096214.20355: exiting _queue_task() for managed_node1/ping 15500 1727096214.20370: done queuing things up, now waiting for results queue to drain 15500 1727096214.20372: waiting for pending results... 15500 1727096214.20551: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 15500 1727096214.20626: in run() - task 0afff68d-5257-877d-2da0-000000000029 15500 1727096214.20638: variable 'ansible_search_path' from source: unknown 15500 1727096214.20641: variable 'ansible_search_path' from source: unknown 15500 1727096214.20674: calling self._execute() 15500 1727096214.20750: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096214.20753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096214.20765: variable 'omit' from source: magic vars 15500 1727096214.21052: variable 'ansible_distribution_major_version' from source: facts 15500 1727096214.21063: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096214.21070: variable 'omit' from source: magic vars 15500 1727096214.21099: variable 'omit' from source: magic vars 15500 1727096214.21125: variable 'omit' from source: magic vars 15500 1727096214.21160: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096214.21191: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096214.21209: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096214.21221: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096214.21231: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096214.21256: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096214.21265: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096214.21273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096214.21339: Set connection var ansible_module_compression to 
ZIP_DEFLATED 15500 1727096214.21342: Set connection var ansible_pipelining to False 15500 1727096214.21348: Set connection var ansible_timeout to 10 15500 1727096214.21352: Set connection var ansible_shell_type to sh 15500 1727096214.21354: Set connection var ansible_shell_executable to /bin/sh 15500 1727096214.21363: Set connection var ansible_connection to ssh 15500 1727096214.21384: variable 'ansible_shell_executable' from source: unknown 15500 1727096214.21387: variable 'ansible_connection' from source: unknown 15500 1727096214.21390: variable 'ansible_module_compression' from source: unknown 15500 1727096214.21393: variable 'ansible_shell_type' from source: unknown 15500 1727096214.21395: variable 'ansible_shell_executable' from source: unknown 15500 1727096214.21397: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096214.21400: variable 'ansible_pipelining' from source: unknown 15500 1727096214.21404: variable 'ansible_timeout' from source: unknown 15500 1727096214.21407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096214.21628: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15500 1727096214.21675: variable 'omit' from source: magic vars 15500 1727096214.21680: starting attempt loop 15500 1727096214.21686: running the handler 15500 1727096214.21688: _low_level_execute_command(): starting 15500 1727096214.21690: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096214.22474: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096214.22483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096214.22487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096214.22532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096214.22539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096214.22542: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096214.22617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096214.24346: stdout chunk (state=3): >>>/root <<< 15500 1727096214.24528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096214.24534: stdout chunk (state=3): >>><<< 15500 1727096214.24538: stderr chunk (state=3): >>><<< 15500 1727096214.24581: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096214.24586: _low_level_execute_command(): starting 15500 1727096214.24593: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096214.2457356-16149-50838800650458 `" && echo ansible-tmp-1727096214.2457356-16149-50838800650458="` echo /root/.ansible/tmp/ansible-tmp-1727096214.2457356-16149-50838800650458 `" ) && sleep 0' 15500 1727096214.25152: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096214.25155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 15500 1727096214.25160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 15500 1727096214.25177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096214.25180: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096214.25226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096214.25237: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096214.25329: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096214.27261: stdout chunk (state=3): >>>ansible-tmp-1727096214.2457356-16149-50838800650458=/root/.ansible/tmp/ansible-tmp-1727096214.2457356-16149-50838800650458 <<< 15500 1727096214.27373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096214.27413: stderr chunk (state=3): >>><<< 15500 1727096214.27416: 
stdout chunk (state=3): >>><<< 15500 1727096214.27434: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096214.2457356-16149-50838800650458=/root/.ansible/tmp/ansible-tmp-1727096214.2457356-16149-50838800650458 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096214.27478: variable 'ansible_module_compression' from source: unknown 15500 1727096214.27517: ANSIBALLZ: Using lock for ping 15500 1727096214.27520: ANSIBALLZ: Acquiring lock 15500 1727096214.27523: ANSIBALLZ: Lock acquired: 140712176860384 15500 1727096214.27525: ANSIBALLZ: Creating module 15500 1727096214.37361: ANSIBALLZ: Writing module into payload 15500 1727096214.37421: ANSIBALLZ: Writing module 15500 1727096214.37455: ANSIBALLZ: Renaming module 15500 1727096214.37461: ANSIBALLZ: Done creating module 15500 1727096214.37486: variable 'ansible_facts' from source: unknown 15500 1727096214.37557: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096214.2457356-16149-50838800650458/AnsiballZ_ping.py 15500 1727096214.37756: Sending initial data 15500 1727096214.37759: Sent initial data (152 bytes) 15500 1727096214.38341: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096214.38346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096214.38349: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 15500 1727096214.38352: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096214.38354: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096214.38412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/8e98a30b23' <<< 15500 1727096214.38415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096214.38432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096214.38514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096214.40208: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096214.40281: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15500 1727096214.40346: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpd97m1947 /root/.ansible/tmp/ansible-tmp-1727096214.2457356-16149-50838800650458/AnsiballZ_ping.py <<< 15500 1727096214.40350: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096214.2457356-16149-50838800650458/AnsiballZ_ping.py" <<< 15500 1727096214.40432: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpd97m1947" to remote "/root/.ansible/tmp/ansible-tmp-1727096214.2457356-16149-50838800650458/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096214.2457356-16149-50838800650458/AnsiballZ_ping.py" <<< 15500 1727096214.41037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096214.41088: stderr chunk (state=3): >>><<< 15500 1727096214.41091: stdout chunk (state=3): >>><<< 15500 1727096214.41135: done transferring module to remote 15500 1727096214.41142: _low_level_execute_command(): starting 15500 1727096214.41147: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096214.2457356-16149-50838800650458/ /root/.ansible/tmp/ansible-tmp-1727096214.2457356-16149-50838800650458/AnsiballZ_ping.py && sleep 0' 15500 1727096214.41798: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096214.41802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 
10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096214.41850: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096214.41872: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096214.41912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096214.43733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096214.43775: stderr chunk (state=3): >>><<< 15500 1727096214.43779: stdout chunk (state=3): >>><<< 15500 1727096214.43809: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096214.43813: _low_level_execute_command(): starting 15500 1727096214.43815: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096214.2457356-16149-50838800650458/AnsiballZ_ping.py && sleep 0' 15500 1727096214.44424: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096214.44428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 15500 1727096214.44430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096214.44432: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096214.44434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096214.44502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 
1727096214.44508: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096214.44514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096214.44582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096214.59901: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15500 1727096214.61194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 15500 1727096214.61222: stdout chunk (state=3): >>><<< 15500 1727096214.61234: stderr chunk (state=3): >>><<< 15500 1727096214.61255: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
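
For reference, a minimal sketch of the task being executed at this point. The banner above gives the task name and file (roles/network/tasks/main.yml:192), the queued action is the ping module, and the payload returned is its default {"ping": "pong"}; the YAML below is an illustrative reconstruction from those log entries, not a quote of the role's source, and the conditional may equally be inherited from an enclosing block:

# Illustrative sketch only -- not the actual tasks/main.yml of the network role.
# Task name, module (ping), and the evaluated conditional are taken from the
# surrounding log entries; everything else is assumed.
- name: Re-test connectivity
  ansible.builtin.ping:
  when: ansible_distribution_major_version != '6'
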
15500 1727096214.61285: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096214.2457356-16149-50838800650458/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096214.61301: _low_level_execute_command(): starting 15500 1727096214.61387: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096214.2457356-16149-50838800650458/ > /dev/null 2>&1 && sleep 0' 15500 1727096214.61943: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096214.62038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096214.62051: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096214.62089: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096214.62155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096214.64123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096214.64160: stderr chunk (state=3): >>><<< 15500 1727096214.64165: stdout chunk (state=3): >>><<< 15500 1727096214.64180: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096214.64188: handler run complete 15500 1727096214.64199: attempt loop complete, returning result 15500 1727096214.64202: _execute() done 15500 1727096214.64204: dumping result to json 15500 1727096214.64206: done dumping result, returning 15500 1727096214.64215: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-877d-2da0-000000000029] 15500 1727096214.64220: sending task result for task 0afff68d-5257-877d-2da0-000000000029 15500 1727096214.64319: done sending task result for task 0afff68d-5257-877d-2da0-000000000029 15500 1727096214.64323: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 15500 1727096214.64385: no more pending results, returning what we have 15500 1727096214.64389: results queue empty 15500 1727096214.64390: checking for any_errors_fatal 15500 1727096214.64397: done checking for any_errors_fatal 15500 1727096214.64398: checking for max_fail_percentage 15500 1727096214.64399: done checking for max_fail_percentage 15500 1727096214.64400: checking to see if all hosts have failed and the running result is not ok 15500 1727096214.64401: done checking to see if all hosts have failed 15500 1727096214.64402: getting the remaining hosts for this loop 15500 1727096214.64403: done getting the remaining hosts for this loop 15500 1727096214.64406: getting the next task for host managed_node1 15500 1727096214.64413: done getting next task for host managed_node1 15500 1727096214.64416: ^ task is: TASK: meta (role_complete) 15500 1727096214.64417: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096214.64425: getting variables 15500 1727096214.64427: in VariableManager get_vars() 15500 1727096214.64462: Calling all_inventory to load vars for managed_node1 15500 1727096214.64465: Calling groups_inventory to load vars for managed_node1 15500 1727096214.64469: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096214.64486: Calling all_plugins_play to load vars for managed_node1 15500 1727096214.64489: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096214.64492: Calling groups_plugins_play to load vars for managed_node1 15500 1727096214.65949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096214.69280: done with get_vars() 15500 1727096214.69313: done getting variables 15500 1727096214.69519: done queuing things up, now waiting for results queue to drain 15500 1727096214.69521: results queue empty 15500 1727096214.69522: checking for any_errors_fatal 15500 1727096214.69525: done checking for any_errors_fatal 15500 1727096214.69526: checking for max_fail_percentage 15500 1727096214.69527: done checking for max_fail_percentage 15500 1727096214.69527: checking to see if all hosts have failed and the running result is not ok 15500 1727096214.69528: done checking to see if all hosts have failed 15500 1727096214.69529: getting the remaining hosts for this loop 15500 1727096214.69530: done getting the remaining hosts for this loop 15500 1727096214.69533: getting the next task for host managed_node1 15500 1727096214.69537: done getting next task for host managed_node1 15500 1727096214.69538: ^ task is: TASK: meta (flush_handlers) 15500 1727096214.69540: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096214.69542: getting variables 15500 1727096214.69543: in VariableManager get_vars() 15500 1727096214.69556: Calling all_inventory to load vars for managed_node1 15500 1727096214.69559: Calling groups_inventory to load vars for managed_node1 15500 1727096214.69561: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096214.69683: Calling all_plugins_play to load vars for managed_node1 15500 1727096214.69686: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096214.69690: Calling groups_plugins_play to load vars for managed_node1 15500 1727096214.72235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096214.75854: done with get_vars() 15500 1727096214.76001: done getting variables 15500 1727096214.76054: in VariableManager get_vars() 15500 1727096214.76070: Calling all_inventory to load vars for managed_node1 15500 1727096214.76073: Calling groups_inventory to load vars for managed_node1 15500 1727096214.76169: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096214.76177: Calling all_plugins_play to load vars for managed_node1 15500 1727096214.76180: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096214.76183: Calling groups_plugins_play to load vars for managed_node1 15500 1727096214.78708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096214.81960: done with get_vars() 15500 1727096214.82117: done queuing things up, now waiting for results queue to drain 15500 1727096214.82120: results queue empty 15500 1727096214.82121: checking for any_errors_fatal 15500 1727096214.82123: done checking for any_errors_fatal 15500 1727096214.82123: checking for max_fail_percentage 15500 1727096214.82124: done checking for max_fail_percentage 15500 1727096214.82125: checking to see if all hosts have failed and the running result is not ok 15500 1727096214.82126: done checking to see if all hosts have failed 15500 1727096214.82127: getting the remaining hosts for this loop 15500 1727096214.82128: done getting the remaining hosts for this loop 15500 1727096214.82130: getting the next task for host managed_node1 15500 1727096214.82134: done getting next task for host managed_node1 15500 1727096214.82136: ^ task is: TASK: meta (flush_handlers) 15500 1727096214.82137: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096214.82140: getting variables 15500 1727096214.82141: in VariableManager get_vars() 15500 1727096214.82154: Calling all_inventory to load vars for managed_node1 15500 1727096214.82157: Calling groups_inventory to load vars for managed_node1 15500 1727096214.82159: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096214.82164: Calling all_plugins_play to load vars for managed_node1 15500 1727096214.82167: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096214.82276: Calling groups_plugins_play to load vars for managed_node1 15500 1727096214.84898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096214.86818: done with get_vars() 15500 1727096214.86846: done getting variables 15500 1727096214.86909: in VariableManager get_vars() 15500 1727096214.86924: Calling all_inventory to load vars for managed_node1 15500 1727096214.86927: Calling groups_inventory to load vars for managed_node1 15500 1727096214.86929: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096214.86934: Calling all_plugins_play to load vars for managed_node1 15500 1727096214.86937: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096214.86939: Calling groups_plugins_play to load vars for managed_node1 15500 1727096214.88140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096214.90530: done with get_vars() 15500 1727096214.90681: done queuing things up, now waiting for results queue to drain 15500 1727096214.90684: results queue empty 15500 1727096214.90684: checking for any_errors_fatal 15500 1727096214.90686: done checking for any_errors_fatal 15500 1727096214.90687: checking for max_fail_percentage 15500 1727096214.90688: done checking for max_fail_percentage 15500 1727096214.90688: checking to see if all hosts have failed and the running result is not ok 15500 1727096214.90689: done checking to see if all hosts have failed 15500 1727096214.90690: getting the remaining hosts for this loop 15500 1727096214.90691: done getting the remaining hosts for this loop 15500 1727096214.90693: getting the next task for host managed_node1 15500 1727096214.90697: done getting next task for host managed_node1 15500 1727096214.90698: ^ task is: None 15500 1727096214.90699: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096214.90700: done queuing things up, now waiting for results queue to drain 15500 1727096214.90701: results queue empty 15500 1727096214.90702: checking for any_errors_fatal 15500 1727096214.90703: done checking for any_errors_fatal 15500 1727096214.90703: checking for max_fail_percentage 15500 1727096214.90704: done checking for max_fail_percentage 15500 1727096214.90705: checking to see if all hosts have failed and the running result is not ok 15500 1727096214.90705: done checking to see if all hosts have failed 15500 1727096214.90707: getting the next task for host managed_node1 15500 1727096214.90709: done getting next task for host managed_node1 15500 1727096214.90709: ^ task is: None 15500 1727096214.90710: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096214.90984: in VariableManager get_vars() 15500 1727096214.91010: done with get_vars() 15500 1727096214.91018: in VariableManager get_vars() 15500 1727096214.91028: done with get_vars() 15500 1727096214.91034: variable 'omit' from source: magic vars 15500 1727096214.91328: variable 'task' from source: play vars 15500 1727096214.91360: in VariableManager get_vars() 15500 1727096214.91375: done with get_vars() 15500 1727096214.91394: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_device_present.yml] ************************ 15500 1727096214.91805: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15500 1727096214.91987: getting the remaining hosts for this loop 15500 1727096214.91989: done getting the remaining hosts for this loop 15500 1727096214.91992: getting the next task for host managed_node1 15500 1727096214.91995: done getting next task for host managed_node1 15500 1727096214.91997: ^ task is: TASK: Gathering Facts 15500 1727096214.91998: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096214.92000: getting variables 15500 1727096214.92001: in VariableManager get_vars() 15500 1727096214.92010: Calling all_inventory to load vars for managed_node1 15500 1727096214.92012: Calling groups_inventory to load vars for managed_node1 15500 1727096214.92014: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096214.92020: Calling all_plugins_play to load vars for managed_node1 15500 1727096214.92022: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096214.92025: Calling groups_plugins_play to load vars for managed_node1 15500 1727096214.94721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096214.97354: done with get_vars() 15500 1727096214.97389: done getting variables 15500 1727096214.97436: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Monday 23 September 2024 08:56:54 -0400 (0:00:00.773) 0:00:15.017 ****** 15500 1727096214.97466: entering _queue_task() for managed_node1/gather_facts 15500 1727096214.97813: worker is 1 (out of 1 available) 15500 1727096214.97824: exiting _queue_task() for managed_node1/gather_facts 15500 1727096214.97836: done queuing things up, now waiting for results queue to drain 15500 1727096214.97837: waiting for pending results... 
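
The "Gathering Facts" task just queued is Ansible's implicit fact gathering for the new play: the gather_facts action loaded above ends up packaging and running the setup module on managed_node1 (the AnsiballZ_setup.py transfer that follows). Written out explicitly it would look roughly like the sketch below; this is an assumption about what run_tasks.yml:3 amounts to, not its actual contents:

# Illustrative sketch only -- an explicit equivalent of the implicit
# "Gathering Facts" step; the real playbook most likely just relies on
# fact gathering being enabled for the play.
- name: Gathering Facts
  ansible.builtin.setup:
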
15500 1727096214.98121: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15500 1727096214.98203: in run() - task 0afff68d-5257-877d-2da0-000000000219 15500 1727096214.98213: variable 'ansible_search_path' from source: unknown 15500 1727096214.98245: calling self._execute() 15500 1727096214.98321: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096214.98340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096214.98395: variable 'omit' from source: magic vars 15500 1727096214.98728: variable 'ansible_distribution_major_version' from source: facts 15500 1727096214.98972: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096214.98976: variable 'omit' from source: magic vars 15500 1727096214.98978: variable 'omit' from source: magic vars 15500 1727096214.98980: variable 'omit' from source: magic vars 15500 1727096214.98982: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096214.98989: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096214.98992: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096214.98994: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096214.98997: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096214.98999: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096214.99002: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096214.99004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096214.99104: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096214.99115: Set connection var ansible_pipelining to False 15500 1727096214.99124: Set connection var ansible_timeout to 10 15500 1727096214.99132: Set connection var ansible_shell_type to sh 15500 1727096214.99148: Set connection var ansible_shell_executable to /bin/sh 15500 1727096214.99152: Set connection var ansible_connection to ssh 15500 1727096214.99175: variable 'ansible_shell_executable' from source: unknown 15500 1727096214.99178: variable 'ansible_connection' from source: unknown 15500 1727096214.99184: variable 'ansible_module_compression' from source: unknown 15500 1727096214.99187: variable 'ansible_shell_type' from source: unknown 15500 1727096214.99190: variable 'ansible_shell_executable' from source: unknown 15500 1727096214.99192: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096214.99194: variable 'ansible_pipelining' from source: unknown 15500 1727096214.99229: variable 'ansible_timeout' from source: unknown 15500 1727096214.99235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096214.99573: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096214.99576: variable 'omit' from source: magic vars 15500 1727096214.99579: starting attempt loop 15500 1727096214.99582: running the 
handler 15500 1727096214.99584: variable 'ansible_facts' from source: unknown 15500 1727096214.99587: _low_level_execute_command(): starting 15500 1727096214.99590: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096215.00597: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096215.00619: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096215.00636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096215.00656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096215.00677: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096215.00783: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096215.00804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096215.00824: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096215.00943: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096215.02741: stdout chunk (state=3): >>>/root <<< 15500 1727096215.02827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096215.02831: stdout chunk (state=3): >>><<< 15500 1727096215.02833: stderr chunk (state=3): >>><<< 15500 1727096215.02866: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096215.02889: _low_level_execute_command(): starting 15500 1727096215.02901: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` 
echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096215.0287552-16170-207980369580859 `" && echo ansible-tmp-1727096215.0287552-16170-207980369580859="` echo /root/.ansible/tmp/ansible-tmp-1727096215.0287552-16170-207980369580859 `" ) && sleep 0' 15500 1727096215.03470: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096215.03481: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096215.03491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096215.03527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096215.03550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096215.03603: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096215.03606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096215.03612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096215.03689: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096215.06177: stdout chunk (state=3): >>>ansible-tmp-1727096215.0287552-16170-207980369580859=/root/.ansible/tmp/ansible-tmp-1727096215.0287552-16170-207980369580859 <<< 15500 1727096215.06182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096215.06185: stdout chunk (state=3): >>><<< 15500 1727096215.06187: stderr chunk (state=3): >>><<< 15500 1727096215.06191: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096215.0287552-16170-207980369580859=/root/.ansible/tmp/ansible-tmp-1727096215.0287552-16170-207980369580859 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096215.06194: variable 'ansible_module_compression' from source: unknown 15500 1727096215.06288: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15500 1727096215.06462: variable 'ansible_facts' from source: unknown 15500 1727096215.06754: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096215.0287552-16170-207980369580859/AnsiballZ_setup.py 15500 1727096215.07195: Sending initial data 15500 1727096215.07211: Sent initial data (154 bytes) 15500 1727096215.09711: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096215.09718: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096215.09789: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096215.09905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096215.09984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096215.11639: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096215.11714: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15500 1727096215.11785: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp4yzmn0yj /root/.ansible/tmp/ansible-tmp-1727096215.0287552-16170-207980369580859/AnsiballZ_setup.py <<< 15500 1727096215.11789: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096215.0287552-16170-207980369580859/AnsiballZ_setup.py" <<< 15500 1727096215.11871: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp4yzmn0yj" to remote "/root/.ansible/tmp/ansible-tmp-1727096215.0287552-16170-207980369580859/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096215.0287552-16170-207980369580859/AnsiballZ_setup.py" <<< 15500 1727096215.13535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096215.13636: stderr chunk (state=3): >>><<< 15500 1727096215.13639: stdout chunk (state=3): >>><<< 15500 1727096215.13642: done transferring module to remote 15500 1727096215.13644: _low_level_execute_command(): starting 15500 1727096215.13648: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096215.0287552-16170-207980369580859/ /root/.ansible/tmp/ansible-tmp-1727096215.0287552-16170-207980369580859/AnsiballZ_setup.py && sleep 0' 15500 1727096215.14417: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096215.14421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096215.14423: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096215.14428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 15500 1727096215.14431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096215.14496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096215.14510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096215.14608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096215.16455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096215.16489: stderr chunk (state=3): >>><<< 15500 1727096215.16492: stdout chunk (state=3): >>><<< 15500 1727096215.16508: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 
originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096215.16511: _low_level_execute_command(): starting 15500 1727096215.16516: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096215.0287552-16170-207980369580859/AnsiballZ_setup.py && sleep 0' 15500 1727096215.17223: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096215.17261: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096215.17266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096215.17329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096215.17374: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096215.17500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096215.82950: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, 
"ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "0<<< 15500 1727096215.82978: stdout chunk (state=3): >>>8", "minute": "56", "second": "55", "epoch": "1727096215", "epoch_int": "1727096215", "date": "2024-09-23", "time": "08:56:55", "iso8601_micro": "2024-09-23T12:56:55.458275Z", "iso8601": "2024-09-23T12:56:55Z", "iso8601_basic": "20240923T085655458275", 
"iso8601_basic_short": "20240923T085655", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.494140625, "5m": 0.33056640625, "15m": 0.15234375}, "ansible_interfaces": ["lo", "LSR-TST-br31", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off 
[fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "1a:3f:d0:99:f4:7d", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2946, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 585, "free": 2946}, "nocache": {"free": 3283, "used": 248}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, 
"sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 368, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797597184, "block_size": 4096, "block_total": 65519099, "block_available": 63915429, "block_used": 1603670, "inode_total": 131070960, "inode_available": 131029099, "inode_used": 41861, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15500 1727096215.85477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 15500 1727096215.85482: stdout chunk (state=3): >>><<< 15500 1727096215.85484: stderr chunk (state=3): >>><<< 15500 1727096215.85488: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "56", "second": "55", "epoch": "1727096215", "epoch_int": "1727096215", "date": "2024-09-23", "time": "08:56:55", "iso8601_micro": "2024-09-23T12:56:55.458275Z", "iso8601": "2024-09-23T12:56:55Z", "iso8601_basic": "20240923T085655458275", "iso8601_basic_short": "20240923T085655", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, 
"final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.494140625, "5m": 0.33056640625, "15m": 0.15234375}, "ansible_interfaces": ["lo", "LSR-TST-br31", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", 
"tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "1a:3f:d0:99:f4:7d", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": 
"off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2946, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 585, "free": 2946}, "nocache": {"free": 3283, "used": 248}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": 
["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 368, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797597184, "block_size": 4096, "block_total": 65519099, "block_available": 63915429, "block_used": 1603670, "inode_total": 131070960, "inode_available": 131029099, "inode_used": 41861, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
15500 1727096215.85766: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096215.0287552-16170-207980369580859/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096215.85804: _low_level_execute_command(): starting 15500 1727096215.85813: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096215.0287552-16170-207980369580859/ > /dev/null 2>&1 && sleep 0' 15500 1727096215.86578: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096215.86605: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096215.86704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096215.88654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096215.88719: stderr chunk (state=3): >>><<< 15500 1727096215.88747: stdout chunk (state=3): >>><<< 15500 1727096215.88774: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096215.88789: handler run complete 15500 1727096215.88941: variable 'ansible_facts' from source: unknown 15500 1727096215.89063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096215.89410: variable 'ansible_facts' from source: unknown 15500 1727096215.89517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096215.89669: attempt loop complete, returning result 15500 1727096215.89680: _execute() done 15500 1727096215.89688: dumping result to json 15500 1727096215.89735: done dumping result, returning 15500 1727096215.89746: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-877d-2da0-000000000219] 15500 1727096215.89754: sending task result for task 0afff68d-5257-877d-2da0-000000000219 15500 1727096215.90383: done sending task result for task 0afff68d-5257-877d-2da0-000000000219 15500 1727096215.90387: WORKER PROCESS EXITING ok: [managed_node1] 15500 1727096215.90797: no more pending results, returning what we have 15500 1727096215.90800: results queue empty 15500 1727096215.90801: checking for any_errors_fatal 15500 1727096215.90809: done checking for any_errors_fatal 15500 1727096215.90810: checking for max_fail_percentage 15500 1727096215.90812: done checking for max_fail_percentage 15500 1727096215.90812: checking to see if all hosts have failed and the running result is not ok 15500 1727096215.90813: done checking to see if all hosts have failed 15500 1727096215.90814: getting the remaining hosts for this loop 15500 1727096215.90815: done getting the remaining hosts for this loop 15500 1727096215.90818: getting the next task for host managed_node1 15500 1727096215.90823: done getting next task for host managed_node1 15500 1727096215.90825: ^ task is: TASK: meta (flush_handlers) 15500 1727096215.90835: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096215.90839: getting variables 15500 1727096215.90840: in VariableManager get_vars() 15500 1727096215.90871: Calling all_inventory to load vars for managed_node1 15500 1727096215.90874: Calling groups_inventory to load vars for managed_node1 15500 1727096215.90878: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096215.90887: Calling all_plugins_play to load vars for managed_node1 15500 1727096215.90897: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096215.90902: Calling groups_plugins_play to load vars for managed_node1 15500 1727096215.97096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096215.98674: done with get_vars() 15500 1727096215.98702: done getting variables 15500 1727096215.98764: in VariableManager get_vars() 15500 1727096215.98776: Calling all_inventory to load vars for managed_node1 15500 1727096215.98779: Calling groups_inventory to load vars for managed_node1 15500 1727096215.98781: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096215.98787: Calling all_plugins_play to load vars for managed_node1 15500 1727096215.98789: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096215.98792: Calling groups_plugins_play to load vars for managed_node1 15500 1727096215.99914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096216.01427: done with get_vars() 15500 1727096216.01459: done queuing things up, now waiting for results queue to drain 15500 1727096216.01461: results queue empty 15500 1727096216.01462: checking for any_errors_fatal 15500 1727096216.01466: done checking for any_errors_fatal 15500 1727096216.01468: checking for max_fail_percentage 15500 1727096216.01469: done checking for max_fail_percentage 15500 1727096216.01470: checking to see if all hosts have failed and the running result is not ok 15500 1727096216.01471: done checking to see if all hosts have failed 15500 1727096216.01471: getting the remaining hosts for this loop 15500 1727096216.01477: done getting the remaining hosts for this loop 15500 1727096216.01480: getting the next task for host managed_node1 15500 1727096216.01484: done getting next task for host managed_node1 15500 1727096216.01487: ^ task is: TASK: Include the task '{{ task }}' 15500 1727096216.01488: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096216.01490: getting variables 15500 1727096216.01491: in VariableManager get_vars() 15500 1727096216.01501: Calling all_inventory to load vars for managed_node1 15500 1727096216.01503: Calling groups_inventory to load vars for managed_node1 15500 1727096216.01505: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096216.01511: Calling all_plugins_play to load vars for managed_node1 15500 1727096216.01513: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096216.01516: Calling groups_plugins_play to load vars for managed_node1 15500 1727096216.02727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096216.04328: done with get_vars() 15500 1727096216.04353: done getting variables 15500 1727096216.04512: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_device_present.yml'] ********************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Monday 23 September 2024 08:56:56 -0400 (0:00:01.070) 0:00:16.088 ****** 15500 1727096216.04537: entering _queue_task() for managed_node1/include_tasks 15500 1727096216.04895: worker is 1 (out of 1 available) 15500 1727096216.04910: exiting _queue_task() for managed_node1/include_tasks 15500 1727096216.04923: done queuing things up, now waiting for results queue to drain 15500 1727096216.04925: waiting for pending results... 15500 1727096216.05289: running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_device_present.yml' 15500 1727096216.05426: in run() - task 0afff68d-5257-877d-2da0-00000000002d 15500 1727096216.05446: variable 'ansible_search_path' from source: unknown 15500 1727096216.05519: calling self._execute() 15500 1727096216.05627: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096216.05631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096216.05635: variable 'omit' from source: magic vars 15500 1727096216.06029: variable 'ansible_distribution_major_version' from source: facts 15500 1727096216.06061: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096216.06065: variable 'task' from source: play vars 15500 1727096216.06171: variable 'task' from source: play vars 15500 1727096216.06175: _execute() done 15500 1727096216.06178: dumping result to json 15500 1727096216.06181: done dumping result, returning 15500 1727096216.06191: done running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_device_present.yml' [0afff68d-5257-877d-2da0-00000000002d] 15500 1727096216.06277: sending task result for task 0afff68d-5257-877d-2da0-00000000002d 15500 1727096216.06354: done sending task result for task 0afff68d-5257-877d-2da0-00000000002d 15500 1727096216.06357: WORKER PROCESS EXITING 15500 1727096216.06406: no more pending results, returning what we have 15500 1727096216.06411: in VariableManager get_vars() 15500 1727096216.06449: Calling all_inventory to load vars for managed_node1 15500 1727096216.06451: Calling groups_inventory to load vars for managed_node1 15500 1727096216.06455: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096216.06471: Calling all_plugins_play to load vars for managed_node1 15500 1727096216.06475: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096216.06478: Calling groups_plugins_play to load vars for 
managed_node1 15500 1727096216.08063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096216.09651: done with get_vars() 15500 1727096216.09677: variable 'ansible_search_path' from source: unknown 15500 1727096216.09698: we have included files to process 15500 1727096216.09700: generating all_blocks data 15500 1727096216.09701: done generating all_blocks data 15500 1727096216.09702: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15500 1727096216.09703: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15500 1727096216.09706: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15500 1727096216.09876: in VariableManager get_vars() 15500 1727096216.09894: done with get_vars() 15500 1727096216.10011: done processing included file 15500 1727096216.10013: iterating over new_blocks loaded from include file 15500 1727096216.10014: in VariableManager get_vars() 15500 1727096216.10031: done with get_vars() 15500 1727096216.10033: filtering new block on tags 15500 1727096216.10051: done filtering new block on tags 15500 1727096216.10053: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node1 15500 1727096216.10058: extending task lists for all hosts with included blocks 15500 1727096216.10090: done extending task lists 15500 1727096216.10092: done processing included files 15500 1727096216.10092: results queue empty 15500 1727096216.10093: checking for any_errors_fatal 15500 1727096216.10095: done checking for any_errors_fatal 15500 1727096216.10095: checking for max_fail_percentage 15500 1727096216.10096: done checking for max_fail_percentage 15500 1727096216.10097: checking to see if all hosts have failed and the running result is not ok 15500 1727096216.10098: done checking to see if all hosts have failed 15500 1727096216.10099: getting the remaining hosts for this loop 15500 1727096216.10100: done getting the remaining hosts for this loop 15500 1727096216.10102: getting the next task for host managed_node1 15500 1727096216.10106: done getting next task for host managed_node1 15500 1727096216.10108: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15500 1727096216.10111: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096216.10113: getting variables 15500 1727096216.10114: in VariableManager get_vars() 15500 1727096216.10122: Calling all_inventory to load vars for managed_node1 15500 1727096216.10124: Calling groups_inventory to load vars for managed_node1 15500 1727096216.10126: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096216.10136: Calling all_plugins_play to load vars for managed_node1 15500 1727096216.10139: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096216.10142: Calling groups_plugins_play to load vars for managed_node1 15500 1727096216.11386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096216.13021: done with get_vars() 15500 1727096216.13051: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Monday 23 September 2024 08:56:56 -0400 (0:00:00.085) 0:00:16.174 ****** 15500 1727096216.13138: entering _queue_task() for managed_node1/include_tasks 15500 1727096216.13510: worker is 1 (out of 1 available) 15500 1727096216.13523: exiting _queue_task() for managed_node1/include_tasks 15500 1727096216.13536: done queuing things up, now waiting for results queue to drain 15500 1727096216.13538: waiting for pending results... 15500 1727096216.14387: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 15500 1727096216.14395: in run() - task 0afff68d-5257-877d-2da0-00000000022a 15500 1727096216.14399: variable 'ansible_search_path' from source: unknown 15500 1727096216.14403: variable 'ansible_search_path' from source: unknown 15500 1727096216.14776: calling self._execute() 15500 1727096216.14781: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096216.14785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096216.14789: variable 'omit' from source: magic vars 15500 1727096216.15775: variable 'ansible_distribution_major_version' from source: facts 15500 1727096216.15780: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096216.15783: _execute() done 15500 1727096216.15786: dumping result to json 15500 1727096216.15788: done dumping result, returning 15500 1727096216.15800: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-877d-2da0-00000000022a] 15500 1727096216.15810: sending task result for task 0afff68d-5257-877d-2da0-00000000022a 15500 1727096216.15933: done sending task result for task 0afff68d-5257-877d-2da0-00000000022a 15500 1727096216.15965: no more pending results, returning what we have 15500 1727096216.15972: in VariableManager get_vars() 15500 1727096216.16007: Calling all_inventory to load vars for managed_node1 15500 1727096216.16010: Calling groups_inventory to load vars for managed_node1 15500 1727096216.16013: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096216.16027: Calling all_plugins_play to load vars for managed_node1 15500 1727096216.16029: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096216.16031: Calling groups_plugins_play to load vars for managed_node1 15500 1727096216.16747: WORKER PROCESS EXITING 15500 1727096216.17981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 15500 1727096216.19717: done with get_vars() 15500 1727096216.19739: variable 'ansible_search_path' from source: unknown 15500 1727096216.19741: variable 'ansible_search_path' from source: unknown 15500 1727096216.19753: variable 'task' from source: play vars 15500 1727096216.19876: variable 'task' from source: play vars 15500 1727096216.19911: we have included files to process 15500 1727096216.19912: generating all_blocks data 15500 1727096216.19914: done generating all_blocks data 15500 1727096216.19916: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15500 1727096216.19917: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15500 1727096216.19920: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15500 1727096216.20369: done processing included file 15500 1727096216.20371: iterating over new_blocks loaded from include file 15500 1727096216.20373: in VariableManager get_vars() 15500 1727096216.20387: done with get_vars() 15500 1727096216.20389: filtering new block on tags 15500 1727096216.20404: done filtering new block on tags 15500 1727096216.20407: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 15500 1727096216.20412: extending task lists for all hosts with included blocks 15500 1727096216.20517: done extending task lists 15500 1727096216.20518: done processing included files 15500 1727096216.20519: results queue empty 15500 1727096216.20520: checking for any_errors_fatal 15500 1727096216.20523: done checking for any_errors_fatal 15500 1727096216.20524: checking for max_fail_percentage 15500 1727096216.20525: done checking for max_fail_percentage 15500 1727096216.20526: checking to see if all hosts have failed and the running result is not ok 15500 1727096216.20527: done checking to see if all hosts have failed 15500 1727096216.20527: getting the remaining hosts for this loop 15500 1727096216.20529: done getting the remaining hosts for this loop 15500 1727096216.20531: getting the next task for host managed_node1 15500 1727096216.20534: done getting next task for host managed_node1 15500 1727096216.20536: ^ task is: TASK: Get stat for interface {{ interface }} 15500 1727096216.20539: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096216.20541: getting variables 15500 1727096216.20542: in VariableManager get_vars() 15500 1727096216.20551: Calling all_inventory to load vars for managed_node1 15500 1727096216.20553: Calling groups_inventory to load vars for managed_node1 15500 1727096216.20555: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096216.20563: Calling all_plugins_play to load vars for managed_node1 15500 1727096216.20566: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096216.20572: Calling groups_plugins_play to load vars for managed_node1 15500 1727096216.21742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096216.23433: done with get_vars() 15500 1727096216.23468: done getting variables 15500 1727096216.23603: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 08:56:56 -0400 (0:00:00.104) 0:00:16.279 ****** 15500 1727096216.23633: entering _queue_task() for managed_node1/stat 15500 1727096216.23998: worker is 1 (out of 1 available) 15500 1727096216.24009: exiting _queue_task() for managed_node1/stat 15500 1727096216.24022: done queuing things up, now waiting for results queue to drain 15500 1727096216.24023: waiting for pending results... 15500 1727096216.24302: running TaskExecutor() for managed_node1/TASK: Get stat for interface LSR-TST-br31 15500 1727096216.24434: in run() - task 0afff68d-5257-877d-2da0-000000000235 15500 1727096216.24454: variable 'ansible_search_path' from source: unknown 15500 1727096216.24464: variable 'ansible_search_path' from source: unknown 15500 1727096216.24514: calling self._execute() 15500 1727096216.24613: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096216.24625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096216.24642: variable 'omit' from source: magic vars 15500 1727096216.25029: variable 'ansible_distribution_major_version' from source: facts 15500 1727096216.25050: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096216.25066: variable 'omit' from source: magic vars 15500 1727096216.25118: variable 'omit' from source: magic vars 15500 1727096216.25225: variable 'interface' from source: set_fact 15500 1727096216.25251: variable 'omit' from source: magic vars 15500 1727096216.25367: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096216.25372: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096216.25377: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096216.25399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096216.25416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096216.25453: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096216.25466: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096216.25482: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 15500 1727096216.25595: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096216.25606: Set connection var ansible_pipelining to False 15500 1727096216.25617: Set connection var ansible_timeout to 10 15500 1727096216.25624: Set connection var ansible_shell_type to sh 15500 1727096216.25634: Set connection var ansible_shell_executable to /bin/sh 15500 1727096216.25692: Set connection var ansible_connection to ssh 15500 1727096216.25695: variable 'ansible_shell_executable' from source: unknown 15500 1727096216.25698: variable 'ansible_connection' from source: unknown 15500 1727096216.25700: variable 'ansible_module_compression' from source: unknown 15500 1727096216.25703: variable 'ansible_shell_type' from source: unknown 15500 1727096216.25705: variable 'ansible_shell_executable' from source: unknown 15500 1727096216.25708: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096216.25711: variable 'ansible_pipelining' from source: unknown 15500 1727096216.25720: variable 'ansible_timeout' from source: unknown 15500 1727096216.25729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096216.25954: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15500 1727096216.26017: variable 'omit' from source: magic vars 15500 1727096216.26020: starting attempt loop 15500 1727096216.26023: running the handler 15500 1727096216.26026: _low_level_execute_command(): starting 15500 1727096216.26029: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096216.26766: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096216.26894: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096216.26943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096216.27026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096216.28765: stdout chunk (state=3): >>>/root <<< 15500 1727096216.29098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096216.29192: stderr chunk (state=3): >>><<< 15500 1727096216.29196: stdout chunk (state=3): >>><<< 15500 1727096216.29199: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096216.29202: _low_level_execute_command(): starting 15500 1727096216.29205: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096216.291536-16216-90159458144080 `" && echo ansible-tmp-1727096216.291536-16216-90159458144080="` echo /root/.ansible/tmp/ansible-tmp-1727096216.291536-16216-90159458144080 `" ) && sleep 0' 15500 1727096216.29855: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 15500 1727096216.29871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096216.29966: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096216.30002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096216.30018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096216.30116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096216.32101: stdout chunk (state=3): >>>ansible-tmp-1727096216.291536-16216-90159458144080=/root/.ansible/tmp/ansible-tmp-1727096216.291536-16216-90159458144080 <<< 15500 1727096216.32329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096216.32575: stdout chunk (state=3): >>><<< 15500 1727096216.32578: stderr chunk (state=3): >>><<< 15500 1727096216.32581: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727096216.291536-16216-90159458144080=/root/.ansible/tmp/ansible-tmp-1727096216.291536-16216-90159458144080 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096216.32584: variable 'ansible_module_compression' from source: unknown 15500 1727096216.32630: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15500 1727096216.32683: variable 'ansible_facts' from source: unknown 15500 1727096216.32776: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096216.291536-16216-90159458144080/AnsiballZ_stat.py 15500 1727096216.33019: Sending initial data 15500 1727096216.33022: Sent initial data (151 bytes) 15500 1727096216.33549: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096216.33666: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096216.33691: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096216.33705: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096216.33802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096216.35416: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15500 1727096216.35441: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports 
extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096216.35532: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15500 1727096216.35607: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpnhyreqen /root/.ansible/tmp/ansible-tmp-1727096216.291536-16216-90159458144080/AnsiballZ_stat.py <<< 15500 1727096216.35625: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096216.291536-16216-90159458144080/AnsiballZ_stat.py" <<< 15500 1727096216.35677: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpnhyreqen" to remote "/root/.ansible/tmp/ansible-tmp-1727096216.291536-16216-90159458144080/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096216.291536-16216-90159458144080/AnsiballZ_stat.py" <<< 15500 1727096216.36677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096216.36681: stdout chunk (state=3): >>><<< 15500 1727096216.36684: stderr chunk (state=3): >>><<< 15500 1727096216.36693: done transferring module to remote 15500 1727096216.36708: _low_level_execute_command(): starting 15500 1727096216.36717: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096216.291536-16216-90159458144080/ /root/.ansible/tmp/ansible-tmp-1727096216.291536-16216-90159458144080/AnsiballZ_stat.py && sleep 0' 15500 1727096216.37360: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096216.37376: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096216.37391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096216.37450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096216.37512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096216.37538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096216.37560: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096216.37670: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096216.39616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096216.39620: stdout chunk (state=3): >>><<< 15500 1727096216.39628: stderr chunk (state=3): >>><<< 15500 1727096216.39732: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096216.39736: _low_level_execute_command(): starting 15500 1727096216.39739: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096216.291536-16216-90159458144080/AnsiballZ_stat.py && sleep 0' 15500 1727096216.40388: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096216.40409: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096216.40527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096216.56559: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/LSR-TST-br31", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27720, "dev": 23, "nlink": 1, "atime": 1727096213.8800035, "mtime": 1727096213.8800035, "ctime": 1727096213.8800035, "wusr": true, "rusr": true, "xusr": true, 
"wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15500 1727096216.58075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 15500 1727096216.58080: stdout chunk (state=3): >>><<< 15500 1727096216.58082: stderr chunk (state=3): >>><<< 15500 1727096216.58086: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/LSR-TST-br31", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27720, "dev": 23, "nlink": 1, "atime": 1727096213.8800035, "mtime": 1727096213.8800035, "ctime": 1727096213.8800035, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
15500 1727096216.58105: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096216.291536-16216-90159458144080/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096216.58121: _low_level_execute_command(): starting 15500 1727096216.58130: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096216.291536-16216-90159458144080/ > /dev/null 2>&1 && sleep 0' 15500 1727096216.58793: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096216.58810: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096216.58875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096216.58937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096216.58955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096216.58987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096216.59090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096216.60962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096216.60996: stderr chunk (state=3): >>><<< 15500 1727096216.60999: stdout chunk (state=3): >>><<< 15500 1727096216.61010: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096216.61042: handler run complete 15500 1727096216.61056: attempt loop complete, returning result 15500 1727096216.61062: _execute() done 15500 1727096216.61064: dumping result to json 15500 1727096216.61066: done dumping result, returning 15500 1727096216.61076: done running TaskExecutor() for managed_node1/TASK: Get stat for interface LSR-TST-br31 [0afff68d-5257-877d-2da0-000000000235] 15500 1727096216.61080: sending task result for task 0afff68d-5257-877d-2da0-000000000235 15500 1727096216.61184: done sending task result for task 0afff68d-5257-877d-2da0-000000000235 15500 1727096216.61186: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "atime": 1727096213.8800035, "block_size": 4096, "blocks": 0, "ctime": 1727096213.8800035, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27720, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "mode": "0777", "mtime": 1727096213.8800035, "nlink": 1, "path": "/sys/class/net/LSR-TST-br31", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 15500 1727096216.61274: no more pending results, returning what we have 15500 1727096216.61278: results queue empty 15500 1727096216.61278: checking for any_errors_fatal 15500 1727096216.61280: done checking for any_errors_fatal 15500 1727096216.61281: checking for max_fail_percentage 15500 1727096216.61282: done checking for max_fail_percentage 15500 1727096216.61283: checking to see if all hosts have failed and the running result is not ok 15500 1727096216.61284: done checking to see if all hosts have failed 15500 1727096216.61285: getting the remaining hosts for this loop 15500 1727096216.61286: done getting the remaining hosts for this loop 15500 1727096216.61290: getting the next task for host managed_node1 15500 1727096216.61301: done getting next task for host managed_node1 15500 1727096216.61305: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 15500 1727096216.61307: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096216.61312: getting variables 15500 1727096216.61314: in VariableManager get_vars() 15500 1727096216.61340: Calling all_inventory to load vars for managed_node1 15500 1727096216.61343: Calling groups_inventory to load vars for managed_node1 15500 1727096216.61346: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096216.61355: Calling all_plugins_play to load vars for managed_node1 15500 1727096216.61360: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096216.61362: Calling groups_plugins_play to load vars for managed_node1 15500 1727096216.62249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096216.63620: done with get_vars() 15500 1727096216.63639: done getting variables 15500 1727096216.63688: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15500 1727096216.63777: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'LSR-TST-br31'] ******************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 08:56:56 -0400 (0:00:00.401) 0:00:16.681 ****** 15500 1727096216.63800: entering _queue_task() for managed_node1/assert 15500 1727096216.64062: worker is 1 (out of 1 available) 15500 1727096216.64076: exiting _queue_task() for managed_node1/assert 15500 1727096216.64088: done queuing things up, now waiting for results queue to drain 15500 1727096216.64089: waiting for pending results... 
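
The task queued next, "Assert that the interface is present - 'LSR-TST-br31'" (assert_device_present.yml:5), runs entirely on the controller: no module transfer or remote command appears in the trace for it, only the evaluation of a conditional against the interface_stat result registered a step earlier. A minimal sketch of what that assert might look like, assuming it simply checks stat.exists (the conditional the trace below shows being evaluated):

    # Hypothetical sketch of the assert in tasks/assert_device_present.yml
    # (interface_stat.stat.exists is the conditional visible in the trace)
    - name: Assert that the interface is present - '{{ interface }}'
      ansible.builtin.assert:
        that:
          - interface_stat.stat.exists

The "All assertions passed" result further down confirms that conditional evaluated True against the stat gathered above.
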
15500 1727096216.64255: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'LSR-TST-br31' 15500 1727096216.64324: in run() - task 0afff68d-5257-877d-2da0-00000000022b 15500 1727096216.64336: variable 'ansible_search_path' from source: unknown 15500 1727096216.64340: variable 'ansible_search_path' from source: unknown 15500 1727096216.64370: calling self._execute() 15500 1727096216.64437: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096216.64442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096216.64452: variable 'omit' from source: magic vars 15500 1727096216.64719: variable 'ansible_distribution_major_version' from source: facts 15500 1727096216.64729: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096216.64735: variable 'omit' from source: magic vars 15500 1727096216.64767: variable 'omit' from source: magic vars 15500 1727096216.64836: variable 'interface' from source: set_fact 15500 1727096216.64851: variable 'omit' from source: magic vars 15500 1727096216.64890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096216.64917: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096216.64934: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096216.64947: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096216.64956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096216.64985: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096216.64988: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096216.64991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096216.65062: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096216.65065: Set connection var ansible_pipelining to False 15500 1727096216.65080: Set connection var ansible_timeout to 10 15500 1727096216.65083: Set connection var ansible_shell_type to sh 15500 1727096216.65086: Set connection var ansible_shell_executable to /bin/sh 15500 1727096216.65088: Set connection var ansible_connection to ssh 15500 1727096216.65105: variable 'ansible_shell_executable' from source: unknown 15500 1727096216.65108: variable 'ansible_connection' from source: unknown 15500 1727096216.65111: variable 'ansible_module_compression' from source: unknown 15500 1727096216.65113: variable 'ansible_shell_type' from source: unknown 15500 1727096216.65116: variable 'ansible_shell_executable' from source: unknown 15500 1727096216.65118: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096216.65120: variable 'ansible_pipelining' from source: unknown 15500 1727096216.65123: variable 'ansible_timeout' from source: unknown 15500 1727096216.65128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096216.65232: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 15500 1727096216.65241: variable 'omit' from source: magic vars 15500 1727096216.65246: starting attempt loop 15500 1727096216.65249: running the handler 15500 1727096216.65355: variable 'interface_stat' from source: set_fact 15500 1727096216.65371: Evaluated conditional (interface_stat.stat.exists): True 15500 1727096216.65377: handler run complete 15500 1727096216.65393: attempt loop complete, returning result 15500 1727096216.65402: _execute() done 15500 1727096216.65405: dumping result to json 15500 1727096216.65407: done dumping result, returning 15500 1727096216.65409: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'LSR-TST-br31' [0afff68d-5257-877d-2da0-00000000022b] 15500 1727096216.65412: sending task result for task 0afff68d-5257-877d-2da0-00000000022b 15500 1727096216.65735: done sending task result for task 0afff68d-5257-877d-2da0-00000000022b 15500 1727096216.65738: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15500 1727096216.65795: no more pending results, returning what we have 15500 1727096216.65798: results queue empty 15500 1727096216.65799: checking for any_errors_fatal 15500 1727096216.65807: done checking for any_errors_fatal 15500 1727096216.65808: checking for max_fail_percentage 15500 1727096216.65810: done checking for max_fail_percentage 15500 1727096216.65810: checking to see if all hosts have failed and the running result is not ok 15500 1727096216.65811: done checking to see if all hosts have failed 15500 1727096216.65812: getting the remaining hosts for this loop 15500 1727096216.65814: done getting the remaining hosts for this loop 15500 1727096216.65817: getting the next task for host managed_node1 15500 1727096216.65824: done getting next task for host managed_node1 15500 1727096216.65826: ^ task is: TASK: meta (flush_handlers) 15500 1727096216.65828: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096216.65831: getting variables 15500 1727096216.65834: in VariableManager get_vars() 15500 1727096216.65873: Calling all_inventory to load vars for managed_node1 15500 1727096216.65876: Calling groups_inventory to load vars for managed_node1 15500 1727096216.65880: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096216.65891: Calling all_plugins_play to load vars for managed_node1 15500 1727096216.65894: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096216.65897: Calling groups_plugins_play to load vars for managed_node1 15500 1727096216.67499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096216.69262: done with get_vars() 15500 1727096216.69294: done getting variables 15500 1727096216.69382: in VariableManager get_vars() 15500 1727096216.69393: Calling all_inventory to load vars for managed_node1 15500 1727096216.69395: Calling groups_inventory to load vars for managed_node1 15500 1727096216.69398: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096216.69403: Calling all_plugins_play to load vars for managed_node1 15500 1727096216.69405: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096216.69408: Calling groups_plugins_play to load vars for managed_node1 15500 1727096216.70818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096216.72512: done with get_vars() 15500 1727096216.72547: done queuing things up, now waiting for results queue to drain 15500 1727096216.72550: results queue empty 15500 1727096216.72550: checking for any_errors_fatal 15500 1727096216.72553: done checking for any_errors_fatal 15500 1727096216.72554: checking for max_fail_percentage 15500 1727096216.72555: done checking for max_fail_percentage 15500 1727096216.72556: checking to see if all hosts have failed and the running result is not ok 15500 1727096216.72559: done checking to see if all hosts have failed 15500 1727096216.72566: getting the remaining hosts for this loop 15500 1727096216.72569: done getting the remaining hosts for this loop 15500 1727096216.72572: getting the next task for host managed_node1 15500 1727096216.72576: done getting next task for host managed_node1 15500 1727096216.72578: ^ task is: TASK: meta (flush_handlers) 15500 1727096216.72579: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096216.72582: getting variables 15500 1727096216.72583: in VariableManager get_vars() 15500 1727096216.72599: Calling all_inventory to load vars for managed_node1 15500 1727096216.72602: Calling groups_inventory to load vars for managed_node1 15500 1727096216.72604: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096216.72610: Calling all_plugins_play to load vars for managed_node1 15500 1727096216.72612: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096216.72615: Calling groups_plugins_play to load vars for managed_node1 15500 1727096216.73878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096216.75627: done with get_vars() 15500 1727096216.75655: done getting variables 15500 1727096216.75710: in VariableManager get_vars() 15500 1727096216.75719: Calling all_inventory to load vars for managed_node1 15500 1727096216.75721: Calling groups_inventory to load vars for managed_node1 15500 1727096216.75723: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096216.75728: Calling all_plugins_play to load vars for managed_node1 15500 1727096216.75730: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096216.75733: Calling groups_plugins_play to load vars for managed_node1 15500 1727096216.76933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096216.78586: done with get_vars() 15500 1727096216.78616: done queuing things up, now waiting for results queue to drain 15500 1727096216.78618: results queue empty 15500 1727096216.78619: checking for any_errors_fatal 15500 1727096216.78621: done checking for any_errors_fatal 15500 1727096216.78621: checking for max_fail_percentage 15500 1727096216.78622: done checking for max_fail_percentage 15500 1727096216.78623: checking to see if all hosts have failed and the running result is not ok 15500 1727096216.78624: done checking to see if all hosts have failed 15500 1727096216.78624: getting the remaining hosts for this loop 15500 1727096216.78625: done getting the remaining hosts for this loop 15500 1727096216.78628: getting the next task for host managed_node1 15500 1727096216.78631: done getting next task for host managed_node1 15500 1727096216.78632: ^ task is: None 15500 1727096216.78639: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096216.78641: done queuing things up, now waiting for results queue to drain 15500 1727096216.78642: results queue empty 15500 1727096216.78642: checking for any_errors_fatal 15500 1727096216.78643: done checking for any_errors_fatal 15500 1727096216.78644: checking for max_fail_percentage 15500 1727096216.78645: done checking for max_fail_percentage 15500 1727096216.78645: checking to see if all hosts have failed and the running result is not ok 15500 1727096216.78646: done checking to see if all hosts have failed 15500 1727096216.78647: getting the next task for host managed_node1 15500 1727096216.78650: done getting next task for host managed_node1 15500 1727096216.78650: ^ task is: None 15500 1727096216.78652: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096216.78695: in VariableManager get_vars() 15500 1727096216.78712: done with get_vars() 15500 1727096216.78718: in VariableManager get_vars() 15500 1727096216.78728: done with get_vars() 15500 1727096216.78732: variable 'omit' from source: magic vars 15500 1727096216.78859: variable 'task' from source: play vars 15500 1727096216.78892: in VariableManager get_vars() 15500 1727096216.78903: done with get_vars() 15500 1727096216.78922: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_profile_present.yml] *********************** 15500 1727096216.79151: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15500 1727096216.79183: getting the remaining hosts for this loop 15500 1727096216.79184: done getting the remaining hosts for this loop 15500 1727096216.79187: getting the next task for host managed_node1 15500 1727096216.79189: done getting next task for host managed_node1 15500 1727096216.79191: ^ task is: TASK: Gathering Facts 15500 1727096216.79192: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096216.79194: getting variables 15500 1727096216.79195: in VariableManager get_vars() 15500 1727096216.79202: Calling all_inventory to load vars for managed_node1 15500 1727096216.79204: Calling groups_inventory to load vars for managed_node1 15500 1727096216.79206: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096216.79211: Calling all_plugins_play to load vars for managed_node1 15500 1727096216.79213: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096216.79216: Calling groups_plugins_play to load vars for managed_node1 15500 1727096216.80566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096216.82193: done with get_vars() 15500 1727096216.82215: done getting variables 15500 1727096216.82271: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Monday 23 September 2024 08:56:56 -0400 (0:00:00.184) 0:00:16.866 ****** 15500 1727096216.82298: entering _queue_task() for managed_node1/gather_facts 15500 1727096216.82659: worker is 1 (out of 1 available) 15500 1727096216.82674: exiting _queue_task() for managed_node1/gather_facts 15500 1727096216.82686: done queuing things up, now waiting for results queue to drain 15500 1727096216.82688: waiting for pending results... 
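
At this point the previous play is finished and the linear strategy moves on to the play "Run the tasklist tasks/assert_profile_present.yml". Its first task is a fact-gathering pass (run_tasks.yml:3), which matters because the tasks in this trace are gated on the fact-derived conditional ansible_distribution_major_version != '6'. Since both the play name and the included file come from the 'task' play variable, run_tasks.yml plausibly looks something like the sketch below; this is an assumption based only on the names visible in the trace, not the real file.

    # Hypothetical sketch of playbooks/run_tasks.yml, inferred from the play name
    # and the 'task' play variable seen in the trace; the real file may differ.
    - name: Run the tasklist {{ task }}
      hosts: all
      gather_facts: true        # yields the "Gathering Facts" task seen at run_tasks.yml:3
      tasks:
        - name: Include the tasklist
          ansible.builtin.include_tasks: "{{ task }}"

The AnsiballZ_setup.py transfer that follows is the implementation of that Gathering Facts task; the facts payload the module starts returning at the end of this excerpt repopulates the ansible_* facts for the new play.
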
15500 1727096216.82904: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15500 1727096216.83076: in run() - task 0afff68d-5257-877d-2da0-00000000024e 15500 1727096216.83081: variable 'ansible_search_path' from source: unknown 15500 1727096216.83083: calling self._execute() 15500 1727096216.83160: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096216.83180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096216.83196: variable 'omit' from source: magic vars 15500 1727096216.83578: variable 'ansible_distribution_major_version' from source: facts 15500 1727096216.83595: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096216.83608: variable 'omit' from source: magic vars 15500 1727096216.83872: variable 'omit' from source: magic vars 15500 1727096216.83875: variable 'omit' from source: magic vars 15500 1727096216.83878: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096216.83880: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096216.83883: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096216.83885: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096216.83888: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096216.83890: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096216.83892: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096216.83894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096216.83970: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096216.83981: Set connection var ansible_pipelining to False 15500 1727096216.83991: Set connection var ansible_timeout to 10 15500 1727096216.83998: Set connection var ansible_shell_type to sh 15500 1727096216.84012: Set connection var ansible_shell_executable to /bin/sh 15500 1727096216.84022: Set connection var ansible_connection to ssh 15500 1727096216.84048: variable 'ansible_shell_executable' from source: unknown 15500 1727096216.84055: variable 'ansible_connection' from source: unknown 15500 1727096216.84062: variable 'ansible_module_compression' from source: unknown 15500 1727096216.84072: variable 'ansible_shell_type' from source: unknown 15500 1727096216.84081: variable 'ansible_shell_executable' from source: unknown 15500 1727096216.84088: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096216.84096: variable 'ansible_pipelining' from source: unknown 15500 1727096216.84103: variable 'ansible_timeout' from source: unknown 15500 1727096216.84116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096216.84298: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096216.84315: variable 'omit' from source: magic vars 15500 1727096216.84324: starting attempt loop 15500 1727096216.84335: running the 
handler 15500 1727096216.84355: variable 'ansible_facts' from source: unknown 15500 1727096216.84380: _low_level_execute_command(): starting 15500 1727096216.84393: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096216.85190: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096216.85221: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096216.85241: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096216.85254: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096216.85362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096216.87089: stdout chunk (state=3): >>>/root <<< 15500 1727096216.87250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096216.87254: stdout chunk (state=3): >>><<< 15500 1727096216.87257: stderr chunk (state=3): >>><<< 15500 1727096216.87375: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096216.87379: _low_level_execute_command(): starting 15500 1727096216.87382: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096216.8728657-16232-21845519995982 `" && echo ansible-tmp-1727096216.8728657-16232-21845519995982="` echo 
/root/.ansible/tmp/ansible-tmp-1727096216.8728657-16232-21845519995982 `" ) && sleep 0' 15500 1727096216.87979: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096216.88042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096216.88127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096216.88152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096216.88266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096216.90270: stdout chunk (state=3): >>>ansible-tmp-1727096216.8728657-16232-21845519995982=/root/.ansible/tmp/ansible-tmp-1727096216.8728657-16232-21845519995982 <<< 15500 1727096216.90372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096216.90421: stderr chunk (state=3): >>><<< 15500 1727096216.90424: stdout chunk (state=3): >>><<< 15500 1727096216.90442: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096216.8728657-16232-21845519995982=/root/.ansible/tmp/ansible-tmp-1727096216.8728657-16232-21845519995982 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096216.90573: variable 'ansible_module_compression' from source: unknown 15500 1727096216.90578: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15500 1727096216.90605: variable 'ansible_facts' from source: 
unknown 15500 1727096216.90800: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096216.8728657-16232-21845519995982/AnsiballZ_setup.py 15500 1727096216.90994: Sending initial data 15500 1727096216.90997: Sent initial data (153 bytes) 15500 1727096216.91574: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096216.91589: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096216.91605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096216.91683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096216.91724: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096216.91744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096216.91758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096216.91863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096216.93478: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096216.93534: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15500 1727096216.93601: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpugohahu0 /root/.ansible/tmp/ansible-tmp-1727096216.8728657-16232-21845519995982/AnsiballZ_setup.py <<< 15500 1727096216.93604: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096216.8728657-16232-21845519995982/AnsiballZ_setup.py" <<< 15500 1727096216.93669: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpugohahu0" to remote "/root/.ansible/tmp/ansible-tmp-1727096216.8728657-16232-21845519995982/AnsiballZ_setup.py" <<< 15500 1727096216.93672: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096216.8728657-16232-21845519995982/AnsiballZ_setup.py" <<< 15500 1727096216.94817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096216.94866: stderr chunk (state=3): >>><<< 15500 1727096216.94872: stdout chunk (state=3): >>><<< 15500 1727096216.94915: done transferring module to remote 15500 1727096216.94918: _low_level_execute_command(): starting 15500 1727096216.94920: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096216.8728657-16232-21845519995982/ /root/.ansible/tmp/ansible-tmp-1727096216.8728657-16232-21845519995982/AnsiballZ_setup.py && sleep 0' 15500 1727096216.95677: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096216.95702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096216.95789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096216.97629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096216.97653: stderr chunk (state=3): >>><<< 15500 1727096216.97662: stdout chunk (state=3): >>><<< 15500 1727096216.97735: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096216.97738: _low_level_execute_command(): starting 15500 1727096216.97741: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096216.8728657-16232-21845519995982/AnsiballZ_setup.py && sleep 0' 15500 1727096216.98119: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096216.98123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096216.98125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096216.98127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096216.98182: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096216.98186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096216.98264: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096217.62701: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, 
"crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.4541015625, "5m": 0.32470703125, "15m": 0.1513671875}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": 
true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "56", "second": "57", "epoch": "1727096217", "epoch_int": "1727096217", "date": "2024-09-23", "time": "08:56:57", "iso8601_micro": "2024-09-23T12:56:57.263098Z", "iso8601": "2024-09-23T12:56:57Z", "iso8601_basic": "20240923T085657263098", "iso8601_basic_short": "20240923T085657", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2943, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 588, "free": 2943}, "nocache": {"free": 3280, "used": 251}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 370, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797597184, "block_size": 4096, "block_total": 65519099, "block_available": 63915429, "block_used": 1603670, "inode_total": 131070960, "inode_available": 131029099, "inode_used": 41861, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_interfaces": ["lo", "LSR-TST-br31", "eth0"], "ansible_lo": {"device": "lo", "mtu": 
65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off 
[fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "1a:3f:d0:99:f4:7d", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off 
[fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15500 1727096217.64598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096217.64622: stderr chunk (state=3): >>>Shared connection to 10.31.11.125 closed. <<< 15500 1727096217.64708: stderr chunk (state=3): >>><<< 15500 1727096217.64721: stdout chunk (state=3): >>><<< 15500 1727096217.64816: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", 
"ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.4541015625, "5m": 0.32470703125, "15m": 0.1513671875}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "56", "second": "57", "epoch": "1727096217", "epoch_int": "1727096217", "date": "2024-09-23", "time": "08:56:57", "iso8601_micro": "2024-09-23T12:56:57.263098Z", "iso8601": "2024-09-23T12:56:57Z", "iso8601_basic": "20240923T085657263098", "iso8601_basic_short": "20240923T085657", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, 
"ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2943, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 588, "free": 2943}, "nocache": {"free": 3280, "used": 251}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 370, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797597184, "block_size": 4096, "block_total": 65519099, "block_available": 63915429, "block_used": 1603670, "inode_total": 131070960, "inode_available": 131029099, "inode_used": 41861, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_interfaces": ["lo", "LSR-TST-br31", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", 
"large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off 
[fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "1a:3f:d0:99:f4:7d", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, 
"ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 15500 1727096217.65212: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096216.8728657-16232-21845519995982/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096217.65229: _low_level_execute_command(): starting 15500 1727096217.65239: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096216.8728657-16232-21845519995982/ > /dev/null 2>&1 && sleep 0' 15500 1727096217.65877: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096217.65891: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096217.65981: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096217.66001: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096217.66022: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096217.66126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096217.68001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096217.68005: stdout chunk (state=3): >>><<< 15500 1727096217.68173: stderr chunk (state=3): >>><<< 15500 1727096217.68176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096217.68179: handler run complete 15500 1727096217.68185: variable 'ansible_facts' from source: unknown 15500 1727096217.68289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096217.68650: variable 'ansible_facts' from source: unknown 15500 1727096217.68752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096217.68908: attempt loop complete, returning result 15500 1727096217.68924: _execute() done 15500 1727096217.68931: dumping result to json 15500 1727096217.68972: done dumping result, returning 15500 1727096217.68986: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-877d-2da0-00000000024e] 15500 1727096217.68995: sending task result for task 0afff68d-5257-877d-2da0-00000000024e ok: [managed_node1] 15500 1727096217.70218: no more pending results, returning what we have 15500 1727096217.70222: results queue empty 15500 1727096217.70223: checking for any_errors_fatal 15500 1727096217.70225: done checking for any_errors_fatal 15500 1727096217.70226: checking for max_fail_percentage 15500 1727096217.70227: done checking for max_fail_percentage 15500 1727096217.70228: checking to see if all hosts have failed and the running result is not ok 15500 1727096217.70229: done checking to see if all hosts have failed 15500 1727096217.70230: getting the remaining hosts for this loop 15500 
1727096217.70233: done getting the remaining hosts for this loop 15500 1727096217.70238: getting the next task for host managed_node1 15500 1727096217.70244: done getting next task for host managed_node1 15500 1727096217.70246: ^ task is: TASK: meta (flush_handlers) 15500 1727096217.70248: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096217.70252: getting variables 15500 1727096217.70253: in VariableManager get_vars() 15500 1727096217.70571: done sending task result for task 0afff68d-5257-877d-2da0-00000000024e 15500 1727096217.70575: WORKER PROCESS EXITING 15500 1727096217.70591: Calling all_inventory to load vars for managed_node1 15500 1727096217.70608: Calling groups_inventory to load vars for managed_node1 15500 1727096217.70611: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096217.70622: Calling all_plugins_play to load vars for managed_node1 15500 1727096217.70624: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096217.70628: Calling groups_plugins_play to load vars for managed_node1 15500 1727096217.73798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096217.76028: done with get_vars() 15500 1727096217.76051: done getting variables 15500 1727096217.76121: in VariableManager get_vars() 15500 1727096217.76136: Calling all_inventory to load vars for managed_node1 15500 1727096217.76138: Calling groups_inventory to load vars for managed_node1 15500 1727096217.76140: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096217.76146: Calling all_plugins_play to load vars for managed_node1 15500 1727096217.76148: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096217.76151: Calling groups_plugins_play to load vars for managed_node1 15500 1727096217.77345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096217.80524: done with get_vars() 15500 1727096217.80556: done queuing things up, now waiting for results queue to drain 15500 1727096217.80559: results queue empty 15500 1727096217.80559: checking for any_errors_fatal 15500 1727096217.80564: done checking for any_errors_fatal 15500 1727096217.80564: checking for max_fail_percentage 15500 1727096217.80565: done checking for max_fail_percentage 15500 1727096217.80566: checking to see if all hosts have failed and the running result is not ok 15500 1727096217.80573: done checking to see if all hosts have failed 15500 1727096217.80574: getting the remaining hosts for this loop 15500 1727096217.80574: done getting the remaining hosts for this loop 15500 1727096217.80578: getting the next task for host managed_node1 15500 1727096217.80582: done getting next task for host managed_node1 15500 1727096217.80586: ^ task is: TASK: Include the task '{{ task }}' 15500 1727096217.80588: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096217.80590: getting variables 15500 1727096217.80591: in VariableManager get_vars() 15500 1727096217.80607: Calling all_inventory to load vars for managed_node1 15500 1727096217.80609: Calling groups_inventory to load vars for managed_node1 15500 1727096217.80612: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096217.80619: Calling all_plugins_play to load vars for managed_node1 15500 1727096217.80621: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096217.80624: Calling groups_plugins_play to load vars for managed_node1 15500 1727096217.82070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096217.83673: done with get_vars() 15500 1727096217.83691: done getting variables 15500 1727096217.83820: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_profile_present.yml'] ********************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Monday 23 September 2024 08:56:57 -0400 (0:00:01.015) 0:00:17.881 ****** 15500 1727096217.83844: entering _queue_task() for managed_node1/include_tasks 15500 1727096217.84123: worker is 1 (out of 1 available) 15500 1727096217.84137: exiting _queue_task() for managed_node1/include_tasks 15500 1727096217.84148: done queuing things up, now waiting for results queue to drain 15500 1727096217.84149: waiting for pending results... 15500 1727096217.84359: running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_profile_present.yml' 15500 1727096217.84476: in run() - task 0afff68d-5257-877d-2da0-000000000031 15500 1727096217.84516: variable 'ansible_search_path' from source: unknown 15500 1727096217.84563: calling self._execute() 15500 1727096217.84621: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096217.84625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096217.84636: variable 'omit' from source: magic vars 15500 1727096217.84992: variable 'ansible_distribution_major_version' from source: facts 15500 1727096217.84996: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096217.85006: variable 'task' from source: play vars 15500 1727096217.85272: variable 'task' from source: play vars 15500 1727096217.85275: _execute() done 15500 1727096217.85278: dumping result to json 15500 1727096217.85280: done dumping result, returning 15500 1727096217.85283: done running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_profile_present.yml' [0afff68d-5257-877d-2da0-000000000031] 15500 1727096217.85285: sending task result for task 0afff68d-5257-877d-2da0-000000000031 15500 1727096217.85356: done sending task result for task 0afff68d-5257-877d-2da0-000000000031 15500 1727096217.85359: WORKER PROCESS EXITING 15500 1727096217.85419: no more pending results, returning what we have 15500 1727096217.85427: in VariableManager get_vars() 15500 1727096217.85477: Calling all_inventory to load vars for managed_node1 15500 1727096217.85480: Calling groups_inventory to load vars for managed_node1 15500 1727096217.85483: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096217.85500: Calling all_plugins_play to load vars for managed_node1 15500 1727096217.85503: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096217.85507: Calling groups_plugins_play to load vars for 
managed_node1 15500 1727096217.86940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096217.88476: done with get_vars() 15500 1727096217.88500: variable 'ansible_search_path' from source: unknown 15500 1727096217.88513: we have included files to process 15500 1727096217.88514: generating all_blocks data 15500 1727096217.88515: done generating all_blocks data 15500 1727096217.88515: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15500 1727096217.88516: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15500 1727096217.88518: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15500 1727096217.88705: in VariableManager get_vars() 15500 1727096217.88721: done with get_vars() 15500 1727096217.88972: done processing included file 15500 1727096217.88974: iterating over new_blocks loaded from include file 15500 1727096217.88975: in VariableManager get_vars() 15500 1727096217.88987: done with get_vars() 15500 1727096217.88989: filtering new block on tags 15500 1727096217.89008: done filtering new block on tags 15500 1727096217.89010: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 15500 1727096217.89016: extending task lists for all hosts with included blocks 15500 1727096217.89048: done extending task lists 15500 1727096217.89049: done processing included files 15500 1727096217.89050: results queue empty 15500 1727096217.89050: checking for any_errors_fatal 15500 1727096217.89052: done checking for any_errors_fatal 15500 1727096217.89053: checking for max_fail_percentage 15500 1727096217.89054: done checking for max_fail_percentage 15500 1727096217.89054: checking to see if all hosts have failed and the running result is not ok 15500 1727096217.89055: done checking to see if all hosts have failed 15500 1727096217.89056: getting the remaining hosts for this loop 15500 1727096217.89057: done getting the remaining hosts for this loop 15500 1727096217.89059: getting the next task for host managed_node1 15500 1727096217.89063: done getting next task for host managed_node1 15500 1727096217.89065: ^ task is: TASK: Include the task 'get_profile_stat.yml' 15500 1727096217.89069: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096217.89071: getting variables 15500 1727096217.89072: in VariableManager get_vars() 15500 1727096217.89080: Calling all_inventory to load vars for managed_node1 15500 1727096217.89082: Calling groups_inventory to load vars for managed_node1 15500 1727096217.89084: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096217.89089: Calling all_plugins_play to load vars for managed_node1 15500 1727096217.89092: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096217.89095: Calling groups_plugins_play to load vars for managed_node1 15500 1727096217.90385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096217.92000: done with get_vars() 15500 1727096217.92031: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Monday 23 September 2024 08:56:57 -0400 (0:00:00.082) 0:00:17.964 ****** 15500 1727096217.92127: entering _queue_task() for managed_node1/include_tasks 15500 1727096217.92526: worker is 1 (out of 1 available) 15500 1727096217.92539: exiting _queue_task() for managed_node1/include_tasks 15500 1727096217.92551: done queuing things up, now waiting for results queue to drain 15500 1727096217.92552: waiting for pending results... 15500 1727096217.92985: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 15500 1727096217.92989: in run() - task 0afff68d-5257-877d-2da0-00000000025f 15500 1727096217.92992: variable 'ansible_search_path' from source: unknown 15500 1727096217.92995: variable 'ansible_search_path' from source: unknown 15500 1727096217.93034: calling self._execute() 15500 1727096217.93130: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096217.93142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096217.93157: variable 'omit' from source: magic vars 15500 1727096217.93586: variable 'ansible_distribution_major_version' from source: facts 15500 1727096217.93604: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096217.93615: _execute() done 15500 1727096217.93623: dumping result to json 15500 1727096217.93630: done dumping result, returning 15500 1727096217.93640: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0afff68d-5257-877d-2da0-00000000025f] 15500 1727096217.93650: sending task result for task 0afff68d-5257-877d-2da0-00000000025f 15500 1727096217.93789: no more pending results, returning what we have 15500 1727096217.93795: in VariableManager get_vars() 15500 1727096217.93830: Calling all_inventory to load vars for managed_node1 15500 1727096217.93835: Calling groups_inventory to load vars for managed_node1 15500 1727096217.93838: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096217.93853: Calling all_plugins_play to load vars for managed_node1 15500 1727096217.93859: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096217.93863: Calling groups_plugins_play to load vars for managed_node1 15500 1727096217.94581: done sending task result for task 0afff68d-5257-877d-2da0-00000000025f 15500 1727096217.94584: WORKER PROCESS EXITING 15500 1727096217.95504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 15500 1727096217.97230: done with get_vars() 15500 1727096217.97249: variable 'ansible_search_path' from source: unknown 15500 1727096217.97251: variable 'ansible_search_path' from source: unknown 15500 1727096217.97263: variable 'task' from source: play vars 15500 1727096217.97389: variable 'task' from source: play vars 15500 1727096217.97438: we have included files to process 15500 1727096217.97439: generating all_blocks data 15500 1727096217.97441: done generating all_blocks data 15500 1727096217.97442: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15500 1727096217.97443: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15500 1727096217.97446: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15500 1727096217.98619: done processing included file 15500 1727096217.98621: iterating over new_blocks loaded from include file 15500 1727096217.98622: in VariableManager get_vars() 15500 1727096217.98640: done with get_vars() 15500 1727096217.98642: filtering new block on tags 15500 1727096217.98666: done filtering new block on tags 15500 1727096217.98673: in VariableManager get_vars() 15500 1727096217.98688: done with get_vars() 15500 1727096217.98690: filtering new block on tags 15500 1727096217.98712: done filtering new block on tags 15500 1727096217.98714: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 15500 1727096217.98724: extending task lists for all hosts with included blocks 15500 1727096217.98918: done extending task lists 15500 1727096217.98920: done processing included files 15500 1727096217.98920: results queue empty 15500 1727096217.98921: checking for any_errors_fatal 15500 1727096217.98925: done checking for any_errors_fatal 15500 1727096217.98926: checking for max_fail_percentage 15500 1727096217.98927: done checking for max_fail_percentage 15500 1727096217.98927: checking to see if all hosts have failed and the running result is not ok 15500 1727096217.98928: done checking to see if all hosts have failed 15500 1727096217.98929: getting the remaining hosts for this loop 15500 1727096217.98930: done getting the remaining hosts for this loop 15500 1727096217.98933: getting the next task for host managed_node1 15500 1727096217.98942: done getting next task for host managed_node1 15500 1727096217.98945: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 15500 1727096217.98951: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096217.98954: getting variables 15500 1727096217.98955: in VariableManager get_vars() 15500 1727096218.02959: Calling all_inventory to load vars for managed_node1 15500 1727096218.02962: Calling groups_inventory to load vars for managed_node1 15500 1727096218.02965: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096218.02973: Calling all_plugins_play to load vars for managed_node1 15500 1727096218.02975: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096218.02978: Calling groups_plugins_play to load vars for managed_node1 15500 1727096218.04078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096218.05607: done with get_vars() 15500 1727096218.05632: done getting variables 15500 1727096218.05684: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Monday 23 September 2024 08:56:58 -0400 (0:00:00.135) 0:00:18.100 ****** 15500 1727096218.05711: entering _queue_task() for managed_node1/set_fact 15500 1727096218.06295: worker is 1 (out of 1 available) 15500 1727096218.06302: exiting _queue_task() for managed_node1/set_fact 15500 1727096218.06314: done queuing things up, now waiting for results queue to drain 15500 1727096218.06316: waiting for pending results... 
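
For context, the set_fact task queued above (task path get_profile_stat.yml:3) simply seeds three flags that later assertions consume. The block below is a reconstructed sketch only, inferred from the task name and from the ansible_facts this run reports a few entries further down (all three flags false); it is not quoted from the collection's source file.

# Sketch only -- inferred from the logged task name and its reported ansible_facts,
# not copied from the real get_profile_stat.yml.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
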
15500 1727096218.06371: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 15500 1727096218.06517: in run() - task 0afff68d-5257-877d-2da0-00000000026c 15500 1727096218.06541: variable 'ansible_search_path' from source: unknown 15500 1727096218.06549: variable 'ansible_search_path' from source: unknown 15500 1727096218.06592: calling self._execute() 15500 1727096218.06692: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096218.06705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096218.06720: variable 'omit' from source: magic vars 15500 1727096218.07105: variable 'ansible_distribution_major_version' from source: facts 15500 1727096218.07121: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096218.07130: variable 'omit' from source: magic vars 15500 1727096218.07182: variable 'omit' from source: magic vars 15500 1727096218.07227: variable 'omit' from source: magic vars 15500 1727096218.07274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096218.07377: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096218.07381: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096218.07385: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096218.07389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096218.07430: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096218.07441: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096218.07451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096218.07573: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096218.07586: Set connection var ansible_pipelining to False 15500 1727096218.07597: Set connection var ansible_timeout to 10 15500 1727096218.07605: Set connection var ansible_shell_type to sh 15500 1727096218.07625: Set connection var ansible_shell_executable to /bin/sh 15500 1727096218.07628: Set connection var ansible_connection to ssh 15500 1727096218.07735: variable 'ansible_shell_executable' from source: unknown 15500 1727096218.07738: variable 'ansible_connection' from source: unknown 15500 1727096218.07741: variable 'ansible_module_compression' from source: unknown 15500 1727096218.07744: variable 'ansible_shell_type' from source: unknown 15500 1727096218.07746: variable 'ansible_shell_executable' from source: unknown 15500 1727096218.07748: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096218.07751: variable 'ansible_pipelining' from source: unknown 15500 1727096218.07753: variable 'ansible_timeout' from source: unknown 15500 1727096218.07755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096218.07866: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096218.07885: variable 
'omit' from source: magic vars 15500 1727096218.07896: starting attempt loop 15500 1727096218.07902: running the handler 15500 1727096218.07922: handler run complete 15500 1727096218.07937: attempt loop complete, returning result 15500 1727096218.07947: _execute() done 15500 1727096218.07959: dumping result to json 15500 1727096218.07970: done dumping result, returning 15500 1727096218.08065: done running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0afff68d-5257-877d-2da0-00000000026c] 15500 1727096218.08069: sending task result for task 0afff68d-5257-877d-2da0-00000000026c 15500 1727096218.08137: done sending task result for task 0afff68d-5257-877d-2da0-00000000026c 15500 1727096218.08140: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 15500 1727096218.08225: no more pending results, returning what we have 15500 1727096218.08228: results queue empty 15500 1727096218.08229: checking for any_errors_fatal 15500 1727096218.08230: done checking for any_errors_fatal 15500 1727096218.08231: checking for max_fail_percentage 15500 1727096218.08233: done checking for max_fail_percentage 15500 1727096218.08234: checking to see if all hosts have failed and the running result is not ok 15500 1727096218.08235: done checking to see if all hosts have failed 15500 1727096218.08236: getting the remaining hosts for this loop 15500 1727096218.08237: done getting the remaining hosts for this loop 15500 1727096218.08241: getting the next task for host managed_node1 15500 1727096218.08250: done getting next task for host managed_node1 15500 1727096218.08253: ^ task is: TASK: Stat profile file 15500 1727096218.08259: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096218.08265: getting variables 15500 1727096218.08266: in VariableManager get_vars() 15500 1727096218.08301: Calling all_inventory to load vars for managed_node1 15500 1727096218.08304: Calling groups_inventory to load vars for managed_node1 15500 1727096218.08307: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096218.08320: Calling all_plugins_play to load vars for managed_node1 15500 1727096218.08322: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096218.08325: Calling groups_plugins_play to load vars for managed_node1 15500 1727096218.10033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096218.11632: done with get_vars() 15500 1727096218.11659: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Monday 23 September 2024 08:56:58 -0400 (0:00:00.060) 0:00:18.160 ****** 15500 1727096218.11754: entering _queue_task() for managed_node1/stat 15500 1727096218.12102: worker is 1 (out of 1 available) 15500 1727096218.12114: exiting _queue_task() for managed_node1/stat 15500 1727096218.12127: done queuing things up, now waiting for results queue to drain 15500 1727096218.12129: waiting for pending results... 15500 1727096218.12586: running TaskExecutor() for managed_node1/TASK: Stat profile file 15500 1727096218.12591: in run() - task 0afff68d-5257-877d-2da0-00000000026d 15500 1727096218.12594: variable 'ansible_search_path' from source: unknown 15500 1727096218.12596: variable 'ansible_search_path' from source: unknown 15500 1727096218.12620: calling self._execute() 15500 1727096218.12718: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096218.12730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096218.12745: variable 'omit' from source: magic vars 15500 1727096218.13126: variable 'ansible_distribution_major_version' from source: facts 15500 1727096218.13146: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096218.13161: variable 'omit' from source: magic vars 15500 1727096218.13210: variable 'omit' from source: magic vars 15500 1727096218.13313: variable 'profile' from source: play vars 15500 1727096218.13324: variable 'interface' from source: set_fact 15500 1727096218.13396: variable 'interface' from source: set_fact 15500 1727096218.13419: variable 'omit' from source: magic vars 15500 1727096218.13469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096218.13510: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096218.13535: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096218.13585: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096218.13589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096218.13612: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096218.13621: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096218.13629: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096218.13736: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096218.13801: Set connection var ansible_pipelining to False 15500 1727096218.13804: Set connection var ansible_timeout to 10 15500 1727096218.13806: Set connection var ansible_shell_type to sh 15500 1727096218.13809: Set connection var ansible_shell_executable to /bin/sh 15500 1727096218.13811: Set connection var ansible_connection to ssh 15500 1727096218.13813: variable 'ansible_shell_executable' from source: unknown 15500 1727096218.13815: variable 'ansible_connection' from source: unknown 15500 1727096218.13817: variable 'ansible_module_compression' from source: unknown 15500 1727096218.13818: variable 'ansible_shell_type' from source: unknown 15500 1727096218.13820: variable 'ansible_shell_executable' from source: unknown 15500 1727096218.13822: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096218.13830: variable 'ansible_pipelining' from source: unknown 15500 1727096218.13838: variable 'ansible_timeout' from source: unknown 15500 1727096218.13846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096218.14050: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15500 1727096218.14070: variable 'omit' from source: magic vars 15500 1727096218.14128: starting attempt loop 15500 1727096218.14131: running the handler 15500 1727096218.14133: _low_level_execute_command(): starting 15500 1727096218.14135: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096218.14829: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096218.14842: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096218.14855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096218.14898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096218.14997: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096218.15016: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096218.15156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096218.16880: stdout chunk (state=3): >>>/root <<< 15500 1727096218.16976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096218.17006: stderr chunk 
(state=3): >>><<< 15500 1727096218.17010: stdout chunk (state=3): >>><<< 15500 1727096218.17031: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096218.17044: _low_level_execute_command(): starting 15500 1727096218.17054: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096218.170313-16269-197019726655055 `" && echo ansible-tmp-1727096218.170313-16269-197019726655055="` echo /root/.ansible/tmp/ansible-tmp-1727096218.170313-16269-197019726655055 `" ) && sleep 0' 15500 1727096218.17812: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 15500 1727096218.17816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096218.17854: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096218.17866: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096218.17973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096218.19922: stdout chunk (state=3): >>>ansible-tmp-1727096218.170313-16269-197019726655055=/root/.ansible/tmp/ansible-tmp-1727096218.170313-16269-197019726655055 <<< 15500 1727096218.20466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096218.20472: stdout chunk (state=3): >>><<< 15500 1727096218.20475: stderr chunk (state=3): 
>>><<< 15500 1727096218.20477: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096218.170313-16269-197019726655055=/root/.ansible/tmp/ansible-tmp-1727096218.170313-16269-197019726655055 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096218.20480: variable 'ansible_module_compression' from source: unknown 15500 1727096218.20482: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15500 1727096218.20484: variable 'ansible_facts' from source: unknown 15500 1727096218.20549: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096218.170313-16269-197019726655055/AnsiballZ_stat.py 15500 1727096218.20724: Sending initial data 15500 1727096218.20733: Sent initial data (152 bytes) 15500 1727096218.21327: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096218.21341: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096218.21371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096218.21480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096218.21493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096218.21526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096218.21616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096218.23225: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096218.23289: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15500 1727096218.23354: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpc8bj7oam /root/.ansible/tmp/ansible-tmp-1727096218.170313-16269-197019726655055/AnsiballZ_stat.py <<< 15500 1727096218.23362: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096218.170313-16269-197019726655055/AnsiballZ_stat.py" <<< 15500 1727096218.23418: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpc8bj7oam" to remote "/root/.ansible/tmp/ansible-tmp-1727096218.170313-16269-197019726655055/AnsiballZ_stat.py" <<< 15500 1727096218.23424: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096218.170313-16269-197019726655055/AnsiballZ_stat.py" <<< 15500 1727096218.24039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096218.24084: stderr chunk (state=3): >>><<< 15500 1727096218.24088: stdout chunk (state=3): >>><<< 15500 1727096218.24110: done transferring module to remote 15500 1727096218.24122: _low_level_execute_command(): starting 15500 1727096218.24126: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096218.170313-16269-197019726655055/ /root/.ansible/tmp/ansible-tmp-1727096218.170313-16269-197019726655055/AnsiballZ_stat.py && sleep 0' 15500 1727096218.24536: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096218.24540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096218.24564: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 15500 1727096218.24571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096218.24575: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096218.24583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096218.24642: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096218.24647: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096218.24649: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096218.24716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096218.26680: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096218.26684: stdout chunk (state=3): >>><<< 15500 1727096218.26686: stderr chunk (state=3): >>><<< 15500 1727096218.26716: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096218.26852: _low_level_execute_command(): starting 15500 1727096218.26855: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096218.170313-16269-197019726655055/AnsiballZ_stat.py && sleep 0' 15500 1727096218.27420: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096218.27440: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096218.27463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096218.27489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096218.27511: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096218.27524: stderr chunk (state=3): >>>debug2: match not found <<< 15500 1727096218.27588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096218.27643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096218.27670: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096218.27720: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096218.28002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096218.43295: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15500 1727096218.44943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 15500 1727096218.44947: stdout chunk (state=3): >>><<< 15500 1727096218.44949: stderr chunk (state=3): >>><<< 15500 1727096218.45111: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
15500 1727096218.45115: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096218.170313-16269-197019726655055/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096218.45118: _low_level_execute_command(): starting 15500 1727096218.45120: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096218.170313-16269-197019726655055/ > /dev/null 2>&1 && sleep 0' 15500 1727096218.45744: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096218.45772: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096218.45791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096218.45809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096218.45883: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096218.45927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096218.45950: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096218.45972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096218.46077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096218.48066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096218.48073: stdout chunk (state=3): >>><<< 15500 1727096218.48076: stderr chunk (state=3): >>><<< 15500 1727096218.48285: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096218.48289: handler run complete 15500 1727096218.48292: attempt loop complete, returning result 15500 1727096218.48294: _execute() done 15500 1727096218.48297: dumping result to json 15500 1727096218.48299: done dumping result, returning 15500 1727096218.48301: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0afff68d-5257-877d-2da0-00000000026d] 15500 1727096218.48302: sending task result for task 0afff68d-5257-877d-2da0-00000000026d 15500 1727096218.48379: done sending task result for task 0afff68d-5257-877d-2da0-00000000026d 15500 1727096218.48383: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 15500 1727096218.48450: no more pending results, returning what we have 15500 1727096218.48454: results queue empty 15500 1727096218.48455: checking for any_errors_fatal 15500 1727096218.48465: done checking for any_errors_fatal 15500 1727096218.48466: checking for max_fail_percentage 15500 1727096218.48473: done checking for max_fail_percentage 15500 1727096218.48474: checking to see if all hosts have failed and the running result is not ok 15500 1727096218.48475: done checking to see if all hosts have failed 15500 1727096218.48476: getting the remaining hosts for this loop 15500 1727096218.48478: done getting the remaining hosts for this loop 15500 1727096218.48482: getting the next task for host managed_node1 15500 1727096218.48489: done getting next task for host managed_node1 15500 1727096218.48494: ^ task is: TASK: Set NM profile exist flag based on the profile files 15500 1727096218.48499: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096218.48503: getting variables 15500 1727096218.48504: in VariableManager get_vars() 15500 1727096218.48535: Calling all_inventory to load vars for managed_node1 15500 1727096218.48538: Calling groups_inventory to load vars for managed_node1 15500 1727096218.48542: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096218.48554: Calling all_plugins_play to load vars for managed_node1 15500 1727096218.48559: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096218.48562: Calling groups_plugins_play to load vars for managed_node1 15500 1727096218.50622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096218.52238: done with get_vars() 15500 1727096218.52271: done getting variables 15500 1727096218.52332: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Monday 23 September 2024 08:56:58 -0400 (0:00:00.406) 0:00:18.566 ****** 15500 1727096218.52370: entering _queue_task() for managed_node1/set_fact 15500 1727096218.52813: worker is 1 (out of 1 available) 15500 1727096218.52826: exiting _queue_task() for managed_node1/set_fact 15500 1727096218.52837: done queuing things up, now waiting for results queue to drain 15500 1727096218.52839: waiting for pending results... 
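
For context, the set_fact task queued here (get_profile_stat.yml:17) is skipped a few entries below because its conditional profile_stat.stat.exists is False: the earlier stat of /etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31 found no file. A minimal sketch of such a task follows; only the when clause is confirmed by the logged false_condition, while the fact name and value are assumptions based on the task title.

# Minimal sketch; 'when' matches the logged false_condition, the fact it sets is an assumption.
- name: Set NM profile exist flag based on the profile files
  set_fact:
    lsr_net_profile_exists: true
  when: profile_stat.stat.exists
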
15500 1727096218.53235: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 15500 1727096218.53346: in run() - task 0afff68d-5257-877d-2da0-00000000026e 15500 1727096218.53352: variable 'ansible_search_path' from source: unknown 15500 1727096218.53355: variable 'ansible_search_path' from source: unknown 15500 1727096218.53361: calling self._execute() 15500 1727096218.53427: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096218.53439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096218.53459: variable 'omit' from source: magic vars 15500 1727096218.53836: variable 'ansible_distribution_major_version' from source: facts 15500 1727096218.53852: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096218.53982: variable 'profile_stat' from source: set_fact 15500 1727096218.54005: Evaluated conditional (profile_stat.stat.exists): False 15500 1727096218.54012: when evaluation is False, skipping this task 15500 1727096218.54020: _execute() done 15500 1727096218.54109: dumping result to json 15500 1727096218.54112: done dumping result, returning 15500 1727096218.54115: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0afff68d-5257-877d-2da0-00000000026e] 15500 1727096218.54117: sending task result for task 0afff68d-5257-877d-2da0-00000000026e 15500 1727096218.54186: done sending task result for task 0afff68d-5257-877d-2da0-00000000026e 15500 1727096218.54190: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15500 1727096218.54234: no more pending results, returning what we have 15500 1727096218.54238: results queue empty 15500 1727096218.54239: checking for any_errors_fatal 15500 1727096218.54248: done checking for any_errors_fatal 15500 1727096218.54249: checking for max_fail_percentage 15500 1727096218.54251: done checking for max_fail_percentage 15500 1727096218.54251: checking to see if all hosts have failed and the running result is not ok 15500 1727096218.54252: done checking to see if all hosts have failed 15500 1727096218.54253: getting the remaining hosts for this loop 15500 1727096218.54254: done getting the remaining hosts for this loop 15500 1727096218.54257: getting the next task for host managed_node1 15500 1727096218.54266: done getting next task for host managed_node1 15500 1727096218.54269: ^ task is: TASK: Get NM profile info 15500 1727096218.54273: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096218.54276: getting variables 15500 1727096218.54278: in VariableManager get_vars() 15500 1727096218.54306: Calling all_inventory to load vars for managed_node1 15500 1727096218.54308: Calling groups_inventory to load vars for managed_node1 15500 1727096218.54312: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096218.54323: Calling all_plugins_play to load vars for managed_node1 15500 1727096218.54326: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096218.54328: Calling groups_plugins_play to load vars for managed_node1 15500 1727096218.55959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096218.57621: done with get_vars() 15500 1727096218.57647: done getting variables 15500 1727096218.57753: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Monday 23 September 2024 08:56:58 -0400 (0:00:00.054) 0:00:18.621 ****** 15500 1727096218.57792: entering _queue_task() for managed_node1/shell 15500 1727096218.57799: Creating lock for shell 15500 1727096218.58173: worker is 1 (out of 1 available) 15500 1727096218.58187: exiting _queue_task() for managed_node1/shell 15500 1727096218.58201: done queuing things up, now waiting for results queue to drain 15500 1727096218.58203: waiting for pending results... 
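
For context, the 'Get NM profile info' task queued here (get_profile_stat.yml:25) uses the shell action, which the log below shows delegating to the command action plugin and templating the profile play variable. The exact command line is not visible in this excerpt; the following is a purely illustrative sketch, assuming it greps the NetworkManager connection list for the profile name, with a hypothetical register variable.

# Illustrative sketch only; the real command does not appear in this excerpt.
- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show | grep "{{ profile }}"
  register: nm_profile_exists  # hypothetical variable name
  ignore_errors: true
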
15500 1727096218.58485: running TaskExecutor() for managed_node1/TASK: Get NM profile info 15500 1727096218.58593: in run() - task 0afff68d-5257-877d-2da0-00000000026f 15500 1727096218.58614: variable 'ansible_search_path' from source: unknown 15500 1727096218.58773: variable 'ansible_search_path' from source: unknown 15500 1727096218.58777: calling self._execute() 15500 1727096218.58780: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096218.58784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096218.58787: variable 'omit' from source: magic vars 15500 1727096218.59142: variable 'ansible_distribution_major_version' from source: facts 15500 1727096218.59156: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096218.59166: variable 'omit' from source: magic vars 15500 1727096218.59216: variable 'omit' from source: magic vars 15500 1727096218.59318: variable 'profile' from source: play vars 15500 1727096218.59333: variable 'interface' from source: set_fact 15500 1727096218.59397: variable 'interface' from source: set_fact 15500 1727096218.59417: variable 'omit' from source: magic vars 15500 1727096218.59462: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096218.59502: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096218.59528: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096218.59555: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096218.59576: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096218.59611: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096218.59659: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096218.59662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096218.59739: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096218.59750: Set connection var ansible_pipelining to False 15500 1727096218.59765: Set connection var ansible_timeout to 10 15500 1727096218.59776: Set connection var ansible_shell_type to sh 15500 1727096218.59788: Set connection var ansible_shell_executable to /bin/sh 15500 1727096218.59799: Set connection var ansible_connection to ssh 15500 1727096218.59874: variable 'ansible_shell_executable' from source: unknown 15500 1727096218.59877: variable 'ansible_connection' from source: unknown 15500 1727096218.59880: variable 'ansible_module_compression' from source: unknown 15500 1727096218.59882: variable 'ansible_shell_type' from source: unknown 15500 1727096218.59884: variable 'ansible_shell_executable' from source: unknown 15500 1727096218.59885: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096218.59887: variable 'ansible_pipelining' from source: unknown 15500 1727096218.59890: variable 'ansible_timeout' from source: unknown 15500 1727096218.59893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096218.60018: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096218.60034: variable 'omit' from source: magic vars 15500 1727096218.60045: starting attempt loop 15500 1727096218.60052: running the handler 15500 1727096218.60096: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096218.60100: _low_level_execute_command(): starting 15500 1727096218.60110: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096218.60821: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096218.60837: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096218.60851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096218.60875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096218.60900: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096218.60984: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096218.61009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096218.61024: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096218.61051: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096218.61159: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096218.62870: stdout chunk (state=3): >>>/root <<< 15500 1727096218.62983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096218.63014: stderr chunk (state=3): >>><<< 15500 1727096218.63027: stdout chunk (state=3): >>><<< 15500 1727096218.63074: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096218.63084: _low_level_execute_command(): starting 15500 1727096218.63163: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096218.6305804-16295-58172774534549 `" && echo ansible-tmp-1727096218.6305804-16295-58172774534549="` echo /root/.ansible/tmp/ansible-tmp-1727096218.6305804-16295-58172774534549 `" ) && sleep 0' 15500 1727096218.63699: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096218.63714: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096218.63733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096218.63783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096218.63835: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096218.63847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096218.63872: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096218.63970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096218.65902: stdout chunk (state=3): >>>ansible-tmp-1727096218.6305804-16295-58172774534549=/root/.ansible/tmp/ansible-tmp-1727096218.6305804-16295-58172774534549 <<< 15500 1727096218.66007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096218.66042: stderr chunk (state=3): >>><<< 15500 1727096218.66062: stdout chunk (state=3): >>><<< 15500 1727096218.66273: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096218.6305804-16295-58172774534549=/root/.ansible/tmp/ansible-tmp-1727096218.6305804-16295-58172774534549 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096218.66277: variable 'ansible_module_compression' from source: unknown 15500 1727096218.66280: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15500 1727096218.66282: variable 'ansible_facts' from source: unknown 15500 1727096218.66313: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096218.6305804-16295-58172774534549/AnsiballZ_command.py 15500 1727096218.66496: Sending initial data 15500 1727096218.66506: Sent initial data (155 bytes) 15500 1727096218.67081: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096218.67096: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096218.67111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096218.67129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096218.67184: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096218.67238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096218.67255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096218.67384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096218.67475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096218.69077: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15500 1727096218.69092: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 15500 1727096218.69105: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 15500 1727096218.69126: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 15500 1727096218.69144: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports 
extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096218.69229: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15500 1727096218.69296: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpflcm51jo /root/.ansible/tmp/ansible-tmp-1727096218.6305804-16295-58172774534549/AnsiballZ_command.py <<< 15500 1727096218.69307: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096218.6305804-16295-58172774534549/AnsiballZ_command.py" <<< 15500 1727096218.69400: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpflcm51jo" to remote "/root/.ansible/tmp/ansible-tmp-1727096218.6305804-16295-58172774534549/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096218.6305804-16295-58172774534549/AnsiballZ_command.py" <<< 15500 1727096218.70303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096218.70337: stderr chunk (state=3): >>><<< 15500 1727096218.70379: stdout chunk (state=3): >>><<< 15500 1727096218.70426: done transferring module to remote 15500 1727096218.70443: _low_level_execute_command(): starting 15500 1727096218.70466: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096218.6305804-16295-58172774534549/ /root/.ansible/tmp/ansible-tmp-1727096218.6305804-16295-58172774534549/AnsiballZ_command.py && sleep 0' 15500 1727096218.71136: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096218.71159: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096218.71242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096218.71256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096218.71305: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096218.71331: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096218.71440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096218.73439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096218.73449: stdout chunk (state=3): >>><<< 15500 
1727096218.73464: stderr chunk (state=3): >>><<< 15500 1727096218.73497: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096218.73586: _low_level_execute_command(): starting 15500 1727096218.73590: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096218.6305804-16295-58172774534549/AnsiballZ_command.py && sleep 0' 15500 1727096218.74187: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096218.74263: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096218.74309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096218.74328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096218.74375: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096218.74453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096218.91597: stdout chunk (state=3): >>> {"changed": true, "stdout": "LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-23 08:56:58.896562", "end": "2024-09-23 08:56:58.914150", "delta": "0:00:00.017588", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, 
"expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15500 1727096218.93570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 15500 1727096218.93574: stdout chunk (state=3): >>><<< 15500 1727096218.93576: stderr chunk (state=3): >>><<< 15500 1727096218.93579: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-23 08:56:58.896562", "end": "2024-09-23 08:56:58.914150", "delta": "0:00:00.017588", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
15500 1727096218.93582: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096218.6305804-16295-58172774534549/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096218.93589: _low_level_execute_command(): starting 15500 1727096218.93591: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096218.6305804-16295-58172774534549/ > /dev/null 2>&1 && sleep 0' 15500 1727096218.94734: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096218.94738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096218.94740: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15500 1727096218.94742: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096218.94745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096218.94882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096218.94916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096218.96991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096218.97022: stderr chunk (state=3): >>><<< 15500 1727096218.97305: stdout chunk (state=3): >>><<< 15500 1727096218.97308: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096218.97311: handler run complete 15500 1727096218.97313: Evaluated conditional (False): False 15500 1727096218.97315: attempt loop complete, returning result 15500 1727096218.97317: _execute() done 15500 1727096218.97321: dumping result to json 15500 1727096218.97323: done dumping result, returning 15500 1727096218.97326: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [0afff68d-5257-877d-2da0-00000000026f] 15500 1727096218.97328: sending task result for task 0afff68d-5257-877d-2da0-00000000026f 15500 1727096218.97843: done sending task result for task 0afff68d-5257-877d-2da0-00000000026f 15500 1727096218.97846: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "delta": "0:00:00.017588", "end": "2024-09-23 08:56:58.914150", "rc": 0, "start": "2024-09-23 08:56:58.896562" } STDOUT: LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection 15500 1727096218.97952: no more pending results, returning what we have 15500 1727096218.97956: results queue empty 15500 1727096218.97959: checking for any_errors_fatal 15500 1727096218.97972: done checking for any_errors_fatal 15500 1727096218.97973: checking for max_fail_percentage 15500 1727096218.97975: done checking for max_fail_percentage 15500 1727096218.97976: checking to see if all hosts have failed and the running result is not ok 15500 1727096218.97977: done checking to see if all hosts have failed 15500 1727096218.97978: getting the remaining hosts for this loop 15500 1727096218.97979: done getting the remaining hosts for this loop 15500 1727096218.97983: getting the next task for host managed_node1 15500 1727096218.97991: done getting next task for host managed_node1 15500 1727096218.97993: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15500 1727096218.97997: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096218.98002: getting variables 15500 1727096218.98003: in VariableManager get_vars() 15500 1727096218.98033: Calling all_inventory to load vars for managed_node1 15500 1727096218.98036: Calling groups_inventory to load vars for managed_node1 15500 1727096218.98040: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096218.98050: Calling all_plugins_play to load vars for managed_node1 15500 1727096218.98053: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096218.98056: Calling groups_plugins_play to load vars for managed_node1 15500 1727096219.00982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096219.04180: done with get_vars() 15500 1727096219.04207: done getting variables 15500 1727096219.04471: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Monday 23 September 2024 08:56:59 -0400 (0:00:00.467) 0:00:19.088 ****** 15500 1727096219.04504: entering _queue_task() for managed_node1/set_fact 15500 1727096219.04947: worker is 1 (out of 1 available) 15500 1727096219.04962: exiting _queue_task() for managed_node1/set_fact 15500 1727096219.04978: done queuing things up, now waiting for results queue to drain 15500 1727096219.04979: waiting for pending results... 
15500 1727096219.05887: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15500 1727096219.05892: in run() - task 0afff68d-5257-877d-2da0-000000000270 15500 1727096219.05896: variable 'ansible_search_path' from source: unknown 15500 1727096219.05898: variable 'ansible_search_path' from source: unknown 15500 1727096219.06004: calling self._execute() 15500 1727096219.06273: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096219.06277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096219.06280: variable 'omit' from source: magic vars 15500 1727096219.06995: variable 'ansible_distribution_major_version' from source: facts 15500 1727096219.07119: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096219.07413: variable 'nm_profile_exists' from source: set_fact 15500 1727096219.07417: Evaluated conditional (nm_profile_exists.rc == 0): True 15500 1727096219.07419: variable 'omit' from source: magic vars 15500 1727096219.07421: variable 'omit' from source: magic vars 15500 1727096219.07622: variable 'omit' from source: magic vars 15500 1727096219.07665: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096219.07775: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096219.07802: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096219.07887: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096219.07905: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096219.07982: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096219.08021: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096219.08037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096219.08247: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096219.08292: Set connection var ansible_pipelining to False 15500 1727096219.08304: Set connection var ansible_timeout to 10 15500 1727096219.08392: Set connection var ansible_shell_type to sh 15500 1727096219.08396: Set connection var ansible_shell_executable to /bin/sh 15500 1727096219.08398: Set connection var ansible_connection to ssh 15500 1727096219.08400: variable 'ansible_shell_executable' from source: unknown 15500 1727096219.08548: variable 'ansible_connection' from source: unknown 15500 1727096219.08551: variable 'ansible_module_compression' from source: unknown 15500 1727096219.08554: variable 'ansible_shell_type' from source: unknown 15500 1727096219.08556: variable 'ansible_shell_executable' from source: unknown 15500 1727096219.08558: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096219.08559: variable 'ansible_pipelining' from source: unknown 15500 1727096219.08561: variable 'ansible_timeout' from source: unknown 15500 1727096219.08564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096219.08828: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096219.08888: variable 'omit' from source: magic vars 15500 1727096219.08900: starting attempt loop 15500 1727096219.08907: running the handler 15500 1727096219.08972: handler run complete 15500 1727096219.08983: attempt loop complete, returning result 15500 1727096219.08991: _execute() done 15500 1727096219.08998: dumping result to json 15500 1727096219.09006: done dumping result, returning 15500 1727096219.09037: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0afff68d-5257-877d-2da0-000000000270] 15500 1727096219.09053: sending task result for task 0afff68d-5257-877d-2da0-000000000270 15500 1727096219.09386: done sending task result for task 0afff68d-5257-877d-2da0-000000000270 15500 1727096219.09389: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 15500 1727096219.09481: no more pending results, returning what we have 15500 1727096219.09484: results queue empty 15500 1727096219.09485: checking for any_errors_fatal 15500 1727096219.09496: done checking for any_errors_fatal 15500 1727096219.09497: checking for max_fail_percentage 15500 1727096219.09499: done checking for max_fail_percentage 15500 1727096219.09500: checking to see if all hosts have failed and the running result is not ok 15500 1727096219.09501: done checking to see if all hosts have failed 15500 1727096219.09502: getting the remaining hosts for this loop 15500 1727096219.09504: done getting the remaining hosts for this loop 15500 1727096219.09508: getting the next task for host managed_node1 15500 1727096219.09519: done getting next task for host managed_node1 15500 1727096219.09523: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 15500 1727096219.09528: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096219.09532: getting variables 15500 1727096219.09534: in VariableManager get_vars() 15500 1727096219.09566: Calling all_inventory to load vars for managed_node1 15500 1727096219.09788: Calling groups_inventory to load vars for managed_node1 15500 1727096219.09793: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096219.09803: Calling all_plugins_play to load vars for managed_node1 15500 1727096219.09806: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096219.09810: Calling groups_plugins_play to load vars for managed_node1 15500 1727096219.11910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096219.14078: done with get_vars() 15500 1727096219.14111: done getting variables 15500 1727096219.14277: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15500 1727096219.14459: variable 'profile' from source: play vars 15500 1727096219.14463: variable 'interface' from source: set_fact 15500 1727096219.14526: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-LSR-TST-br31] ******************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Monday 23 September 2024 08:56:59 -0400 (0:00:00.100) 0:00:19.188 ****** 15500 1727096219.14564: entering _queue_task() for managed_node1/command 15500 1727096219.14925: worker is 1 (out of 1 available) 15500 1727096219.14937: exiting _queue_task() for managed_node1/command 15500 1727096219.14950: done queuing things up, now waiting for results queue to drain 15500 1727096219.14952: waiting for pending results... 
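
The set_fact task above ('Set NM profile exist flag and ansible_managed flag true based on the nmcli output') set lsr_net_profile_exists, lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint to true, gated on the two conditionals the log shows evaluating to True: ansible_distribution_major_version != '6' and nm_profile_exists.rc == 0. A minimal sketch consistent with that behaviour follows, before the 'Get the ansible_managed comment' task queued just above starts; it is illustrative only, not the verbatim task at get_profile_stat.yml:35, and whether the distribution check belongs to the task or to an enclosing block is not visible in this excerpt.

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0              # shown evaluating to True in the log above
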
15500 1727096219.15216: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 15500 1727096219.15421: in run() - task 0afff68d-5257-877d-2da0-000000000272 15500 1727096219.15426: variable 'ansible_search_path' from source: unknown 15500 1727096219.15428: variable 'ansible_search_path' from source: unknown 15500 1727096219.15431: calling self._execute() 15500 1727096219.15506: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096219.15527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096219.15546: variable 'omit' from source: magic vars 15500 1727096219.15964: variable 'ansible_distribution_major_version' from source: facts 15500 1727096219.15969: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096219.16106: variable 'profile_stat' from source: set_fact 15500 1727096219.16126: Evaluated conditional (profile_stat.stat.exists): False 15500 1727096219.16183: when evaluation is False, skipping this task 15500 1727096219.16186: _execute() done 15500 1727096219.16188: dumping result to json 15500 1727096219.16190: done dumping result, returning 15500 1727096219.16198: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 [0afff68d-5257-877d-2da0-000000000272] 15500 1727096219.16201: sending task result for task 0afff68d-5257-877d-2da0-000000000272 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15500 1727096219.16436: no more pending results, returning what we have 15500 1727096219.16440: results queue empty 15500 1727096219.16441: checking for any_errors_fatal 15500 1727096219.16450: done checking for any_errors_fatal 15500 1727096219.16451: checking for max_fail_percentage 15500 1727096219.16453: done checking for max_fail_percentage 15500 1727096219.16454: checking to see if all hosts have failed and the running result is not ok 15500 1727096219.16455: done checking to see if all hosts have failed 15500 1727096219.16456: getting the remaining hosts for this loop 15500 1727096219.16459: done getting the remaining hosts for this loop 15500 1727096219.16463: getting the next task for host managed_node1 15500 1727096219.16473: done getting next task for host managed_node1 15500 1727096219.16476: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 15500 1727096219.16480: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096219.16484: getting variables 15500 1727096219.16486: in VariableManager get_vars() 15500 1727096219.16745: Calling all_inventory to load vars for managed_node1 15500 1727096219.16749: Calling groups_inventory to load vars for managed_node1 15500 1727096219.16752: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096219.16771: Calling all_plugins_play to load vars for managed_node1 15500 1727096219.16774: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096219.16778: Calling groups_plugins_play to load vars for managed_node1 15500 1727096219.17383: done sending task result for task 0afff68d-5257-877d-2da0-000000000272 15500 1727096219.17387: WORKER PROCESS EXITING 15500 1727096219.18299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096219.20102: done with get_vars() 15500 1727096219.20126: done getting variables 15500 1727096219.20201: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15500 1727096219.20320: variable 'profile' from source: play vars 15500 1727096219.20324: variable 'interface' from source: set_fact 15500 1727096219.20384: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-LSR-TST-br31] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Monday 23 September 2024 08:56:59 -0400 (0:00:00.058) 0:00:19.247 ****** 15500 1727096219.20420: entering _queue_task() for managed_node1/set_fact 15500 1727096219.20794: worker is 1 (out of 1 available) 15500 1727096219.20807: exiting _queue_task() for managed_node1/set_fact 15500 1727096219.20819: done queuing things up, now waiting for results queue to drain 15500 1727096219.20820: waiting for pending results... 
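
The 'Get the ansible_managed comment in ifcfg-LSR-TST-br31' task above was skipped because the conditional profile_stat.stat.exists evaluated to False, and the remaining get/verify tasks in this excerpt skip on the same conditional. The guard pattern looks roughly like the sketch below; the stat that registers profile_stat happens before this excerpt, and both the command and the ifcfg path here are placeholders, since the actual task body is not visible in the log.

- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  ansible.builtin.command: cat /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # placeholder; real command not shown in this excerpt
  when: profile_stat.stat.exists               # evaluated False here, so the task reports "skipping"
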
15500 1727096219.21112: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 15500 1727096219.21245: in run() - task 0afff68d-5257-877d-2da0-000000000273 15500 1727096219.21264: variable 'ansible_search_path' from source: unknown 15500 1727096219.21273: variable 'ansible_search_path' from source: unknown 15500 1727096219.21319: calling self._execute() 15500 1727096219.21414: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096219.21425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096219.21441: variable 'omit' from source: magic vars 15500 1727096219.21834: variable 'ansible_distribution_major_version' from source: facts 15500 1727096219.21852: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096219.21986: variable 'profile_stat' from source: set_fact 15500 1727096219.22004: Evaluated conditional (profile_stat.stat.exists): False 15500 1727096219.22011: when evaluation is False, skipping this task 15500 1727096219.22017: _execute() done 15500 1727096219.22023: dumping result to json 15500 1727096219.22029: done dumping result, returning 15500 1727096219.22047: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 [0afff68d-5257-877d-2da0-000000000273] 15500 1727096219.22060: sending task result for task 0afff68d-5257-877d-2da0-000000000273 15500 1727096219.22223: done sending task result for task 0afff68d-5257-877d-2da0-000000000273 15500 1727096219.22226: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15500 1727096219.22309: no more pending results, returning what we have 15500 1727096219.22313: results queue empty 15500 1727096219.22314: checking for any_errors_fatal 15500 1727096219.22322: done checking for any_errors_fatal 15500 1727096219.22323: checking for max_fail_percentage 15500 1727096219.22325: done checking for max_fail_percentage 15500 1727096219.22326: checking to see if all hosts have failed and the running result is not ok 15500 1727096219.22327: done checking to see if all hosts have failed 15500 1727096219.22328: getting the remaining hosts for this loop 15500 1727096219.22330: done getting the remaining hosts for this loop 15500 1727096219.22333: getting the next task for host managed_node1 15500 1727096219.22341: done getting next task for host managed_node1 15500 1727096219.22344: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 15500 1727096219.22348: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096219.22353: getting variables 15500 1727096219.22355: in VariableManager get_vars() 15500 1727096219.22392: Calling all_inventory to load vars for managed_node1 15500 1727096219.22394: Calling groups_inventory to load vars for managed_node1 15500 1727096219.22399: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096219.22412: Calling all_plugins_play to load vars for managed_node1 15500 1727096219.22414: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096219.22417: Calling groups_plugins_play to load vars for managed_node1 15500 1727096219.24436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096219.27782: done with get_vars() 15500 1727096219.27815: done getting variables 15500 1727096219.28089: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15500 1727096219.28203: variable 'profile' from source: play vars 15500 1727096219.28207: variable 'interface' from source: set_fact 15500 1727096219.28469: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-LSR-TST-br31] *********************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Monday 23 September 2024 08:56:59 -0400 (0:00:00.080) 0:00:19.328 ****** 15500 1727096219.28502: entering _queue_task() for managed_node1/command 15500 1727096219.29064: worker is 1 (out of 1 available) 15500 1727096219.29279: exiting _queue_task() for managed_node1/command 15500 1727096219.29292: done queuing things up, now waiting for results queue to drain 15500 1727096219.29293: waiting for pending results... 
15500 1727096219.29683: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 15500 1727096219.29689: in run() - task 0afff68d-5257-877d-2da0-000000000274 15500 1727096219.29692: variable 'ansible_search_path' from source: unknown 15500 1727096219.29695: variable 'ansible_search_path' from source: unknown 15500 1727096219.30023: calling self._execute() 15500 1727096219.30248: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096219.30252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096219.30255: variable 'omit' from source: magic vars 15500 1727096219.31073: variable 'ansible_distribution_major_version' from source: facts 15500 1727096219.31077: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096219.31272: variable 'profile_stat' from source: set_fact 15500 1727096219.31286: Evaluated conditional (profile_stat.stat.exists): False 15500 1727096219.31410: when evaluation is False, skipping this task 15500 1727096219.31414: _execute() done 15500 1727096219.31416: dumping result to json 15500 1727096219.31419: done dumping result, returning 15500 1727096219.31428: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 [0afff68d-5257-877d-2da0-000000000274] 15500 1727096219.31442: sending task result for task 0afff68d-5257-877d-2da0-000000000274 15500 1727096219.31529: done sending task result for task 0afff68d-5257-877d-2da0-000000000274 15500 1727096219.31532: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15500 1727096219.31600: no more pending results, returning what we have 15500 1727096219.31604: results queue empty 15500 1727096219.31604: checking for any_errors_fatal 15500 1727096219.31614: done checking for any_errors_fatal 15500 1727096219.31615: checking for max_fail_percentage 15500 1727096219.31616: done checking for max_fail_percentage 15500 1727096219.31617: checking to see if all hosts have failed and the running result is not ok 15500 1727096219.31618: done checking to see if all hosts have failed 15500 1727096219.31619: getting the remaining hosts for this loop 15500 1727096219.31620: done getting the remaining hosts for this loop 15500 1727096219.31624: getting the next task for host managed_node1 15500 1727096219.31631: done getting next task for host managed_node1 15500 1727096219.31633: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 15500 1727096219.31637: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096219.31640: getting variables 15500 1727096219.31641: in VariableManager get_vars() 15500 1727096219.31674: Calling all_inventory to load vars for managed_node1 15500 1727096219.31676: Calling groups_inventory to load vars for managed_node1 15500 1727096219.31680: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096219.31692: Calling all_plugins_play to load vars for managed_node1 15500 1727096219.31694: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096219.31697: Calling groups_plugins_play to load vars for managed_node1 15500 1727096219.33838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096219.36072: done with get_vars() 15500 1727096219.36109: done getting variables 15500 1727096219.36322: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15500 1727096219.36637: variable 'profile' from source: play vars 15500 1727096219.36641: variable 'interface' from source: set_fact 15500 1727096219.36852: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-LSR-TST-br31] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Monday 23 September 2024 08:56:59 -0400 (0:00:00.083) 0:00:19.412 ****** 15500 1727096219.36890: entering _queue_task() for managed_node1/set_fact 15500 1727096219.37666: worker is 1 (out of 1 available) 15500 1727096219.37842: exiting _queue_task() for managed_node1/set_fact 15500 1727096219.37854: done queuing things up, now waiting for results queue to drain 15500 1727096219.37855: waiting for pending results... 
15500 1727096219.38292: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 15500 1727096219.38302: in run() - task 0afff68d-5257-877d-2da0-000000000275 15500 1727096219.38306: variable 'ansible_search_path' from source: unknown 15500 1727096219.38309: variable 'ansible_search_path' from source: unknown 15500 1727096219.38312: calling self._execute() 15500 1727096219.38315: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096219.38373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096219.38378: variable 'omit' from source: magic vars 15500 1727096219.39212: variable 'ansible_distribution_major_version' from source: facts 15500 1727096219.39216: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096219.39537: variable 'profile_stat' from source: set_fact 15500 1727096219.39554: Evaluated conditional (profile_stat.stat.exists): False 15500 1727096219.39560: when evaluation is False, skipping this task 15500 1727096219.39563: _execute() done 15500 1727096219.39566: dumping result to json 15500 1727096219.39571: done dumping result, returning 15500 1727096219.39682: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 [0afff68d-5257-877d-2da0-000000000275] 15500 1727096219.39689: sending task result for task 0afff68d-5257-877d-2da0-000000000275 15500 1727096219.39799: done sending task result for task 0afff68d-5257-877d-2da0-000000000275 15500 1727096219.39803: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15500 1727096219.39910: no more pending results, returning what we have 15500 1727096219.39915: results queue empty 15500 1727096219.39916: checking for any_errors_fatal 15500 1727096219.39926: done checking for any_errors_fatal 15500 1727096219.39927: checking for max_fail_percentage 15500 1727096219.39929: done checking for max_fail_percentage 15500 1727096219.39930: checking to see if all hosts have failed and the running result is not ok 15500 1727096219.39931: done checking to see if all hosts have failed 15500 1727096219.39932: getting the remaining hosts for this loop 15500 1727096219.39934: done getting the remaining hosts for this loop 15500 1727096219.39939: getting the next task for host managed_node1 15500 1727096219.39949: done getting next task for host managed_node1 15500 1727096219.39952: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 15500 1727096219.39956: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096219.39961: getting variables 15500 1727096219.39963: in VariableManager get_vars() 15500 1727096219.40188: Calling all_inventory to load vars for managed_node1 15500 1727096219.40192: Calling groups_inventory to load vars for managed_node1 15500 1727096219.40196: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096219.40211: Calling all_plugins_play to load vars for managed_node1 15500 1727096219.40271: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096219.40278: Calling groups_plugins_play to load vars for managed_node1 15500 1727096219.43190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096219.45110: done with get_vars() 15500 1727096219.45149: done getting variables 15500 1727096219.45388: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15500 1727096219.45616: variable 'profile' from source: play vars 15500 1727096219.45621: variable 'interface' from source: set_fact 15500 1727096219.45704: variable 'interface' from source: set_fact TASK [Assert that the profile is present - 'LSR-TST-br31'] ********************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Monday 23 September 2024 08:56:59 -0400 (0:00:00.088) 0:00:19.500 ****** 15500 1727096219.45737: entering _queue_task() for managed_node1/assert 15500 1727096219.46140: worker is 1 (out of 1 available) 15500 1727096219.46153: exiting _queue_task() for managed_node1/assert 15500 1727096219.46316: done queuing things up, now waiting for results queue to drain 15500 1727096219.46318: waiting for pending results... 
15500 1727096219.46542: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'LSR-TST-br31' 15500 1727096219.46549: in run() - task 0afff68d-5257-877d-2da0-000000000260 15500 1727096219.46712: variable 'ansible_search_path' from source: unknown 15500 1727096219.46716: variable 'ansible_search_path' from source: unknown 15500 1727096219.46755: calling self._execute() 15500 1727096219.46837: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096219.46840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096219.46856: variable 'omit' from source: magic vars 15500 1727096219.47151: variable 'ansible_distribution_major_version' from source: facts 15500 1727096219.47164: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096219.47173: variable 'omit' from source: magic vars 15500 1727096219.47202: variable 'omit' from source: magic vars 15500 1727096219.47280: variable 'profile' from source: play vars 15500 1727096219.47284: variable 'interface' from source: set_fact 15500 1727096219.47333: variable 'interface' from source: set_fact 15500 1727096219.47344: variable 'omit' from source: magic vars 15500 1727096219.47382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096219.47411: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096219.47428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096219.47444: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096219.47453: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096219.47483: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096219.47486: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096219.47489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096219.47560: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096219.47568: Set connection var ansible_pipelining to False 15500 1727096219.47574: Set connection var ansible_timeout to 10 15500 1727096219.47576: Set connection var ansible_shell_type to sh 15500 1727096219.47581: Set connection var ansible_shell_executable to /bin/sh 15500 1727096219.47586: Set connection var ansible_connection to ssh 15500 1727096219.47603: variable 'ansible_shell_executable' from source: unknown 15500 1727096219.47606: variable 'ansible_connection' from source: unknown 15500 1727096219.47611: variable 'ansible_module_compression' from source: unknown 15500 1727096219.47613: variable 'ansible_shell_type' from source: unknown 15500 1727096219.47616: variable 'ansible_shell_executable' from source: unknown 15500 1727096219.47619: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096219.47621: variable 'ansible_pipelining' from source: unknown 15500 1727096219.47623: variable 'ansible_timeout' from source: unknown 15500 1727096219.47625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096219.47731: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096219.47740: variable 'omit' from source: magic vars 15500 1727096219.47746: starting attempt loop 15500 1727096219.47748: running the handler 15500 1727096219.47833: variable 'lsr_net_profile_exists' from source: set_fact 15500 1727096219.47837: Evaluated conditional (lsr_net_profile_exists): True 15500 1727096219.47844: handler run complete 15500 1727096219.47857: attempt loop complete, returning result 15500 1727096219.47859: _execute() done 15500 1727096219.47862: dumping result to json 15500 1727096219.47865: done dumping result, returning 15500 1727096219.47876: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'LSR-TST-br31' [0afff68d-5257-877d-2da0-000000000260] 15500 1727096219.47881: sending task result for task 0afff68d-5257-877d-2da0-000000000260 15500 1727096219.47960: done sending task result for task 0afff68d-5257-877d-2da0-000000000260 15500 1727096219.47962: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15500 1727096219.48021: no more pending results, returning what we have 15500 1727096219.48024: results queue empty 15500 1727096219.48025: checking for any_errors_fatal 15500 1727096219.48032: done checking for any_errors_fatal 15500 1727096219.48033: checking for max_fail_percentage 15500 1727096219.48034: done checking for max_fail_percentage 15500 1727096219.48035: checking to see if all hosts have failed and the running result is not ok 15500 1727096219.48036: done checking to see if all hosts have failed 15500 1727096219.48037: getting the remaining hosts for this loop 15500 1727096219.48038: done getting the remaining hosts for this loop 15500 1727096219.48042: getting the next task for host managed_node1 15500 1727096219.48049: done getting next task for host managed_node1 15500 1727096219.48052: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 15500 1727096219.48055: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096219.48059: getting variables 15500 1727096219.48060: in VariableManager get_vars() 15500 1727096219.48092: Calling all_inventory to load vars for managed_node1 15500 1727096219.48095: Calling groups_inventory to load vars for managed_node1 15500 1727096219.48098: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096219.48109: Calling all_plugins_play to load vars for managed_node1 15500 1727096219.48111: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096219.48113: Calling groups_plugins_play to load vars for managed_node1 15500 1727096219.49908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096219.51298: done with get_vars() 15500 1727096219.51336: done getting variables 15500 1727096219.51422: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15500 1727096219.51563: variable 'profile' from source: play vars 15500 1727096219.51569: variable 'interface' from source: set_fact 15500 1727096219.51640: variable 'interface' from source: set_fact TASK [Assert that the ansible managed comment is present in 'LSR-TST-br31'] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Monday 23 September 2024 08:56:59 -0400 (0:00:00.059) 0:00:19.559 ****** 15500 1727096219.51686: entering _queue_task() for managed_node1/assert 15500 1727096219.52326: worker is 1 (out of 1 available) 15500 1727096219.52340: exiting _queue_task() for managed_node1/assert 15500 1727096219.52353: done queuing things up, now waiting for results queue to drain 15500 1727096219.52355: waiting for pending results... 
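Note: the three "Assert that ..." tasks in this section each evaluate a single boolean fact that the test recorded earlier (lsr_net_profile_exists, lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint all appear "from source: set_fact"). The task file itself is not reproduced in this log; as a minimal sketch, assuming tasks/assert_profile_present.yml follows the plain assert pattern implied by the "Evaluated conditional" lines, the tasks would look roughly like this (task names and conditionals are taken from the log, everything else is an assumption):

    - name: "Assert that the profile is present - '{{ profile }}'"
      ansible.builtin.assert:
        that:
          - lsr_net_profile_exists

    - name: "Assert that the ansible managed comment is present in '{{ profile }}'"
      ansible.builtin.assert:
        that:
          - lsr_net_profile_ansible_managed

    - name: "Assert that the fingerprint comment is present in {{ profile }}"
      ansible.builtin.assert:
        that:
          - lsr_net_profile_fingerprint

When its condition is true, each assert reports "changed": false with the default "All assertions passed" message, which is exactly what the task results in this log show.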
15500 1727096219.52591: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'LSR-TST-br31' 15500 1727096219.52600: in run() - task 0afff68d-5257-877d-2da0-000000000261 15500 1727096219.52605: variable 'ansible_search_path' from source: unknown 15500 1727096219.52611: variable 'ansible_search_path' from source: unknown 15500 1727096219.52666: calling self._execute() 15500 1727096219.52787: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096219.52805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096219.52828: variable 'omit' from source: magic vars 15500 1727096219.53258: variable 'ansible_distribution_major_version' from source: facts 15500 1727096219.53283: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096219.53305: variable 'omit' from source: magic vars 15500 1727096219.53365: variable 'omit' from source: magic vars 15500 1727096219.53500: variable 'profile' from source: play vars 15500 1727096219.53534: variable 'interface' from source: set_fact 15500 1727096219.53633: variable 'interface' from source: set_fact 15500 1727096219.53636: variable 'omit' from source: magic vars 15500 1727096219.53684: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096219.53749: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096219.53752: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096219.53777: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096219.53796: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096219.53838: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096219.53841: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096219.53843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096219.54023: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096219.54029: Set connection var ansible_pipelining to False 15500 1727096219.54031: Set connection var ansible_timeout to 10 15500 1727096219.54036: Set connection var ansible_shell_type to sh 15500 1727096219.54038: Set connection var ansible_shell_executable to /bin/sh 15500 1727096219.54043: Set connection var ansible_connection to ssh 15500 1727096219.54146: variable 'ansible_shell_executable' from source: unknown 15500 1727096219.54150: variable 'ansible_connection' from source: unknown 15500 1727096219.54152: variable 'ansible_module_compression' from source: unknown 15500 1727096219.54158: variable 'ansible_shell_type' from source: unknown 15500 1727096219.54161: variable 'ansible_shell_executable' from source: unknown 15500 1727096219.54163: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096219.54165: variable 'ansible_pipelining' from source: unknown 15500 1727096219.54170: variable 'ansible_timeout' from source: unknown 15500 1727096219.54173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096219.54395: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096219.54399: variable 'omit' from source: magic vars 15500 1727096219.54401: starting attempt loop 15500 1727096219.54403: running the handler 15500 1727096219.54537: variable 'lsr_net_profile_ansible_managed' from source: set_fact 15500 1727096219.54564: Evaluated conditional (lsr_net_profile_ansible_managed): True 15500 1727096219.54573: handler run complete 15500 1727096219.54579: attempt loop complete, returning result 15500 1727096219.54584: _execute() done 15500 1727096219.54586: dumping result to json 15500 1727096219.54591: done dumping result, returning 15500 1727096219.54594: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'LSR-TST-br31' [0afff68d-5257-877d-2da0-000000000261] 15500 1727096219.54596: sending task result for task 0afff68d-5257-877d-2da0-000000000261 15500 1727096219.54673: done sending task result for task 0afff68d-5257-877d-2da0-000000000261 15500 1727096219.54679: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15500 1727096219.54769: no more pending results, returning what we have 15500 1727096219.54773: results queue empty 15500 1727096219.54773: checking for any_errors_fatal 15500 1727096219.54781: done checking for any_errors_fatal 15500 1727096219.54783: checking for max_fail_percentage 15500 1727096219.54786: done checking for max_fail_percentage 15500 1727096219.54787: checking to see if all hosts have failed and the running result is not ok 15500 1727096219.54788: done checking to see if all hosts have failed 15500 1727096219.54789: getting the remaining hosts for this loop 15500 1727096219.54790: done getting the remaining hosts for this loop 15500 1727096219.54793: getting the next task for host managed_node1 15500 1727096219.54799: done getting next task for host managed_node1 15500 1727096219.54802: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 15500 1727096219.54807: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096219.54811: getting variables 15500 1727096219.54814: in VariableManager get_vars() 15500 1727096219.54844: Calling all_inventory to load vars for managed_node1 15500 1727096219.54847: Calling groups_inventory to load vars for managed_node1 15500 1727096219.54850: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096219.54863: Calling all_plugins_play to load vars for managed_node1 15500 1727096219.54865: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096219.54961: Calling groups_plugins_play to load vars for managed_node1 15500 1727096219.56123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096219.57587: done with get_vars() 15500 1727096219.57617: done getting variables 15500 1727096219.57691: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15500 1727096219.57804: variable 'profile' from source: play vars 15500 1727096219.57808: variable 'interface' from source: set_fact 15500 1727096219.57879: variable 'interface' from source: set_fact TASK [Assert that the fingerprint comment is present in LSR-TST-br31] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Monday 23 September 2024 08:56:59 -0400 (0:00:00.062) 0:00:19.622 ****** 15500 1727096219.57927: entering _queue_task() for managed_node1/assert 15500 1727096219.58301: worker is 1 (out of 1 available) 15500 1727096219.58314: exiting _queue_task() for managed_node1/assert 15500 1727096219.58329: done queuing things up, now waiting for results queue to drain 15500 1727096219.58331: waiting for pending results... 
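Note: before every task the executor prints a block of "Set connection var ..." lines; these are the effective per-task connection settings for managed_node1 (ssh connection, sh shell, 10 second timeout, pipelining disabled, ZIP_DEFLATED module compression), and most of them are defaults here because the corresponding variables are reported "from source: unknown". Purely as an illustration of where such values could come from, a host entry in an inventory.yml might override them like the hypothetical sketch below (the override values are not what this run used):

    all:
      hosts:
        managed_node1:
          ansible_host: 10.31.11.125       # the address the SSH debug output later in this log connects to
          ansible_connection: ssh
          ansible_shell_type: sh
          ansible_shell_executable: /bin/sh
          ansible_timeout: 30              # hypothetical override; this run used the default of 10
          ansible_ssh_pipelining: true     # hypothetical override; this run had pipelining set to False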
15500 1727096219.58571: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in LSR-TST-br31 15500 1727096219.58657: in run() - task 0afff68d-5257-877d-2da0-000000000262 15500 1727096219.58701: variable 'ansible_search_path' from source: unknown 15500 1727096219.58705: variable 'ansible_search_path' from source: unknown 15500 1727096219.58730: calling self._execute() 15500 1727096219.58851: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096219.58855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096219.58861: variable 'omit' from source: magic vars 15500 1727096219.59255: variable 'ansible_distribution_major_version' from source: facts 15500 1727096219.59258: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096219.59262: variable 'omit' from source: magic vars 15500 1727096219.59316: variable 'omit' from source: magic vars 15500 1727096219.59392: variable 'profile' from source: play vars 15500 1727096219.59395: variable 'interface' from source: set_fact 15500 1727096219.59476: variable 'interface' from source: set_fact 15500 1727096219.59513: variable 'omit' from source: magic vars 15500 1727096219.59560: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096219.59591: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096219.59610: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096219.59625: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096219.59635: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096219.59672: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096219.59676: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096219.59678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096219.59782: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096219.59789: Set connection var ansible_pipelining to False 15500 1727096219.59792: Set connection var ansible_timeout to 10 15500 1727096219.59795: Set connection var ansible_shell_type to sh 15500 1727096219.59838: Set connection var ansible_shell_executable to /bin/sh 15500 1727096219.59841: Set connection var ansible_connection to ssh 15500 1727096219.59844: variable 'ansible_shell_executable' from source: unknown 15500 1727096219.59846: variable 'ansible_connection' from source: unknown 15500 1727096219.59849: variable 'ansible_module_compression' from source: unknown 15500 1727096219.59851: variable 'ansible_shell_type' from source: unknown 15500 1727096219.59853: variable 'ansible_shell_executable' from source: unknown 15500 1727096219.59855: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096219.59860: variable 'ansible_pipelining' from source: unknown 15500 1727096219.59863: variable 'ansible_timeout' from source: unknown 15500 1727096219.59866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096219.60107: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096219.60111: variable 'omit' from source: magic vars 15500 1727096219.60114: starting attempt loop 15500 1727096219.60116: running the handler 15500 1727096219.60224: variable 'lsr_net_profile_fingerprint' from source: set_fact 15500 1727096219.60228: Evaluated conditional (lsr_net_profile_fingerprint): True 15500 1727096219.60234: handler run complete 15500 1727096219.60239: attempt loop complete, returning result 15500 1727096219.60242: _execute() done 15500 1727096219.60244: dumping result to json 15500 1727096219.60250: done dumping result, returning 15500 1727096219.60252: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in LSR-TST-br31 [0afff68d-5257-877d-2da0-000000000262] 15500 1727096219.60327: sending task result for task 0afff68d-5257-877d-2da0-000000000262 15500 1727096219.60397: done sending task result for task 0afff68d-5257-877d-2da0-000000000262 15500 1727096219.60399: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15500 1727096219.60530: no more pending results, returning what we have 15500 1727096219.60533: results queue empty 15500 1727096219.60534: checking for any_errors_fatal 15500 1727096219.60539: done checking for any_errors_fatal 15500 1727096219.60540: checking for max_fail_percentage 15500 1727096219.60542: done checking for max_fail_percentage 15500 1727096219.60543: checking to see if all hosts have failed and the running result is not ok 15500 1727096219.60545: done checking to see if all hosts have failed 15500 1727096219.60546: getting the remaining hosts for this loop 15500 1727096219.60548: done getting the remaining hosts for this loop 15500 1727096219.60553: getting the next task for host managed_node1 15500 1727096219.60564: done getting next task for host managed_node1 15500 1727096219.60566: ^ task is: TASK: meta (flush_handlers) 15500 1727096219.60569: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096219.60573: getting variables 15500 1727096219.60574: in VariableManager get_vars() 15500 1727096219.60605: Calling all_inventory to load vars for managed_node1 15500 1727096219.60607: Calling groups_inventory to load vars for managed_node1 15500 1727096219.60611: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096219.60622: Calling all_plugins_play to load vars for managed_node1 15500 1727096219.60624: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096219.60627: Calling groups_plugins_play to load vars for managed_node1 15500 1727096219.62123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096219.63363: done with get_vars() 15500 1727096219.63391: done getting variables 15500 1727096219.63456: in VariableManager get_vars() 15500 1727096219.63466: Calling all_inventory to load vars for managed_node1 15500 1727096219.63470: Calling groups_inventory to load vars for managed_node1 15500 1727096219.63471: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096219.63475: Calling all_plugins_play to load vars for managed_node1 15500 1727096219.63476: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096219.63478: Calling groups_plugins_play to load vars for managed_node1 15500 1727096219.64507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096219.65828: done with get_vars() 15500 1727096219.65871: done queuing things up, now waiting for results queue to drain 15500 1727096219.65872: results queue empty 15500 1727096219.65873: checking for any_errors_fatal 15500 1727096219.65875: done checking for any_errors_fatal 15500 1727096219.65875: checking for max_fail_percentage 15500 1727096219.65876: done checking for max_fail_percentage 15500 1727096219.65883: checking to see if all hosts have failed and the running result is not ok 15500 1727096219.65884: done checking to see if all hosts have failed 15500 1727096219.65884: getting the remaining hosts for this loop 15500 1727096219.65885: done getting the remaining hosts for this loop 15500 1727096219.65888: getting the next task for host managed_node1 15500 1727096219.65891: done getting next task for host managed_node1 15500 1727096219.65892: ^ task is: TASK: meta (flush_handlers) 15500 1727096219.65893: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096219.65895: getting variables 15500 1727096219.65896: in VariableManager get_vars() 15500 1727096219.65907: Calling all_inventory to load vars for managed_node1 15500 1727096219.65909: Calling groups_inventory to load vars for managed_node1 15500 1727096219.65912: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096219.65925: Calling all_plugins_play to load vars for managed_node1 15500 1727096219.65928: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096219.65934: Calling groups_plugins_play to load vars for managed_node1 15500 1727096219.67026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096219.68961: done with get_vars() 15500 1727096219.69015: done getting variables 15500 1727096219.69479: in VariableManager get_vars() 15500 1727096219.69493: Calling all_inventory to load vars for managed_node1 15500 1727096219.69496: Calling groups_inventory to load vars for managed_node1 15500 1727096219.69498: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096219.69504: Calling all_plugins_play to load vars for managed_node1 15500 1727096219.69510: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096219.69514: Calling groups_plugins_play to load vars for managed_node1 15500 1727096219.70693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096219.71821: done with get_vars() 15500 1727096219.71854: done queuing things up, now waiting for results queue to drain 15500 1727096219.71859: results queue empty 15500 1727096219.71860: checking for any_errors_fatal 15500 1727096219.71861: done checking for any_errors_fatal 15500 1727096219.71861: checking for max_fail_percentage 15500 1727096219.71862: done checking for max_fail_percentage 15500 1727096219.71863: checking to see if all hosts have failed and the running result is not ok 15500 1727096219.71864: done checking to see if all hosts have failed 15500 1727096219.71864: getting the remaining hosts for this loop 15500 1727096219.71866: done getting the remaining hosts for this loop 15500 1727096219.71882: getting the next task for host managed_node1 15500 1727096219.71885: done getting next task for host managed_node1 15500 1727096219.71886: ^ task is: None 15500 1727096219.71887: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096219.71888: done queuing things up, now waiting for results queue to drain 15500 1727096219.71889: results queue empty 15500 1727096219.71890: checking for any_errors_fatal 15500 1727096219.71896: done checking for any_errors_fatal 15500 1727096219.71897: checking for max_fail_percentage 15500 1727096219.71898: done checking for max_fail_percentage 15500 1727096219.71899: checking to see if all hosts have failed and the running result is not ok 15500 1727096219.71899: done checking to see if all hosts have failed 15500 1727096219.71901: getting the next task for host managed_node1 15500 1727096219.71903: done getting next task for host managed_node1 15500 1727096219.71904: ^ task is: None 15500 1727096219.71908: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096219.71951: in VariableManager get_vars() 15500 1727096219.71981: done with get_vars() 15500 1727096219.71988: in VariableManager get_vars() 15500 1727096219.72001: done with get_vars() 15500 1727096219.72005: variable 'omit' from source: magic vars 15500 1727096219.72144: variable 'profile' from source: play vars 15500 1727096219.72285: in VariableManager get_vars() 15500 1727096219.72299: done with get_vars() 15500 1727096219.72323: variable 'omit' from source: magic vars 15500 1727096219.72408: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 15500 1727096219.73148: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15500 1727096219.73179: getting the remaining hosts for this loop 15500 1727096219.73180: done getting the remaining hosts for this loop 15500 1727096219.73183: getting the next task for host managed_node1 15500 1727096219.73186: done getting next task for host managed_node1 15500 1727096219.73191: ^ task is: TASK: Gathering Facts 15500 1727096219.73192: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096219.73194: getting variables 15500 1727096219.73198: in VariableManager get_vars() 15500 1727096219.73209: Calling all_inventory to load vars for managed_node1 15500 1727096219.73212: Calling groups_inventory to load vars for managed_node1 15500 1727096219.73214: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096219.73220: Calling all_plugins_play to load vars for managed_node1 15500 1727096219.73222: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096219.73225: Calling groups_plugins_play to load vars for managed_node1 15500 1727096219.74534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096219.76137: done with get_vars() 15500 1727096219.76160: done getting variables 15500 1727096219.76214: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Monday 23 September 2024 08:56:59 -0400 (0:00:00.183) 0:00:19.805 ****** 15500 1727096219.76242: entering _queue_task() for managed_node1/gather_facts 15500 1727096219.76581: worker is 1 (out of 1 available) 15500 1727096219.76592: exiting _queue_task() for managed_node1/gather_facts 15500 1727096219.76604: done queuing things up, now waiting for results queue to drain 15500 1727096219.76606: waiting for pending results... 
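Note: the "Set down {{ profile }}" play begins with its own fact-gathering pass. The gather_facts action packages the setup module with AnsiballZ, copies it to a temporary directory on the target over the persistent SSH connection, and executes it there, as the _low_level_execute_command() and sftp lines that follow show. The module arguments reported in the result ("gather_subset": ["all"], "gather_timeout": 10) correspond to an explicit setup task roughly like this sketch (an equivalent illustration, not the literal play content):

    - name: Gathering Facts
      ansible.builtin.setup:
        gather_subset:
          - all
        gather_timeout: 10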
15500 1727096219.77004: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15500 1727096219.77142: in run() - task 0afff68d-5257-877d-2da0-0000000002b5 15500 1727096219.77146: variable 'ansible_search_path' from source: unknown 15500 1727096219.77198: calling self._execute() 15500 1727096219.77358: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096219.77362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096219.77365: variable 'omit' from source: magic vars 15500 1727096219.77876: variable 'ansible_distribution_major_version' from source: facts 15500 1727096219.77892: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096219.77907: variable 'omit' from source: magic vars 15500 1727096219.77938: variable 'omit' from source: magic vars 15500 1727096219.77979: variable 'omit' from source: magic vars 15500 1727096219.78118: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096219.78121: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096219.78124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096219.78143: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096219.78158: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096219.78250: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096219.78335: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096219.78347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096219.78598: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096219.78602: Set connection var ansible_pipelining to False 15500 1727096219.78604: Set connection var ansible_timeout to 10 15500 1727096219.78606: Set connection var ansible_shell_type to sh 15500 1727096219.78609: Set connection var ansible_shell_executable to /bin/sh 15500 1727096219.78611: Set connection var ansible_connection to ssh 15500 1727096219.78662: variable 'ansible_shell_executable' from source: unknown 15500 1727096219.78672: variable 'ansible_connection' from source: unknown 15500 1727096219.78679: variable 'ansible_module_compression' from source: unknown 15500 1727096219.78730: variable 'ansible_shell_type' from source: unknown 15500 1727096219.78769: variable 'ansible_shell_executable' from source: unknown 15500 1727096219.78772: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096219.78774: variable 'ansible_pipelining' from source: unknown 15500 1727096219.78777: variable 'ansible_timeout' from source: unknown 15500 1727096219.78779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096219.79093: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096219.79097: variable 'omit' from source: magic vars 15500 1727096219.79099: starting attempt loop 15500 1727096219.79102: running the 
handler 15500 1727096219.79104: variable 'ansible_facts' from source: unknown 15500 1727096219.79106: _low_level_execute_command(): starting 15500 1727096219.79107: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096219.80586: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096219.80610: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096219.80632: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096219.80651: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096219.80759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096219.82532: stdout chunk (state=3): >>>/root <<< 15500 1727096219.82710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096219.82714: stdout chunk (state=3): >>><<< 15500 1727096219.82716: stderr chunk (state=3): >>><<< 15500 1727096219.82774: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096219.82779: _low_level_execute_command(): starting 15500 1727096219.82782: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096219.8274157-16345-125203750798523 `" && echo ansible-tmp-1727096219.8274157-16345-125203750798523="` echo 
/root/.ansible/tmp/ansible-tmp-1727096219.8274157-16345-125203750798523 `" ) && sleep 0' 15500 1727096219.84008: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096219.84012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096219.84015: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096219.84028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096219.84090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096219.84094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096219.84099: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096219.84183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096219.86171: stdout chunk (state=3): >>>ansible-tmp-1727096219.8274157-16345-125203750798523=/root/.ansible/tmp/ansible-tmp-1727096219.8274157-16345-125203750798523 <<< 15500 1727096219.86396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096219.86774: stderr chunk (state=3): >>><<< 15500 1727096219.86778: stdout chunk (state=3): >>><<< 15500 1727096219.86781: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096219.8274157-16345-125203750798523=/root/.ansible/tmp/ansible-tmp-1727096219.8274157-16345-125203750798523 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096219.86784: variable 'ansible_module_compression' from source: unknown 15500 1727096219.86786: ANSIBALLZ: using cached 
module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15500 1727096219.86788: variable 'ansible_facts' from source: unknown 15500 1727096219.87063: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096219.8274157-16345-125203750798523/AnsiballZ_setup.py 15500 1727096219.87293: Sending initial data 15500 1727096219.87305: Sent initial data (154 bytes) 15500 1727096219.87919: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096219.87936: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096219.87982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096219.87992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096219.88078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096219.88108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096219.88135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096219.88170: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096219.88263: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096219.89962: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096219.90059: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15500 1727096219.90115: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpm7kr_d7t /root/.ansible/tmp/ansible-tmp-1727096219.8274157-16345-125203750798523/AnsiballZ_setup.py <<< 15500 1727096219.90124: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096219.8274157-16345-125203750798523/AnsiballZ_setup.py" <<< 15500 1727096219.90265: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpm7kr_d7t" to remote "/root/.ansible/tmp/ansible-tmp-1727096219.8274157-16345-125203750798523/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096219.8274157-16345-125203750798523/AnsiballZ_setup.py" <<< 15500 1727096219.93378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096219.93383: stdout chunk (state=3): >>><<< 15500 1727096219.93385: stderr chunk (state=3): >>><<< 15500 1727096219.93387: done transferring module to remote 15500 1727096219.93390: _low_level_execute_command(): starting 15500 1727096219.93392: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096219.8274157-16345-125203750798523/ /root/.ansible/tmp/ansible-tmp-1727096219.8274157-16345-125203750798523/AnsiballZ_setup.py && sleep 0' 15500 1727096219.94751: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096219.94908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096219.95079: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096219.95165: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096219.97346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096219.97350: stderr chunk (state=3): >>><<< 15500 1727096219.97353: stdout chunk (state=3): >>><<< 15500 1727096219.97356: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096219.97361: _low_level_execute_command(): starting 15500 1727096219.97364: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096219.8274157-16345-125203750798523/AnsiballZ_setup.py && sleep 0' 15500 1727096219.98439: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096219.98464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096219.98577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096219.98632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096219.98645: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096219.98774: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096219.98865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096220.65986: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": 
"enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_loadavg": {"1m": 0.4541015625, "5m": 0.32470703125, "15m": 0.1513671875}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", 
"DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "57", "second": "00", "epoch": "1727096220", "epoch_int": "1727096220", "date": "2024-09-23", "time": "08:57:00", "iso8601_micro": "2024-09-23T12:57:00.271942Z", "iso8601": "2024-09-23T12:57:00Z", "iso8601_basic": "20240923T085700271942", "iso8601_basic_short": "20240923T085700", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2943, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 588, "free": 2943}, "nocache": {"free": 3280, "used": 251}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 373, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797654528, "block_size": 4096, "block_total": 65519099, "block_available": 63915443, "block_used": 1603656, "inode_total": 131070960, "inode_available": 131029099, "inode_used": 41861, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["lo", "eth0", "LSR-TST-br31"], "ansible_lo": {"device": "lo", 
"mtu": 65536, "active": true, "type": "<<< 15500 1727096220.66001: stdout chunk (state=3): >>>loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "1a:3f:d0:99:f4:7d", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off 
[fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15500 1727096220.68123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 15500 1727096220.68174: stderr chunk (state=3): >>><<< 15500 1727096220.68177: stdout chunk (state=3): >>><<< 15500 1727096220.68258: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", 
"ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_loadavg": {"1m": 0.4541015625, "5m": 0.32470703125, "15m": 0.1513671875}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "57", "second": "00", "epoch": "1727096220", "epoch_int": "1727096220", "date": "2024-09-23", "time": "08:57:00", "iso8601_micro": "2024-09-23T12:57:00.271942Z", "iso8601": "2024-09-23T12:57:00Z", "iso8601_basic": "20240923T085700271942", "iso8601_basic_short": "20240923T085700", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU 
E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2943, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 588, "free": 2943}, "nocache": {"free": 3280, "used": 251}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 373, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797654528, "block_size": 4096, "block_total": 65519099, "block_available": 63915443, "block_used": 1603656, "inode_total": 131070960, "inode_available": 131029099, "inode_used": 41861, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["lo", "eth0", "LSR-TST-br31"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", 
"large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off 
[fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "1a:3f:d0:99:f4:7d", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, 
"ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 15500 1727096220.68705: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096219.8274157-16345-125203750798523/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096220.68728: _low_level_execute_command(): starting 15500 1727096220.68793: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096219.8274157-16345-125203750798523/ > /dev/null 2>&1 && sleep 0' 15500 1727096220.69317: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096220.69320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 15500 1727096220.69330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15500 1727096220.69335: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 15500 1727096220.69338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096220.69396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096220.69400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096220.69464: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096220.71389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096220.71575: stderr chunk (state=3): >>><<< 15500 1727096220.71581: stdout chunk (state=3): >>><<< 15500 1727096220.71584: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096220.71587: handler run complete 15500 1727096220.71681: variable 'ansible_facts' from source: unknown 15500 1727096220.71762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096220.71956: variable 'ansible_facts' from source: unknown 15500 1727096220.72176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096220.72240: attempt loop complete, returning result 15500 1727096220.72255: _execute() done 15500 1727096220.72269: dumping result to json 15500 1727096220.72316: done dumping result, returning 15500 1727096220.72375: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-877d-2da0-0000000002b5] 15500 1727096220.72379: sending task result for task 0afff68d-5257-877d-2da0-0000000002b5 ok: [managed_node1] 15500 1727096220.73291: no more pending results, returning what we have 15500 1727096220.73293: results queue empty 15500 1727096220.73294: checking for any_errors_fatal 15500 1727096220.73295: done checking for any_errors_fatal 15500 1727096220.73295: checking for max_fail_percentage 15500 1727096220.73297: done checking for max_fail_percentage 15500 1727096220.73298: checking to see if all hosts have failed and the running result is not ok 15500 1727096220.73298: done checking to 
see if all hosts have failed 15500 1727096220.73299: getting the remaining hosts for this loop 15500 1727096220.73300: done getting the remaining hosts for this loop 15500 1727096220.73303: getting the next task for host managed_node1 15500 1727096220.73307: done getting next task for host managed_node1 15500 1727096220.73308: ^ task is: TASK: meta (flush_handlers) 15500 1727096220.73309: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096220.73312: getting variables 15500 1727096220.73313: in VariableManager get_vars() 15500 1727096220.73333: Calling all_inventory to load vars for managed_node1 15500 1727096220.73335: Calling groups_inventory to load vars for managed_node1 15500 1727096220.73337: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096220.73346: Calling all_plugins_play to load vars for managed_node1 15500 1727096220.73347: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096220.73350: Calling groups_plugins_play to load vars for managed_node1 15500 1727096220.73875: done sending task result for task 0afff68d-5257-877d-2da0-0000000002b5 15500 1727096220.73879: WORKER PROCESS EXITING 15500 1727096220.74483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096220.75971: done with get_vars() 15500 1727096220.75990: done getting variables 15500 1727096220.76042: in VariableManager get_vars() 15500 1727096220.76052: Calling all_inventory to load vars for managed_node1 15500 1727096220.76054: Calling groups_inventory to load vars for managed_node1 15500 1727096220.76055: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096220.76061: Calling all_plugins_play to load vars for managed_node1 15500 1727096220.76064: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096220.76066: Calling groups_plugins_play to load vars for managed_node1 15500 1727096220.80901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096220.82410: done with get_vars() 15500 1727096220.82440: done queuing things up, now waiting for results queue to drain 15500 1727096220.82443: results queue empty 15500 1727096220.82444: checking for any_errors_fatal 15500 1727096220.82447: done checking for any_errors_fatal 15500 1727096220.82448: checking for max_fail_percentage 15500 1727096220.82454: done checking for max_fail_percentage 15500 1727096220.82455: checking to see if all hosts have failed and the running result is not ok 15500 1727096220.82456: done checking to see if all hosts have failed 15500 1727096220.82459: getting the remaining hosts for this loop 15500 1727096220.82461: done getting the remaining hosts for this loop 15500 1727096220.82464: getting the next task for host managed_node1 15500 1727096220.82469: done getting next task for host managed_node1 15500 1727096220.82472: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15500 1727096220.82474: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 15500 1727096220.82484: getting variables 15500 1727096220.82485: in VariableManager get_vars() 15500 1727096220.82501: Calling all_inventory to load vars for managed_node1 15500 1727096220.82503: Calling groups_inventory to load vars for managed_node1 15500 1727096220.82505: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096220.82510: Calling all_plugins_play to load vars for managed_node1 15500 1727096220.82513: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096220.82515: Calling groups_plugins_play to load vars for managed_node1 15500 1727096220.83689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096220.85222: done with get_vars() 15500 1727096220.85243: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 08:57:00 -0400 (0:00:01.090) 0:00:20.896 ****** 15500 1727096220.85323: entering _queue_task() for managed_node1/include_tasks 15500 1727096220.85718: worker is 1 (out of 1 available) 15500 1727096220.85730: exiting _queue_task() for managed_node1/include_tasks 15500 1727096220.85741: done queuing things up, now waiting for results queue to drain 15500 1727096220.85742: waiting for pending results... 15500 1727096220.86188: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15500 1727096220.86193: in run() - task 0afff68d-5257-877d-2da0-00000000003a 15500 1727096220.86197: variable 'ansible_search_path' from source: unknown 15500 1727096220.86201: variable 'ansible_search_path' from source: unknown 15500 1727096220.86235: calling self._execute() 15500 1727096220.86339: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096220.86351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096220.86373: variable 'omit' from source: magic vars 15500 1727096220.86779: variable 'ansible_distribution_major_version' from source: facts 15500 1727096220.86795: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096220.86807: _execute() done 15500 1727096220.86815: dumping result to json 15500 1727096220.86823: done dumping result, returning 15500 1727096220.86835: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-877d-2da0-00000000003a] 15500 1727096220.86845: sending task result for task 0afff68d-5257-877d-2da0-00000000003a 15500 1727096220.87004: no more pending results, returning what we have 15500 1727096220.87009: in VariableManager get_vars() 15500 1727096220.87055: Calling all_inventory to load vars for managed_node1 15500 1727096220.87061: Calling groups_inventory to load vars for managed_node1 15500 1727096220.87064: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096220.87078: Calling all_plugins_play to load vars for managed_node1 15500 1727096220.87082: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096220.87085: Calling groups_plugins_play to load vars for managed_node1 15500 1727096220.87780: done sending task result for task 0afff68d-5257-877d-2da0-00000000003a 15500 1727096220.87783: WORKER PROCESS EXITING 15500 1727096220.88724: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096220.90337: done with get_vars() 15500 1727096220.90360: variable 'ansible_search_path' from source: unknown 15500 1727096220.90361: variable 'ansible_search_path' from source: unknown 15500 1727096220.90392: we have included files to process 15500 1727096220.90393: generating all_blocks data 15500 1727096220.90395: done generating all_blocks data 15500 1727096220.90396: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15500 1727096220.90397: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15500 1727096220.90399: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15500 1727096220.90993: done processing included file 15500 1727096220.90995: iterating over new_blocks loaded from include file 15500 1727096220.90997: in VariableManager get_vars() 15500 1727096220.91017: done with get_vars() 15500 1727096220.91019: filtering new block on tags 15500 1727096220.91035: done filtering new block on tags 15500 1727096220.91038: in VariableManager get_vars() 15500 1727096220.91056: done with get_vars() 15500 1727096220.91060: filtering new block on tags 15500 1727096220.91080: done filtering new block on tags 15500 1727096220.91082: in VariableManager get_vars() 15500 1727096220.91101: done with get_vars() 15500 1727096220.91103: filtering new block on tags 15500 1727096220.91118: done filtering new block on tags 15500 1727096220.91121: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 15500 1727096220.91126: extending task lists for all hosts with included blocks 15500 1727096220.91463: done extending task lists 15500 1727096220.91465: done processing included files 15500 1727096220.91466: results queue empty 15500 1727096220.91467: checking for any_errors_fatal 15500 1727096220.91470: done checking for any_errors_fatal 15500 1727096220.91471: checking for max_fail_percentage 15500 1727096220.91472: done checking for max_fail_percentage 15500 1727096220.91473: checking to see if all hosts have failed and the running result is not ok 15500 1727096220.91474: done checking to see if all hosts have failed 15500 1727096220.91475: getting the remaining hosts for this loop 15500 1727096220.91476: done getting the remaining hosts for this loop 15500 1727096220.91478: getting the next task for host managed_node1 15500 1727096220.91482: done getting next task for host managed_node1 15500 1727096220.91485: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15500 1727096220.91487: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096220.91496: getting variables 15500 1727096220.91497: in VariableManager get_vars() 15500 1727096220.91512: Calling all_inventory to load vars for managed_node1 15500 1727096220.91514: Calling groups_inventory to load vars for managed_node1 15500 1727096220.91516: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096220.91521: Calling all_plugins_play to load vars for managed_node1 15500 1727096220.91524: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096220.91527: Calling groups_plugins_play to load vars for managed_node1 15500 1727096220.92721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096220.94341: done with get_vars() 15500 1727096220.94364: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 08:57:00 -0400 (0:00:00.091) 0:00:20.987 ****** 15500 1727096220.94448: entering _queue_task() for managed_node1/setup 15500 1727096220.94812: worker is 1 (out of 1 available) 15500 1727096220.94822: exiting _queue_task() for managed_node1/setup 15500 1727096220.94833: done queuing things up, now waiting for results queue to drain 15500 1727096220.94835: waiting for pending results... 15500 1727096220.95129: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15500 1727096220.95281: in run() - task 0afff68d-5257-877d-2da0-0000000002f6 15500 1727096220.95300: variable 'ansible_search_path' from source: unknown 15500 1727096220.95307: variable 'ansible_search_path' from source: unknown 15500 1727096220.95343: calling self._execute() 15500 1727096220.95444: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096220.95456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096220.95472: variable 'omit' from source: magic vars 15500 1727096220.95856: variable 'ansible_distribution_major_version' from source: facts 15500 1727096220.95874: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096220.96096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096220.98323: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096220.98346: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096220.98391: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096220.98439: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096220.98472: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096220.98560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096220.98595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 15500 1727096220.98624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096220.98679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096220.98699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096220.98866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096220.98871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096220.98874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096220.98877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096220.98879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096220.99041: variable '__network_required_facts' from source: role '' defaults 15500 1727096220.99056: variable 'ansible_facts' from source: unknown 15500 1727096220.99803: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15500 1727096220.99811: when evaluation is False, skipping this task 15500 1727096220.99817: _execute() done 15500 1727096220.99822: dumping result to json 15500 1727096220.99828: done dumping result, returning 15500 1727096220.99846: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-877d-2da0-0000000002f6] 15500 1727096220.99854: sending task result for task 0afff68d-5257-877d-2da0-0000000002f6 15500 1727096221.00176: done sending task result for task 0afff68d-5257-877d-2da0-0000000002f6 15500 1727096221.00180: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15500 1727096221.00220: no more pending results, returning what we have 15500 1727096221.00224: results queue empty 15500 1727096221.00225: checking for any_errors_fatal 15500 1727096221.00226: done checking for any_errors_fatal 15500 1727096221.00227: checking for max_fail_percentage 15500 1727096221.00229: done checking for max_fail_percentage 15500 1727096221.00229: checking to see if all hosts have failed and the running result is not ok 15500 1727096221.00230: done checking to see if all hosts have failed 15500 1727096221.00231: getting the remaining hosts for this loop 15500 1727096221.00233: done getting the remaining hosts for 
this loop 15500 1727096221.00237: getting the next task for host managed_node1 15500 1727096221.00246: done getting next task for host managed_node1 15500 1727096221.00250: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15500 1727096221.00253: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096221.00269: getting variables 15500 1727096221.00270: in VariableManager get_vars() 15500 1727096221.00315: Calling all_inventory to load vars for managed_node1 15500 1727096221.00318: Calling groups_inventory to load vars for managed_node1 15500 1727096221.00320: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096221.00329: Calling all_plugins_play to load vars for managed_node1 15500 1727096221.00332: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096221.00335: Calling groups_plugins_play to load vars for managed_node1 15500 1727096221.01819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096221.03515: done with get_vars() 15500 1727096221.03541: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 08:57:01 -0400 (0:00:00.092) 0:00:21.079 ****** 15500 1727096221.03663: entering _queue_task() for managed_node1/stat 15500 1727096221.04040: worker is 1 (out of 1 available) 15500 1727096221.04052: exiting _queue_task() for managed_node1/stat 15500 1727096221.04063: done queuing things up, now waiting for results queue to drain 15500 1727096221.04064: waiting for pending results... 
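The censored skip just above is the role's required-facts guard: the setup run earlier in this log already populated ansible_facts, so the conditional __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 evaluates to False and the extra setup call is skipped (its result is hidden because the task sets no_log). A minimal sketch of that guard pattern, assuming illustrative values for __network_required_facts and gather_subset rather than the role's verbatim task:

  # Re-gather facts only when a fact key the role needs is missing.
  # The variable contents and gather_subset value are illustrative assumptions;
  # only the task name and the conditional come from the log above.
  - name: Ensure ansible_facts used by role are present
    ansible.builtin.setup:
      gather_subset: min
    vars:
      __network_required_facts:
        - distribution
        - distribution_major_version
        - os_family
    when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

Because keys in ansible_facts are unprefixed (distribution, os_family, and so on), the difference filter yields an empty list when everything the role needs is already cached, which is exactly the False evaluation logged above.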
15500 1727096221.04344: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 15500 1727096221.04478: in run() - task 0afff68d-5257-877d-2da0-0000000002f8 15500 1727096221.04498: variable 'ansible_search_path' from source: unknown 15500 1727096221.04505: variable 'ansible_search_path' from source: unknown 15500 1727096221.04546: calling self._execute() 15500 1727096221.04649: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096221.04667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096221.04686: variable 'omit' from source: magic vars 15500 1727096221.05054: variable 'ansible_distribution_major_version' from source: facts 15500 1727096221.05077: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096221.05242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096221.05523: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096221.05585: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096221.05673: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096221.05676: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096221.05821: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096221.05856: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096221.05897: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096221.05933: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096221.06042: variable '__network_is_ostree' from source: set_fact 15500 1727096221.06273: Evaluated conditional (not __network_is_ostree is defined): False 15500 1727096221.06278: when evaluation is False, skipping this task 15500 1727096221.06281: _execute() done 15500 1727096221.06284: dumping result to json 15500 1727096221.06286: done dumping result, returning 15500 1727096221.06293: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-877d-2da0-0000000002f8] 15500 1727096221.06295: sending task result for task 0afff68d-5257-877d-2da0-0000000002f8 15500 1727096221.06369: done sending task result for task 0afff68d-5257-877d-2da0-0000000002f8 15500 1727096221.06372: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15500 1727096221.06443: no more pending results, returning what we have 15500 1727096221.06446: results queue empty 15500 1727096221.06446: checking for any_errors_fatal 15500 1727096221.06452: done checking for any_errors_fatal 15500 1727096221.06452: checking for 
max_fail_percentage 15500 1727096221.06454: done checking for max_fail_percentage 15500 1727096221.06455: checking to see if all hosts have failed and the running result is not ok 15500 1727096221.06456: done checking to see if all hosts have failed 15500 1727096221.06456: getting the remaining hosts for this loop 15500 1727096221.06457: done getting the remaining hosts for this loop 15500 1727096221.06461: getting the next task for host managed_node1 15500 1727096221.06466: done getting next task for host managed_node1 15500 1727096221.06471: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15500 1727096221.06474: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096221.06487: getting variables 15500 1727096221.06488: in VariableManager get_vars() 15500 1727096221.06522: Calling all_inventory to load vars for managed_node1 15500 1727096221.06524: Calling groups_inventory to load vars for managed_node1 15500 1727096221.06526: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096221.06534: Calling all_plugins_play to load vars for managed_node1 15500 1727096221.06537: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096221.06539: Calling groups_plugins_play to load vars for managed_node1 15500 1727096221.08097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096221.09686: done with get_vars() 15500 1727096221.09722: done getting variables 15500 1727096221.09779: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 08:57:01 -0400 (0:00:00.061) 0:00:21.141 ****** 15500 1727096221.09811: entering _queue_task() for managed_node1/set_fact 15500 1727096221.10393: worker is 1 (out of 1 available) 15500 1727096221.10404: exiting _queue_task() for managed_node1/set_fact 15500 1727096221.10415: done queuing things up, now waiting for results queue to drain 15500 1727096221.10416: waiting for pending results... 
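Both ostree tasks in this play are guarded by not __network_is_ostree is defined, and the flag was already set earlier in the run, so the stat above was skipped and the set_fact just queued will be skipped as well. When the flag is not yet defined, this pair is conventionally a stat of an ostree marker file followed by a set_fact; a sketch, assuming the common /run/ostree-booted path and an illustrative register name (neither appears in this log because the tasks were skipped):

  # Hypothetical stat + set_fact pair matching the task names logged above.
  # /run/ostree-booted and the register name are assumptions, not taken from this run.
  - name: Check if system is ostree
    ansible.builtin.stat:
      path: /run/ostree-booted
    register: __ostree_booted_stat
    when: not __network_is_ostree is defined

  - name: Set flag to indicate system is ostree
    ansible.builtin.set_fact:
      __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
    when: not __network_is_ostree is defined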
15500 1727096221.10502: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15500 1727096221.10656: in run() - task 0afff68d-5257-877d-2da0-0000000002f9 15500 1727096221.10681: variable 'ansible_search_path' from source: unknown 15500 1727096221.10689: variable 'ansible_search_path' from source: unknown 15500 1727096221.10730: calling self._execute() 15500 1727096221.10833: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096221.10847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096221.10873: variable 'omit' from source: magic vars 15500 1727096221.11261: variable 'ansible_distribution_major_version' from source: facts 15500 1727096221.11280: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096221.11463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096221.11752: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096221.11805: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096221.11853: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096221.11894: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096221.12029: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096221.12163: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096221.12169: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096221.12172: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096221.12227: variable '__network_is_ostree' from source: set_fact 15500 1727096221.12242: Evaluated conditional (not __network_is_ostree is defined): False 15500 1727096221.12250: when evaluation is False, skipping this task 15500 1727096221.12258: _execute() done 15500 1727096221.12272: dumping result to json 15500 1727096221.12282: done dumping result, returning 15500 1727096221.12295: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-877d-2da0-0000000002f9] 15500 1727096221.12304: sending task result for task 0afff68d-5257-877d-2da0-0000000002f9 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15500 1727096221.12534: no more pending results, returning what we have 15500 1727096221.12538: results queue empty 15500 1727096221.12539: checking for any_errors_fatal 15500 1727096221.12547: done checking for any_errors_fatal 15500 1727096221.12548: checking for max_fail_percentage 15500 1727096221.12550: done checking for max_fail_percentage 15500 1727096221.12551: checking to see 
if all hosts have failed and the running result is not ok 15500 1727096221.12552: done checking to see if all hosts have failed 15500 1727096221.12553: getting the remaining hosts for this loop 15500 1727096221.12555: done getting the remaining hosts for this loop 15500 1727096221.12559: getting the next task for host managed_node1 15500 1727096221.12570: done getting next task for host managed_node1 15500 1727096221.12575: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15500 1727096221.12578: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096221.12595: getting variables 15500 1727096221.12598: in VariableManager get_vars() 15500 1727096221.12642: Calling all_inventory to load vars for managed_node1 15500 1727096221.12645: Calling groups_inventory to load vars for managed_node1 15500 1727096221.12648: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096221.12659: Calling all_plugins_play to load vars for managed_node1 15500 1727096221.12662: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096221.12665: Calling groups_plugins_play to load vars for managed_node1 15500 1727096221.12786: done sending task result for task 0afff68d-5257-877d-2da0-0000000002f9 15500 1727096221.12790: WORKER PROCESS EXITING 15500 1727096221.14339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096221.16011: done with get_vars() 15500 1727096221.16048: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 08:57:01 -0400 (0:00:00.063) 0:00:21.204 ****** 15500 1727096221.16155: entering _queue_task() for managed_node1/service_facts 15500 1727096221.16528: worker is 1 (out of 1 available) 15500 1727096221.16541: exiting _queue_task() for managed_node1/service_facts 15500 1727096221.16554: done queuing things up, now waiting for results queue to drain 15500 1727096221.16555: waiting for pending results... 
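The task about to execute is the service_facts call at roles/network/tasks/set_facts.yml:21, which populates ansible_facts.services for the role's later service checks. The role file itself is not reproduced in this log, so the following is only a minimal sketch of what that task and a typical consumer of its result could look like (the assert task and the NetworkManager check are illustrative assumptions, not taken from the role):

    # Sketch only: assumed shape of the task at set_facts.yml:21; the real role file is not shown in this log.
    - name: Check which services are running
      ansible.builtin.service_facts:
      # Takes no arguments (the invocation below logs "module_args": {}) and returns
      # ansible_facts.services, a dict keyed by unit name with name/state/status/source fields.

    # Hypothetical follow-up task showing how the gathered fact could be consumed:
    - name: Illustration only - require NetworkManager to be running
      ansible.builtin.assert:
        that:
          - ansible_facts.services['NetworkManager.service'].state == 'running'

The JSON payload returned later in this task's stdout is exactly that ansible_facts.services dictionary, enumerating every systemd unit known on managed_node1.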
15500 1727096221.16848: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 15500 1727096221.17005: in run() - task 0afff68d-5257-877d-2da0-0000000002fb 15500 1727096221.17026: variable 'ansible_search_path' from source: unknown 15500 1727096221.17035: variable 'ansible_search_path' from source: unknown 15500 1727096221.17078: calling self._execute() 15500 1727096221.17181: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096221.17210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096221.17214: variable 'omit' from source: magic vars 15500 1727096221.17647: variable 'ansible_distribution_major_version' from source: facts 15500 1727096221.17650: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096221.17653: variable 'omit' from source: magic vars 15500 1727096221.17690: variable 'omit' from source: magic vars 15500 1727096221.17728: variable 'omit' from source: magic vars 15500 1727096221.17779: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096221.17818: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096221.17846: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096221.17878: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096221.17974: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096221.17978: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096221.17980: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096221.17983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096221.18051: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096221.18062: Set connection var ansible_pipelining to False 15500 1727096221.18078: Set connection var ansible_timeout to 10 15500 1727096221.18172: Set connection var ansible_shell_type to sh 15500 1727096221.18175: Set connection var ansible_shell_executable to /bin/sh 15500 1727096221.18179: Set connection var ansible_connection to ssh 15500 1727096221.18181: variable 'ansible_shell_executable' from source: unknown 15500 1727096221.18183: variable 'ansible_connection' from source: unknown 15500 1727096221.18186: variable 'ansible_module_compression' from source: unknown 15500 1727096221.18192: variable 'ansible_shell_type' from source: unknown 15500 1727096221.18194: variable 'ansible_shell_executable' from source: unknown 15500 1727096221.18196: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096221.18198: variable 'ansible_pipelining' from source: unknown 15500 1727096221.18199: variable 'ansible_timeout' from source: unknown 15500 1727096221.18201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096221.18384: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15500 1727096221.18411: variable 'omit' from source: magic vars 15500 
1727096221.18414: starting attempt loop 15500 1727096221.18520: running the handler 15500 1727096221.18523: _low_level_execute_command(): starting 15500 1727096221.18527: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096221.19299: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096221.19324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096221.19345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096221.19394: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096221.19515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096221.21241: stdout chunk (state=3): >>>/root <<< 15500 1727096221.21373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096221.21377: stderr chunk (state=3): >>><<< 15500 1727096221.21379: stdout chunk (state=3): >>><<< 15500 1727096221.21400: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096221.21413: _low_level_execute_command(): starting 15500 1727096221.21420: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096221.2140112-16386-61632374121680 `" && echo ansible-tmp-1727096221.2140112-16386-61632374121680="` echo 
/root/.ansible/tmp/ansible-tmp-1727096221.2140112-16386-61632374121680 `" ) && sleep 0' 15500 1727096221.22064: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096221.22101: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096221.22117: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096221.22216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096221.24186: stdout chunk (state=3): >>>ansible-tmp-1727096221.2140112-16386-61632374121680=/root/.ansible/tmp/ansible-tmp-1727096221.2140112-16386-61632374121680 <<< 15500 1727096221.24289: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096221.24319: stderr chunk (state=3): >>><<< 15500 1727096221.24323: stdout chunk (state=3): >>><<< 15500 1727096221.24342: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096221.2140112-16386-61632374121680=/root/.ansible/tmp/ansible-tmp-1727096221.2140112-16386-61632374121680 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096221.24387: variable 'ansible_module_compression' from source: unknown 15500 1727096221.24424: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 15500 1727096221.24463: variable 'ansible_facts' from source: unknown 15500 1727096221.24538: transferring 
module to remote /root/.ansible/tmp/ansible-tmp-1727096221.2140112-16386-61632374121680/AnsiballZ_service_facts.py 15500 1727096221.24898: Sending initial data 15500 1727096221.24902: Sent initial data (161 bytes) 15500 1727096221.25281: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096221.25292: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096221.25303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096221.25392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096221.25419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096221.25431: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096221.25451: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096221.25541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096221.27224: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096221.27288: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15500 1727096221.27394: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpgbe6q4xn /root/.ansible/tmp/ansible-tmp-1727096221.2140112-16386-61632374121680/AnsiballZ_service_facts.py <<< 15500 1727096221.27398: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096221.2140112-16386-61632374121680/AnsiballZ_service_facts.py" <<< 15500 1727096221.27464: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpgbe6q4xn" to remote "/root/.ansible/tmp/ansible-tmp-1727096221.2140112-16386-61632374121680/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096221.2140112-16386-61632374121680/AnsiballZ_service_facts.py" <<< 15500 1727096221.28395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096221.28446: stderr chunk (state=3): >>><<< 15500 1727096221.28502: stdout chunk (state=3): >>><<< 15500 1727096221.28505: done transferring module to remote 15500 1727096221.28523: _low_level_execute_command(): starting 15500 1727096221.28532: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096221.2140112-16386-61632374121680/ /root/.ansible/tmp/ansible-tmp-1727096221.2140112-16386-61632374121680/AnsiballZ_service_facts.py && sleep 0' 15500 1727096221.29230: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096221.29280: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096221.29283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 15500 1727096221.29388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096221.29415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096221.29524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096221.31473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096221.31483: stdout chunk (state=3): >>><<< 15500 1727096221.31486: stderr chunk (state=3): >>><<< 15500 1727096221.31593: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096221.31599: _low_level_execute_command(): starting 15500 1727096221.31603: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096221.2140112-16386-61632374121680/AnsiballZ_service_facts.py && sleep 0' 15500 1727096221.32287: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096221.32334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096221.32359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096221.32387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096221.32520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096222.96513: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": 
"cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": 
{"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 15500 1727096222.96544: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": 
"dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15500 1727096222.98218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 15500 1727096222.98222: stdout chunk (state=3): >>><<< 15500 1727096222.98225: stderr chunk (state=3): >>><<< 15500 1727096222.98258: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": 
"dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": 
"syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": 
"systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": 
"unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": 
"sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": 
"systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 15500 1727096222.99430: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096221.2140112-16386-61632374121680/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096222.99435: _low_level_execute_command(): starting 15500 1727096222.99438: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096221.2140112-16386-61632374121680/ > /dev/null 2>&1 && sleep 0' 15500 1727096223.00177: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096223.00183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 15500 1727096223.00186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 15500 1727096223.00189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096223.00258: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096223.00272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096223.00418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096223.02575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096223.02579: stderr chunk (state=3): >>><<< 15500 1727096223.02582: stdout chunk (state=3): >>><<< 15500 1727096223.02584: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096223.02587: handler run complete 15500 1727096223.02663: variable 'ansible_facts' from source: unknown 15500 1727096223.02830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096223.03331: variable 'ansible_facts' from source: unknown 15500 1727096223.03473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096223.03695: attempt loop complete, returning result 15500 1727096223.03706: _execute() done 15500 1727096223.03713: dumping result to json 15500 1727096223.03776: done dumping result, returning 15500 1727096223.03795: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-877d-2da0-0000000002fb] 15500 1727096223.03806: sending task result for task 0afff68d-5257-877d-2da0-0000000002fb ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15500 1727096223.04736: no more pending results, returning what we have 15500 1727096223.04739: results queue empty 15500 1727096223.04739: checking for any_errors_fatal 15500 1727096223.04743: done checking for any_errors_fatal 15500 1727096223.04743: checking for max_fail_percentage 15500 1727096223.04745: done checking for max_fail_percentage 15500 1727096223.04745: checking to see if all hosts have failed and the running result is not ok 15500 1727096223.04746: done checking to see if all hosts have failed 15500 1727096223.04747: getting the remaining hosts for this loop 15500 1727096223.04748: done getting the remaining hosts for this loop 15500 1727096223.04751: getting the next task for host managed_node1 15500 1727096223.04756: done getting next task for host managed_node1 15500 1727096223.04762: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15500 1727096223.04764: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096223.04775: done sending task result for task 0afff68d-5257-877d-2da0-0000000002fb 15500 1727096223.04779: WORKER PROCESS EXITING 15500 1727096223.04784: getting variables 15500 1727096223.04785: in VariableManager get_vars() 15500 1727096223.04808: Calling all_inventory to load vars for managed_node1 15500 1727096223.04810: Calling groups_inventory to load vars for managed_node1 15500 1727096223.04812: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096223.04818: Calling all_plugins_play to load vars for managed_node1 15500 1727096223.04820: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096223.04822: Calling groups_plugins_play to load vars for managed_node1 15500 1727096223.05792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096223.07650: done with get_vars() 15500 1727096223.07687: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 08:57:03 -0400 (0:00:01.916) 0:00:23.121 ****** 15500 1727096223.07797: entering _queue_task() for managed_node1/package_facts 15500 1727096223.08235: worker is 1 (out of 1 available) 15500 1727096223.08248: exiting _queue_task() for managed_node1/package_facts 15500 1727096223.08262: done queuing things up, now waiting for results queue to drain 15500 1727096223.08263: waiting for pending results... 15500 1727096223.08685: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 15500 1727096223.08695: in run() - task 0afff68d-5257-877d-2da0-0000000002fc 15500 1727096223.08714: variable 'ansible_search_path' from source: unknown 15500 1727096223.08720: variable 'ansible_search_path' from source: unknown 15500 1727096223.08761: calling self._execute() 15500 1727096223.08866: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096223.08882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096223.08898: variable 'omit' from source: magic vars 15500 1727096223.09531: variable 'ansible_distribution_major_version' from source: facts 15500 1727096223.09547: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096223.09581: variable 'omit' from source: magic vars 15500 1727096223.09776: variable 'omit' from source: magic vars 15500 1727096223.09792: variable 'omit' from source: magic vars 15500 1727096223.09837: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096223.09992: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096223.09995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096223.10000: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096223.10002: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096223.10128: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096223.10179: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096223.10188: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096223.10475: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096223.10478: Set connection var ansible_pipelining to False 15500 1727096223.10481: Set connection var ansible_timeout to 10 15500 1727096223.10483: Set connection var ansible_shell_type to sh 15500 1727096223.10486: Set connection var ansible_shell_executable to /bin/sh 15500 1727096223.10488: Set connection var ansible_connection to ssh 15500 1727096223.10490: variable 'ansible_shell_executable' from source: unknown 15500 1727096223.10492: variable 'ansible_connection' from source: unknown 15500 1727096223.10495: variable 'ansible_module_compression' from source: unknown 15500 1727096223.10497: variable 'ansible_shell_type' from source: unknown 15500 1727096223.10499: variable 'ansible_shell_executable' from source: unknown 15500 1727096223.10501: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096223.10503: variable 'ansible_pipelining' from source: unknown 15500 1727096223.10505: variable 'ansible_timeout' from source: unknown 15500 1727096223.10507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096223.10766: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15500 1727096223.10786: variable 'omit' from source: magic vars 15500 1727096223.10800: starting attempt loop 15500 1727096223.10808: running the handler 15500 1727096223.10826: _low_level_execute_command(): starting 15500 1727096223.10838: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096223.11680: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096223.11706: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096223.11811: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096223.13725: stdout chunk (state=3): >>>/root <<< 15500 1727096223.14052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096223.14072: stdout chunk (state=3): >>><<< 15500 1727096223.14404: stderr chunk (state=3): >>><<< 15500 1727096223.14408: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096223.14411: _low_level_execute_command(): starting 15500 1727096223.14414: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096223.1431205-16444-154418592111246 `" && echo ansible-tmp-1727096223.1431205-16444-154418592111246="` echo /root/.ansible/tmp/ansible-tmp-1727096223.1431205-16444-154418592111246 `" ) && sleep 0' 15500 1727096223.15166: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096223.15185: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096223.15198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096223.15217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096223.15234: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096223.15327: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096223.15358: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096223.15384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096223.15491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096223.17502: stdout chunk (state=3): >>>ansible-tmp-1727096223.1431205-16444-154418592111246=/root/.ansible/tmp/ansible-tmp-1727096223.1431205-16444-154418592111246 <<< 15500 1727096223.17672: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096223.17685: stdout chunk (state=3): >>><<< 15500 1727096223.17697: stderr chunk 
(state=3): >>><<< 15500 1727096223.17720: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096223.1431205-16444-154418592111246=/root/.ansible/tmp/ansible-tmp-1727096223.1431205-16444-154418592111246 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096223.18175: variable 'ansible_module_compression' from source: unknown 15500 1727096223.18179: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 15500 1727096223.18182: variable 'ansible_facts' from source: unknown 15500 1727096223.18348: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096223.1431205-16444-154418592111246/AnsiballZ_package_facts.py 15500 1727096223.19029: Sending initial data 15500 1727096223.19034: Sent initial data (162 bytes) 15500 1727096223.20190: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096223.20364: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096223.20381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096223.20584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096223.22217: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports 
extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096223.22311: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15500 1727096223.22398: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp7c8d1syx /root/.ansible/tmp/ansible-tmp-1727096223.1431205-16444-154418592111246/AnsiballZ_package_facts.py <<< 15500 1727096223.22402: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096223.1431205-16444-154418592111246/AnsiballZ_package_facts.py" <<< 15500 1727096223.22600: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp7c8d1syx" to remote "/root/.ansible/tmp/ansible-tmp-1727096223.1431205-16444-154418592111246/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096223.1431205-16444-154418592111246/AnsiballZ_package_facts.py" <<< 15500 1727096223.26006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096223.26027: stderr chunk (state=3): >>><<< 15500 1727096223.26036: stdout chunk (state=3): >>><<< 15500 1727096223.26095: done transferring module to remote 15500 1727096223.26111: _low_level_execute_command(): starting 15500 1727096223.26137: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096223.1431205-16444-154418592111246/ /root/.ansible/tmp/ansible-tmp-1727096223.1431205-16444-154418592111246/AnsiballZ_package_facts.py && sleep 0' 15500 1727096223.27489: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096223.27688: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096223.27709: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096223.27724: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096223.28064: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096223.28127: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 15500 1727096223.30115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096223.30119: stdout chunk (state=3): >>><<< 15500 1727096223.30122: stderr chunk (state=3): >>><<< 15500 1727096223.30137: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096223.30151: _low_level_execute_command(): starting 15500 1727096223.30353: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096223.1431205-16444-154418592111246/AnsiballZ_package_facts.py && sleep 0' 15500 1727096223.31772: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096223.31789: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096223.31806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096223.31921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096223.76418: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": 
[{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": 
"51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 15500 1727096223.76550: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": 
"6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": 
"2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", 
"version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", 
"release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 15500 1727096223.76846: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": 
"510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15500 1727096223.78479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 15500 1727096223.78485: stdout chunk (state=3): >>><<< 15500 1727096223.78487: stderr chunk (state=3): >>><<< 15500 1727096223.78639: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 15500 1727096223.83458: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096223.1431205-16444-154418592111246/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096223.83617: _low_level_execute_command(): starting 15500 1727096223.83621: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096223.1431205-16444-154418592111246/ > /dev/null 2>&1 && sleep 0' 15500 1727096223.84796: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096223.84887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096223.85083: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096223.85096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096223.85181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096223.87104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096223.87151: stderr chunk (state=3): >>><<< 15500 1727096223.87154: stdout chunk (state=3): >>><<< 15500 1727096223.87177: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096223.87189: handler run complete 15500 1727096223.88836: variable 'ansible_facts' from source: unknown 15500 1727096223.90074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096223.93883: variable 'ansible_facts' from source: unknown 15500 1727096223.94774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096223.96159: attempt loop complete, returning result 15500 1727096223.96382: _execute() done 15500 1727096223.96386: dumping result to json 15500 1727096223.96724: done dumping result, returning 15500 1727096223.96742: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-877d-2da0-0000000002fc] 15500 1727096223.96925: sending task result for task 0afff68d-5257-877d-2da0-0000000002fc 15500 1727096224.01430: done sending task result for task 0afff68d-5257-877d-2da0-0000000002fc 15500 1727096224.01435: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15500 1727096224.01587: no more pending results, returning what we have 15500 1727096224.01591: results queue empty 15500 1727096224.01592: checking for any_errors_fatal 15500 1727096224.01597: done checking for any_errors_fatal 15500 1727096224.01598: checking for max_fail_percentage 15500 1727096224.01600: done checking for max_fail_percentage 15500 1727096224.01600: checking to see if all hosts have failed and the running result is not ok 15500 1727096224.01602: done checking to see if all hosts have failed 15500 1727096224.01602: getting the remaining hosts for this loop 15500 1727096224.01603: done getting the remaining hosts for this loop 15500 1727096224.01607: getting the next task for host managed_node1 15500 1727096224.01614: done getting next task for host managed_node1 15500 1727096224.01617: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15500 1727096224.01619: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096224.01629: getting variables 15500 1727096224.01631: in VariableManager get_vars() 15500 1727096224.01666: Calling all_inventory to load vars for managed_node1 15500 1727096224.01873: Calling groups_inventory to load vars for managed_node1 15500 1727096224.01877: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096224.01886: Calling all_plugins_play to load vars for managed_node1 15500 1727096224.01889: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096224.01892: Calling groups_plugins_play to load vars for managed_node1 15500 1727096224.04285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096224.07624: done with get_vars() 15500 1727096224.07660: done getting variables 15500 1727096224.07927: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:57:04 -0400 (0:00:01.001) 0:00:24.122 ****** 15500 1727096224.07961: entering _queue_task() for managed_node1/debug 15500 1727096224.08712: worker is 1 (out of 1 available) 15500 1727096224.08725: exiting _queue_task() for managed_node1/debug 15500 1727096224.08737: done queuing things up, now waiting for results queue to drain 15500 1727096224.08739: waiting for pending results... 15500 1727096224.09187: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 15500 1727096224.09194: in run() - task 0afff68d-5257-877d-2da0-00000000003b 15500 1727096224.09574: variable 'ansible_search_path' from source: unknown 15500 1727096224.09578: variable 'ansible_search_path' from source: unknown 15500 1727096224.09581: calling self._execute() 15500 1727096224.09583: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096224.09586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096224.09588: variable 'omit' from source: magic vars 15500 1727096224.10574: variable 'ansible_distribution_major_version' from source: facts 15500 1727096224.10577: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096224.10580: variable 'omit' from source: magic vars 15500 1727096224.10582: variable 'omit' from source: magic vars 15500 1727096224.10672: variable 'network_provider' from source: set_fact 15500 1727096224.10972: variable 'omit' from source: magic vars 15500 1727096224.10976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096224.10979: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096224.11005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096224.11027: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096224.11045: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 
1727096224.11081: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096224.11373: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096224.11376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096224.11398: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096224.11409: Set connection var ansible_pipelining to False 15500 1727096224.11419: Set connection var ansible_timeout to 10 15500 1727096224.11424: Set connection var ansible_shell_type to sh 15500 1727096224.11432: Set connection var ansible_shell_executable to /bin/sh 15500 1727096224.11442: Set connection var ansible_connection to ssh 15500 1727096224.11470: variable 'ansible_shell_executable' from source: unknown 15500 1727096224.11772: variable 'ansible_connection' from source: unknown 15500 1727096224.11776: variable 'ansible_module_compression' from source: unknown 15500 1727096224.11778: variable 'ansible_shell_type' from source: unknown 15500 1727096224.11780: variable 'ansible_shell_executable' from source: unknown 15500 1727096224.11782: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096224.11784: variable 'ansible_pipelining' from source: unknown 15500 1727096224.11786: variable 'ansible_timeout' from source: unknown 15500 1727096224.11788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096224.11862: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096224.11880: variable 'omit' from source: magic vars 15500 1727096224.11890: starting attempt loop 15500 1727096224.11896: running the handler 15500 1727096224.11945: handler run complete 15500 1727096224.12272: attempt loop complete, returning result 15500 1727096224.12275: _execute() done 15500 1727096224.12278: dumping result to json 15500 1727096224.12280: done dumping result, returning 15500 1727096224.12282: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-877d-2da0-00000000003b] 15500 1727096224.12284: sending task result for task 0afff68d-5257-877d-2da0-00000000003b 15500 1727096224.12351: done sending task result for task 0afff68d-5257-877d-2da0-00000000003b 15500 1727096224.12354: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 15500 1727096224.12424: no more pending results, returning what we have 15500 1727096224.12428: results queue empty 15500 1727096224.12429: checking for any_errors_fatal 15500 1727096224.12439: done checking for any_errors_fatal 15500 1727096224.12439: checking for max_fail_percentage 15500 1727096224.12441: done checking for max_fail_percentage 15500 1727096224.12441: checking to see if all hosts have failed and the running result is not ok 15500 1727096224.12442: done checking to see if all hosts have failed 15500 1727096224.12443: getting the remaining hosts for this loop 15500 1727096224.12444: done getting the remaining hosts for this loop 15500 1727096224.12448: getting the next task for host managed_node1 15500 1727096224.12454: done getting next task for host managed_node1 15500 1727096224.12460: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 15500 1727096224.12462: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096224.12475: getting variables 15500 1727096224.12476: in VariableManager get_vars() 15500 1727096224.12512: Calling all_inventory to load vars for managed_node1 15500 1727096224.12515: Calling groups_inventory to load vars for managed_node1 15500 1727096224.12517: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096224.12526: Calling all_plugins_play to load vars for managed_node1 15500 1727096224.12529: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096224.12531: Calling groups_plugins_play to load vars for managed_node1 15500 1727096224.15427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096224.18681: done with get_vars() 15500 1727096224.18714: done getting variables 15500 1727096224.18989: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 08:57:04 -0400 (0:00:00.110) 0:00:24.233 ****** 15500 1727096224.19024: entering _queue_task() for managed_node1/fail 15500 1727096224.20003: worker is 1 (out of 1 available) 15500 1727096224.20014: exiting _queue_task() for managed_node1/fail 15500 1727096224.20024: done queuing things up, now waiting for results queue to drain 15500 1727096224.20025: waiting for pending results... 
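For context on the tasks traced around this point (roles/network/tasks/main.yml:7 and :11): the debug task that printed "Using network provider: nm" and the fail-based guard for the initscripts provider are typically written roughly as below. This is a sketch inferred from the log, not the verbatim role source; the network_provider == "initscripts" clause is an assumption based on the task name, since the log only shows network_state != {} being evaluated before the task is skipped.

# Sketch only: approximates the two tasks traced in this log,
# not the verbatim content of the shipped role.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"

- name: >-
    Abort applying the network state configuration if using the
    network_state variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider.  # message text assumed
  when:
    - network_state != {}                # evaluated False in this run, so the task is skipped
    - network_provider == "initscripts"  # assumed second condition; never reached here
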
15500 1727096224.20241: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15500 1727096224.20335: in run() - task 0afff68d-5257-877d-2da0-00000000003c 15500 1727096224.20348: variable 'ansible_search_path' from source: unknown 15500 1727096224.20352: variable 'ansible_search_path' from source: unknown 15500 1727096224.20615: calling self._execute() 15500 1727096224.20827: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096224.20831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096224.20842: variable 'omit' from source: magic vars 15500 1727096224.21424: variable 'ansible_distribution_major_version' from source: facts 15500 1727096224.21441: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096224.21604: variable 'network_state' from source: role '' defaults 15500 1727096224.21620: Evaluated conditional (network_state != {}): False 15500 1727096224.21626: when evaluation is False, skipping this task 15500 1727096224.21633: _execute() done 15500 1727096224.21639: dumping result to json 15500 1727096224.21646: done dumping result, returning 15500 1727096224.21711: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-877d-2da0-00000000003c] 15500 1727096224.21722: sending task result for task 0afff68d-5257-877d-2da0-00000000003c skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15500 1727096224.22074: no more pending results, returning what we have 15500 1727096224.22078: results queue empty 15500 1727096224.22079: checking for any_errors_fatal 15500 1727096224.22088: done checking for any_errors_fatal 15500 1727096224.22089: checking for max_fail_percentage 15500 1727096224.22090: done checking for max_fail_percentage 15500 1727096224.22091: checking to see if all hosts have failed and the running result is not ok 15500 1727096224.22092: done checking to see if all hosts have failed 15500 1727096224.22093: getting the remaining hosts for this loop 15500 1727096224.22094: done getting the remaining hosts for this loop 15500 1727096224.22098: getting the next task for host managed_node1 15500 1727096224.22105: done getting next task for host managed_node1 15500 1727096224.22109: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15500 1727096224.22112: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096224.22126: getting variables 15500 1727096224.22129: in VariableManager get_vars() 15500 1727096224.22172: Calling all_inventory to load vars for managed_node1 15500 1727096224.22175: Calling groups_inventory to load vars for managed_node1 15500 1727096224.22177: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096224.22188: Calling all_plugins_play to load vars for managed_node1 15500 1727096224.22191: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096224.22194: Calling groups_plugins_play to load vars for managed_node1 15500 1727096224.22802: done sending task result for task 0afff68d-5257-877d-2da0-00000000003c 15500 1727096224.22805: WORKER PROCESS EXITING 15500 1727096224.24545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096224.26694: done with get_vars() 15500 1727096224.26727: done getting variables 15500 1727096224.26803: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:57:04 -0400 (0:00:00.078) 0:00:24.311 ****** 15500 1727096224.26834: entering _queue_task() for managed_node1/fail 15500 1727096224.27198: worker is 1 (out of 1 available) 15500 1727096224.27369: exiting _queue_task() for managed_node1/fail 15500 1727096224.27380: done queuing things up, now waiting for results queue to drain 15500 1727096224.27382: waiting for pending results... 
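The guard queued here (main.yml:18) follows the same pattern. A minimal sketch, assuming the version check that the task name implies; the log only shows network_state != {} being evaluated (False), so the version condition is never reached.

# Sketch only, not the verbatim role source.
- name: >-
    Abort applying the network state configuration if the system version
    of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying network_state requires a managed host running EL 8 or later.  # message text assumed
  when:
    - network_state != {}                           # evaluated False here; the task is skipped
    - ansible_distribution_major_version | int < 8  # assumed; not reached in this run
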
15500 1727096224.27525: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15500 1727096224.27648: in run() - task 0afff68d-5257-877d-2da0-00000000003d 15500 1727096224.27673: variable 'ansible_search_path' from source: unknown 15500 1727096224.27682: variable 'ansible_search_path' from source: unknown 15500 1727096224.27736: calling self._execute() 15500 1727096224.27847: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096224.27862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096224.27881: variable 'omit' from source: magic vars 15500 1727096224.28300: variable 'ansible_distribution_major_version' from source: facts 15500 1727096224.28317: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096224.28452: variable 'network_state' from source: role '' defaults 15500 1727096224.28484: Evaluated conditional (network_state != {}): False 15500 1727096224.28487: when evaluation is False, skipping this task 15500 1727096224.28574: _execute() done 15500 1727096224.28577: dumping result to json 15500 1727096224.28579: done dumping result, returning 15500 1727096224.28582: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-877d-2da0-00000000003d] 15500 1727096224.28587: sending task result for task 0afff68d-5257-877d-2da0-00000000003d 15500 1727096224.28659: done sending task result for task 0afff68d-5257-877d-2da0-00000000003d 15500 1727096224.28663: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15500 1727096224.28714: no more pending results, returning what we have 15500 1727096224.28718: results queue empty 15500 1727096224.28719: checking for any_errors_fatal 15500 1727096224.28727: done checking for any_errors_fatal 15500 1727096224.28728: checking for max_fail_percentage 15500 1727096224.28730: done checking for max_fail_percentage 15500 1727096224.28730: checking to see if all hosts have failed and the running result is not ok 15500 1727096224.28731: done checking to see if all hosts have failed 15500 1727096224.28732: getting the remaining hosts for this loop 15500 1727096224.28734: done getting the remaining hosts for this loop 15500 1727096224.28737: getting the next task for host managed_node1 15500 1727096224.28744: done getting next task for host managed_node1 15500 1727096224.28748: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15500 1727096224.28751: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096224.28972: getting variables 15500 1727096224.28974: in VariableManager get_vars() 15500 1727096224.29007: Calling all_inventory to load vars for managed_node1 15500 1727096224.29009: Calling groups_inventory to load vars for managed_node1 15500 1727096224.29012: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096224.29020: Calling all_plugins_play to load vars for managed_node1 15500 1727096224.29022: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096224.29025: Calling groups_plugins_play to load vars for managed_node1 15500 1727096224.30543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096224.32196: done with get_vars() 15500 1727096224.32224: done getting variables 15500 1727096224.32295: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 08:57:04 -0400 (0:00:00.054) 0:00:24.366 ****** 15500 1727096224.32327: entering _queue_task() for managed_node1/fail 15500 1727096224.32791: worker is 1 (out of 1 available) 15500 1727096224.32803: exiting _queue_task() for managed_node1/fail 15500 1727096224.32814: done queuing things up, now waiting for results queue to drain 15500 1727096224.32815: waiting for pending results... 
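The teaming guard queued here (main.yml:25) is the first task in this run whose full conditional chain is visible in the trace below: the EL version check, the distribution check, and the selectattr filter over network_connections and network_state. A sketch assembled from those evaluated conditionals; the failure message is an assumption.

# Sketch only; conditions taken from the conditionals evaluated in this log.
- name: >-
    Abort applying teaming configuration if the system version of the
    managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later.  # message text assumed
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
    - >-
      network_connections | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0
      or network_state.get("interfaces", []) | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0
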
15500 1727096224.33166: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15500 1727096224.33173: in run() - task 0afff68d-5257-877d-2da0-00000000003e 15500 1727096224.33177: variable 'ansible_search_path' from source: unknown 15500 1727096224.33179: variable 'ansible_search_path' from source: unknown 15500 1727096224.33207: calling self._execute() 15500 1727096224.33314: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096224.33327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096224.33343: variable 'omit' from source: magic vars 15500 1727096224.33738: variable 'ansible_distribution_major_version' from source: facts 15500 1727096224.33755: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096224.33975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096224.36320: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096224.36396: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096224.36573: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096224.36576: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096224.36579: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096224.36594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096224.36625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096224.36654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096224.36704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096224.36722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096224.36829: variable 'ansible_distribution_major_version' from source: facts 15500 1727096224.36848: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15500 1727096224.36975: variable 'ansible_distribution' from source: facts 15500 1727096224.36987: variable '__network_rh_distros' from source: role '' defaults 15500 1727096224.37002: Evaluated conditional (ansible_distribution in __network_rh_distros): True 15500 1727096224.37266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096224.37297: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096224.37326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096224.37376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096224.37396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096224.37453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096224.37483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096224.37562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096224.37566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096224.37574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096224.37620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096224.37649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096224.37684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096224.37725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096224.37745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096224.38091: variable 'network_connections' from source: play vars 15500 1727096224.38110: variable 'profile' from source: play vars 15500 1727096224.38214: variable 'profile' from source: play vars 15500 1727096224.38217: variable 'interface' from source: set_fact 15500 1727096224.38248: variable 'interface' from source: set_fact 15500 1727096224.38262: variable 'network_state' from source: role '' defaults 15500 
1727096224.38332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096224.38505: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096224.38546: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096224.38580: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096224.38612: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096224.38757: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096224.38768: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096224.38771: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096224.38773: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096224.38776: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 15500 1727096224.38778: when evaluation is False, skipping this task 15500 1727096224.38780: _execute() done 15500 1727096224.38782: dumping result to json 15500 1727096224.38791: done dumping result, returning 15500 1727096224.38803: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-877d-2da0-00000000003e] 15500 1727096224.38811: sending task result for task 0afff68d-5257-877d-2da0-00000000003e 15500 1727096224.39078: done sending task result for task 0afff68d-5257-877d-2da0-00000000003e 15500 1727096224.39081: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 15500 1727096224.39132: no more pending results, returning what we have 15500 1727096224.39136: results queue empty 15500 1727096224.39137: checking for any_errors_fatal 15500 1727096224.39145: done checking for any_errors_fatal 15500 1727096224.39146: checking for max_fail_percentage 15500 1727096224.39148: done checking for max_fail_percentage 15500 1727096224.39149: checking to see if all hosts have failed and the running result is not ok 15500 1727096224.39150: done checking to see if all hosts have failed 15500 1727096224.39151: getting the remaining hosts for this loop 15500 1727096224.39152: done getting the remaining hosts for this loop 15500 1727096224.39157: getting the next 
task for host managed_node1 15500 1727096224.39164: done getting next task for host managed_node1 15500 1727096224.39171: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15500 1727096224.39173: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096224.39186: getting variables 15500 1727096224.39188: in VariableManager get_vars() 15500 1727096224.39225: Calling all_inventory to load vars for managed_node1 15500 1727096224.39227: Calling groups_inventory to load vars for managed_node1 15500 1727096224.39230: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096224.39239: Calling all_plugins_play to load vars for managed_node1 15500 1727096224.39241: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096224.39244: Calling groups_plugins_play to load vars for managed_node1 15500 1727096224.40729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096224.42357: done with get_vars() 15500 1727096224.42383: done getting variables 15500 1727096224.42443: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 08:57:04 -0400 (0:00:00.101) 0:00:24.467 ****** 15500 1727096224.42476: entering _queue_task() for managed_node1/dnf 15500 1727096224.42831: worker is 1 (out of 1 available) 15500 1727096224.42845: exiting _queue_task() for managed_node1/dnf 15500 1727096224.42857: done queuing things up, now waiting for results queue to drain 15500 1727096224.42859: waiting for pending results... 
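The DNF check queued here (main.yml:36) uses the dnf action with two conditions that the trace below evaluates: the Fedora/EL8+ check and the wireless/team flags. A sketch under the assumption that the role passes its package list through a variable such as network_packages and runs the module in check mode; those details are illustrative, and only the task name, module, and conditions are taken from the log.

# Sketch only; package variable and check_mode handling are assumptions.
- name: >-
    Check if updates for network packages are available through the DNF
    package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"  # package list variable assumed for illustration
    state: latest
  check_mode: true
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined
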
15500 1727096224.43146: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15500 1727096224.43251: in run() - task 0afff68d-5257-877d-2da0-00000000003f 15500 1727096224.43272: variable 'ansible_search_path' from source: unknown 15500 1727096224.43280: variable 'ansible_search_path' from source: unknown 15500 1727096224.43326: calling self._execute() 15500 1727096224.43424: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096224.43437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096224.43451: variable 'omit' from source: magic vars 15500 1727096224.43814: variable 'ansible_distribution_major_version' from source: facts 15500 1727096224.43832: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096224.44023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096224.46324: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096224.46401: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096224.46447: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096224.46488: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096224.46524: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096224.46608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096224.46646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096224.46678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096224.46737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096224.46842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096224.46878: variable 'ansible_distribution' from source: facts 15500 1727096224.46889: variable 'ansible_distribution_major_version' from source: facts 15500 1727096224.46912: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15500 1727096224.47035: variable '__network_wireless_connections_defined' from source: role '' defaults 15500 1727096224.47167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096224.47201: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096224.47229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096224.47283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096224.47297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096224.47373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096224.47376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096224.47402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096224.47446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096224.47466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096224.47517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096224.47573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096224.47576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096224.47633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096224.47655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096224.47763: variable 'network_connections' from source: play vars 15500 1727096224.47772: variable 'profile' from source: play vars 15500 1727096224.47824: variable 'profile' from source: play vars 15500 1727096224.47827: variable 'interface' from source: set_fact 15500 1727096224.47871: variable 'interface' from source: set_fact 15500 1727096224.47921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 15500 1727096224.48052: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096224.48082: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096224.48105: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096224.48127: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096224.48162: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096224.48177: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096224.48198: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096224.48217: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096224.48253: variable '__network_team_connections_defined' from source: role '' defaults 15500 1727096224.48403: variable 'network_connections' from source: play vars 15500 1727096224.48407: variable 'profile' from source: play vars 15500 1727096224.48450: variable 'profile' from source: play vars 15500 1727096224.48453: variable 'interface' from source: set_fact 15500 1727096224.48497: variable 'interface' from source: set_fact 15500 1727096224.48516: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15500 1727096224.48519: when evaluation is False, skipping this task 15500 1727096224.48521: _execute() done 15500 1727096224.48524: dumping result to json 15500 1727096224.48528: done dumping result, returning 15500 1727096224.48535: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-877d-2da0-00000000003f] 15500 1727096224.48539: sending task result for task 0afff68d-5257-877d-2da0-00000000003f 15500 1727096224.48632: done sending task result for task 0afff68d-5257-877d-2da0-00000000003f 15500 1727096224.48636: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15500 1727096224.48687: no more pending results, returning what we have 15500 1727096224.48690: results queue empty 15500 1727096224.48691: checking for any_errors_fatal 15500 1727096224.48695: done checking for any_errors_fatal 15500 1727096224.48697: checking for max_fail_percentage 15500 1727096224.48698: done checking for max_fail_percentage 15500 1727096224.48699: checking to see if all hosts have failed and the running result is not ok 15500 1727096224.48700: done checking to see if all hosts have failed 15500 1727096224.48700: getting the remaining hosts for this loop 15500 1727096224.48702: done getting the remaining hosts for this loop 15500 
1727096224.48705: getting the next task for host managed_node1 15500 1727096224.48711: done getting next task for host managed_node1 15500 1727096224.48715: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15500 1727096224.48716: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096224.48729: getting variables 15500 1727096224.48731: in VariableManager get_vars() 15500 1727096224.48770: Calling all_inventory to load vars for managed_node1 15500 1727096224.48773: Calling groups_inventory to load vars for managed_node1 15500 1727096224.48776: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096224.48784: Calling all_plugins_play to load vars for managed_node1 15500 1727096224.48787: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096224.48789: Calling groups_plugins_play to load vars for managed_node1 15500 1727096224.49866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096224.51049: done with get_vars() 15500 1727096224.51076: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15500 1727096224.51133: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 08:57:04 -0400 (0:00:00.086) 0:00:24.554 ****** 15500 1727096224.51155: entering _queue_task() for managed_node1/yum 15500 1727096224.51413: worker is 1 (out of 1 available) 15500 1727096224.51429: exiting _queue_task() for managed_node1/yum 15500 1727096224.51440: done queuing things up, now waiting for results queue to drain 15500 1727096224.51442: waiting for pending results... 
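The YUM counterpart queued here (main.yml:48) is the same check for EL7 hosts; note that the log shows ansible.builtin.yum being redirected to ansible.builtin.dnf. A sketch mirroring the DNF version, with the package variable and check-mode handling again assumed; the log confirms only the version condition (ansible_distribution_major_version | int < 8), which evaluates False.

# Sketch only, mirroring the DNF task above.
- name: >-
    Check if updates for network packages are available through the YUM
    package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"  # package list variable assumed, as in the DNF sketch
    state: latest
  check_mode: true
  when:
    - ansible_distribution_major_version | int < 8                                  # evaluated False here; the task is skipped
    - __network_wireless_connections_defined or __network_team_connections_defined  # assumed, mirroring the DNF task
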
15500 1727096224.51615: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15500 1727096224.51691: in run() - task 0afff68d-5257-877d-2da0-000000000040 15500 1727096224.51702: variable 'ansible_search_path' from source: unknown 15500 1727096224.51706: variable 'ansible_search_path' from source: unknown 15500 1727096224.51736: calling self._execute() 15500 1727096224.51811: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096224.51815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096224.51824: variable 'omit' from source: magic vars 15500 1727096224.52106: variable 'ansible_distribution_major_version' from source: facts 15500 1727096224.52115: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096224.52241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096224.54407: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096224.54454: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096224.54486: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096224.54511: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096224.54536: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096224.54603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096224.54622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096224.54645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096224.54674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096224.54686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096224.54755: variable 'ansible_distribution_major_version' from source: facts 15500 1727096224.54771: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15500 1727096224.54775: when evaluation is False, skipping this task 15500 1727096224.54777: _execute() done 15500 1727096224.54780: dumping result to json 15500 1727096224.54782: done dumping result, returning 15500 1727096224.54791: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-877d-2da0-000000000040] 15500 
1727096224.54794: sending task result for task 0afff68d-5257-877d-2da0-000000000040 15500 1727096224.54886: done sending task result for task 0afff68d-5257-877d-2da0-000000000040 15500 1727096224.54888: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15500 1727096224.54936: no more pending results, returning what we have 15500 1727096224.54939: results queue empty 15500 1727096224.54940: checking for any_errors_fatal 15500 1727096224.54946: done checking for any_errors_fatal 15500 1727096224.54947: checking for max_fail_percentage 15500 1727096224.54949: done checking for max_fail_percentage 15500 1727096224.54949: checking to see if all hosts have failed and the running result is not ok 15500 1727096224.54950: done checking to see if all hosts have failed 15500 1727096224.54951: getting the remaining hosts for this loop 15500 1727096224.54953: done getting the remaining hosts for this loop 15500 1727096224.54956: getting the next task for host managed_node1 15500 1727096224.54965: done getting next task for host managed_node1 15500 1727096224.54970: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15500 1727096224.54972: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096224.54986: getting variables 15500 1727096224.54988: in VariableManager get_vars() 15500 1727096224.55026: Calling all_inventory to load vars for managed_node1 15500 1727096224.55029: Calling groups_inventory to load vars for managed_node1 15500 1727096224.55031: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096224.55040: Calling all_plugins_play to load vars for managed_node1 15500 1727096224.55043: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096224.55045: Calling groups_plugins_play to load vars for managed_node1 15500 1727096224.56203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096224.57727: done with get_vars() 15500 1727096224.57761: done getting variables 15500 1727096224.57824: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 08:57:04 -0400 (0:00:00.066) 0:00:24.621 ****** 15500 1727096224.57860: entering _queue_task() for managed_node1/fail 15500 1727096224.58230: worker is 1 (out of 1 available) 15500 1727096224.58244: exiting _queue_task() for managed_node1/fail 15500 1727096224.58259: done queuing things up, now waiting for results queue to drain 15500 1727096224.58261: waiting for pending results... 
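The fail action plugin loaded just above backs the consent prompt at roles/network/tasks/main.yml:60, which is skipped a few entries later because __network_wireless_connections_defined or __network_team_connections_defined evaluates to False for this profile. A minimal sketch of that pattern, assuming an illustrative message body (the role's actual wording is not shown in this log):

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: >-                             # assumption: message text is illustrative, not the role's own
      Activating wireless or team connections requires restarting
      NetworkManager on this host.
  when: __network_wireless_connections_defined or __network_team_connections_defined
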
15500 1727096224.58688: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15500 1727096224.58702: in run() - task 0afff68d-5257-877d-2da0-000000000041 15500 1727096224.58731: variable 'ansible_search_path' from source: unknown 15500 1727096224.58738: variable 'ansible_search_path' from source: unknown 15500 1727096224.58787: calling self._execute() 15500 1727096224.58891: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096224.58907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096224.58931: variable 'omit' from source: magic vars 15500 1727096224.59319: variable 'ansible_distribution_major_version' from source: facts 15500 1727096224.59329: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096224.59418: variable '__network_wireless_connections_defined' from source: role '' defaults 15500 1727096224.59573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096224.61738: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096224.61790: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096224.61817: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096224.61845: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096224.61869: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096224.61933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096224.61955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096224.61977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096224.62003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096224.62014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096224.62172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096224.62175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096224.62177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096224.62179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096224.62181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096224.62198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096224.62220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096224.62245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096224.62287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096224.62303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096224.62475: variable 'network_connections' from source: play vars 15500 1727096224.62492: variable 'profile' from source: play vars 15500 1727096224.62565: variable 'profile' from source: play vars 15500 1727096224.62576: variable 'interface' from source: set_fact 15500 1727096224.62635: variable 'interface' from source: set_fact 15500 1727096224.62693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096224.62825: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096224.62853: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096224.62879: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096224.62901: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096224.62934: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096224.62950: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096224.62971: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096224.62989: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096224.63026: 
variable '__network_team_connections_defined' from source: role '' defaults 15500 1727096224.63183: variable 'network_connections' from source: play vars 15500 1727096224.63186: variable 'profile' from source: play vars 15500 1727096224.63229: variable 'profile' from source: play vars 15500 1727096224.63232: variable 'interface' from source: set_fact 15500 1727096224.63280: variable 'interface' from source: set_fact 15500 1727096224.63299: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15500 1727096224.63302: when evaluation is False, skipping this task 15500 1727096224.63305: _execute() done 15500 1727096224.63308: dumping result to json 15500 1727096224.63310: done dumping result, returning 15500 1727096224.63317: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-877d-2da0-000000000041] 15500 1727096224.63327: sending task result for task 0afff68d-5257-877d-2da0-000000000041 15500 1727096224.63408: done sending task result for task 0afff68d-5257-877d-2da0-000000000041 15500 1727096224.63411: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15500 1727096224.63456: no more pending results, returning what we have 15500 1727096224.63460: results queue empty 15500 1727096224.63460: checking for any_errors_fatal 15500 1727096224.63465: done checking for any_errors_fatal 15500 1727096224.63466: checking for max_fail_percentage 15500 1727096224.63470: done checking for max_fail_percentage 15500 1727096224.63470: checking to see if all hosts have failed and the running result is not ok 15500 1727096224.63471: done checking to see if all hosts have failed 15500 1727096224.63472: getting the remaining hosts for this loop 15500 1727096224.63473: done getting the remaining hosts for this loop 15500 1727096224.63477: getting the next task for host managed_node1 15500 1727096224.63483: done getting next task for host managed_node1 15500 1727096224.63487: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15500 1727096224.63489: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096224.63502: getting variables 15500 1727096224.63503: in VariableManager get_vars() 15500 1727096224.63540: Calling all_inventory to load vars for managed_node1 15500 1727096224.63542: Calling groups_inventory to load vars for managed_node1 15500 1727096224.63544: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096224.63553: Calling all_plugins_play to load vars for managed_node1 15500 1727096224.63556: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096224.63558: Calling groups_plugins_play to load vars for managed_node1 15500 1727096224.64760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096224.66159: done with get_vars() 15500 1727096224.66204: done getting variables 15500 1727096224.66290: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 08:57:04 -0400 (0:00:00.084) 0:00:24.706 ****** 15500 1727096224.66381: entering _queue_task() for managed_node1/package 15500 1727096224.66926: worker is 1 (out of 1 available) 15500 1727096224.66941: exiting _queue_task() for managed_node1/package 15500 1727096224.66954: done queuing things up, now waiting for results queue to drain 15500 1727096224.66955: waiting for pending results... 15500 1727096224.67387: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 15500 1727096224.67398: in run() - task 0afff68d-5257-877d-2da0-000000000042 15500 1727096224.67424: variable 'ansible_search_path' from source: unknown 15500 1727096224.67432: variable 'ansible_search_path' from source: unknown 15500 1727096224.67573: calling self._execute() 15500 1727096224.67611: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096224.67624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096224.67641: variable 'omit' from source: magic vars 15500 1727096224.68076: variable 'ansible_distribution_major_version' from source: facts 15500 1727096224.68099: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096224.68326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096224.68619: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096224.68675: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096224.68723: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096224.68793: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096224.68899: variable 'network_packages' from source: role '' defaults 15500 1727096224.69005: variable '__network_provider_setup' from source: role '' defaults 15500 1727096224.69024: variable '__network_service_name_default_nm' from source: role '' defaults 15500 1727096224.69123: variable 
'__network_service_name_default_nm' from source: role '' defaults 15500 1727096224.69126: variable '__network_packages_default_nm' from source: role '' defaults 15500 1727096224.69236: variable '__network_packages_default_nm' from source: role '' defaults 15500 1727096224.69420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096224.80766: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096224.81076: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096224.81080: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096224.81114: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096224.81139: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096224.81313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096224.81341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096224.81372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096224.81424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096224.81440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096224.81503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096224.81527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096224.81551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096224.81648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096224.81664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096224.82262: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15500 1727096224.82587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096224.82794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096224.82849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096224.83045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096224.83124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096224.83309: variable 'ansible_python' from source: facts 15500 1727096224.83337: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15500 1727096224.83436: variable '__network_wpa_supplicant_required' from source: role '' defaults 15500 1727096224.83617: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15500 1727096224.83746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096224.83773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096224.83798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096224.83842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096224.83859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096224.83904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096224.83977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096224.83980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096224.83996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096224.84010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096224.84195: variable 'network_connections' from source: play vars 15500 1727096224.84201: variable 'profile' from source: play vars 15500 1727096224.84305: variable 'profile' from source: play vars 15500 1727096224.84311: variable 'interface' from source: set_fact 15500 1727096224.84383: variable 'interface' from source: set_fact 15500 1727096224.84463: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096224.84485: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096224.84676: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096224.84737: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096224.84785: variable '__network_wireless_connections_defined' from source: role '' defaults 15500 1727096224.85486: variable 'network_connections' from source: play vars 15500 1727096224.85489: variable 'profile' from source: play vars 15500 1727096224.85777: variable 'profile' from source: play vars 15500 1727096224.85782: variable 'interface' from source: set_fact 15500 1727096224.85785: variable 'interface' from source: set_fact 15500 1727096224.85828: variable '__network_packages_default_wireless' from source: role '' defaults 15500 1727096224.85977: variable '__network_wireless_connections_defined' from source: role '' defaults 15500 1727096224.86476: variable 'network_connections' from source: play vars 15500 1727096224.86481: variable 'profile' from source: play vars 15500 1727096224.86622: variable 'profile' from source: play vars 15500 1727096224.86625: variable 'interface' from source: set_fact 15500 1727096224.86765: variable 'interface' from source: set_fact 15500 1727096224.86808: variable '__network_packages_default_team' from source: role '' defaults 15500 1727096224.86910: variable '__network_team_connections_defined' from source: role '' defaults 15500 1727096224.87902: variable 'network_connections' from source: play vars 15500 1727096224.87915: variable 'profile' from source: play vars 15500 1727096224.88206: variable 'profile' from source: play vars 15500 1727096224.88209: variable 'interface' from source: set_fact 15500 1727096224.88537: variable 'interface' from source: set_fact 15500 1727096224.88741: variable '__network_service_name_default_initscripts' from source: role '' defaults 15500 1727096224.88869: variable '__network_service_name_default_initscripts' from source: role '' defaults 15500 1727096224.88876: variable '__network_packages_default_initscripts' from source: role '' defaults 15500 1727096224.88944: variable '__network_packages_default_initscripts' from source: role '' defaults 15500 1727096224.89843: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15500 1727096224.91666: variable 'network_connections' from source: play vars 15500 1727096224.91675: variable 'profile' from source: play vars 15500 
1727096224.91931: variable 'profile' from source: play vars 15500 1727096224.91960: variable 'interface' from source: set_fact 15500 1727096224.92275: variable 'interface' from source: set_fact 15500 1727096224.92285: variable 'ansible_distribution' from source: facts 15500 1727096224.92288: variable '__network_rh_distros' from source: role '' defaults 15500 1727096224.92295: variable 'ansible_distribution_major_version' from source: facts 15500 1727096224.92311: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15500 1727096224.92929: variable 'ansible_distribution' from source: facts 15500 1727096224.93080: variable '__network_rh_distros' from source: role '' defaults 15500 1727096224.93086: variable 'ansible_distribution_major_version' from source: facts 15500 1727096224.93100: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15500 1727096224.93655: variable 'ansible_distribution' from source: facts 15500 1727096224.93662: variable '__network_rh_distros' from source: role '' defaults 15500 1727096224.93665: variable 'ansible_distribution_major_version' from source: facts 15500 1727096224.93750: variable 'network_provider' from source: set_fact 15500 1727096224.93873: variable 'ansible_facts' from source: unknown 15500 1727096224.95480: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 15500 1727096224.95491: when evaluation is False, skipping this task 15500 1727096224.95495: _execute() done 15500 1727096224.95500: dumping result to json 15500 1727096224.95502: done dumping result, returning 15500 1727096224.95512: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-877d-2da0-000000000042] 15500 1727096224.95515: sending task result for task 0afff68d-5257-877d-2da0-000000000042 15500 1727096224.95624: done sending task result for task 0afff68d-5257-877d-2da0-000000000042 15500 1727096224.95629: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15500 1727096224.95798: no more pending results, returning what we have 15500 1727096224.95802: results queue empty 15500 1727096224.95804: checking for any_errors_fatal 15500 1727096224.95812: done checking for any_errors_fatal 15500 1727096224.95813: checking for max_fail_percentage 15500 1727096224.95815: done checking for max_fail_percentage 15500 1727096224.95815: checking to see if all hosts have failed and the running result is not ok 15500 1727096224.95816: done checking to see if all hosts have failed 15500 1727096224.95817: getting the remaining hosts for this loop 15500 1727096224.95819: done getting the remaining hosts for this loop 15500 1727096224.95824: getting the next task for host managed_node1 15500 1727096224.95829: done getting next task for host managed_node1 15500 1727096224.95834: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15500 1727096224.95836: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096224.95855: getting variables 15500 1727096224.95857: in VariableManager get_vars() 15500 1727096224.96006: Calling all_inventory to load vars for managed_node1 15500 1727096224.96010: Calling groups_inventory to load vars for managed_node1 15500 1727096224.96012: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096224.96027: Calling all_plugins_play to load vars for managed_node1 15500 1727096224.96029: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096224.96032: Calling groups_plugins_play to load vars for managed_node1 15500 1727096225.09545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096225.12795: done with get_vars() 15500 1727096225.12832: done getting variables 15500 1727096225.13095: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 08:57:05 -0400 (0:00:00.467) 0:00:25.174 ****** 15500 1727096225.13123: entering _queue_task() for managed_node1/package 15500 1727096225.13696: worker is 1 (out of 1 available) 15500 1727096225.13708: exiting _queue_task() for managed_node1/package 15500 1727096225.13720: done queuing things up, now waiting for results queue to drain 15500 1727096225.13721: waiting for pending results... 
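Two guards are visible around this point: the Install packages task above was skipped because not network_packages is subset(ansible_facts.packages.keys()) is False, meaning every entry in network_packages is already installed, and the task just queued at roles/network/tasks/main.yml:85 only runs when network_state is non-empty. A hedged sketch of that second task, with package names inferred purely from the task title rather than from the role source:

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager                  # assumption: inferred from the task title
      - nmstate                         # assumption: inferred from the task title
    state: present
  when: network_state != {}
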
15500 1727096225.14694: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15500 1727096225.14975: in run() - task 0afff68d-5257-877d-2da0-000000000043 15500 1727096225.15073: variable 'ansible_search_path' from source: unknown 15500 1727096225.15078: variable 'ansible_search_path' from source: unknown 15500 1727096225.15081: calling self._execute() 15500 1727096225.15096: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096225.15109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096225.15125: variable 'omit' from source: magic vars 15500 1727096225.15981: variable 'ansible_distribution_major_version' from source: facts 15500 1727096225.16005: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096225.16277: variable 'network_state' from source: role '' defaults 15500 1727096225.16294: Evaluated conditional (network_state != {}): False 15500 1727096225.16302: when evaluation is False, skipping this task 15500 1727096225.16310: _execute() done 15500 1727096225.16317: dumping result to json 15500 1727096225.16330: done dumping result, returning 15500 1727096225.16346: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-877d-2da0-000000000043] 15500 1727096225.16381: sending task result for task 0afff68d-5257-877d-2da0-000000000043 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15500 1727096225.16547: no more pending results, returning what we have 15500 1727096225.16551: results queue empty 15500 1727096225.16552: checking for any_errors_fatal 15500 1727096225.16566: done checking for any_errors_fatal 15500 1727096225.16570: checking for max_fail_percentage 15500 1727096225.16572: done checking for max_fail_percentage 15500 1727096225.16573: checking to see if all hosts have failed and the running result is not ok 15500 1727096225.16574: done checking to see if all hosts have failed 15500 1727096225.16574: getting the remaining hosts for this loop 15500 1727096225.16576: done getting the remaining hosts for this loop 15500 1727096225.16580: getting the next task for host managed_node1 15500 1727096225.16587: done getting next task for host managed_node1 15500 1727096225.16592: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15500 1727096225.16594: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096225.16609: getting variables 15500 1727096225.16611: in VariableManager get_vars() 15500 1727096225.16652: Calling all_inventory to load vars for managed_node1 15500 1727096225.16655: Calling groups_inventory to load vars for managed_node1 15500 1727096225.16659: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096225.16672: Calling all_plugins_play to load vars for managed_node1 15500 1727096225.16675: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096225.16678: Calling groups_plugins_play to load vars for managed_node1 15500 1727096225.17375: done sending task result for task 0afff68d-5257-877d-2da0-000000000043 15500 1727096225.17379: WORKER PROCESS EXITING 15500 1727096225.19683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096225.23135: done with get_vars() 15500 1727096225.23216: done getting variables 15500 1727096225.23280: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 08:57:05 -0400 (0:00:00.101) 0:00:25.276 ****** 15500 1727096225.23312: entering _queue_task() for managed_node1/package 15500 1727096225.24123: worker is 1 (out of 1 available) 15500 1727096225.24136: exiting _queue_task() for managed_node1/package 15500 1727096225.24147: done queuing things up, now waiting for results queue to drain 15500 1727096225.24149: waiting for pending results... 
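The task at roles/network/tasks/main.yml:85 was skipped because network_state keeps its role default of an empty mapping, and the python3-libnmstate task queued above at main.yml:96 is gated on the same network_state != {} test. Purely as an illustration of what would flip those guards (the key layout under network_state is an assumption modeled on nmstate-style state files, not something shown in this run), a play driving the role through network_state might look like:

- hosts: managed_node1
  roles:
    - fedora.linux_system_roles.network
  vars:
    network_state:                      # any non-empty mapping makes network_state != {} evaluate True
      interfaces:
        - name: eth1                    # hypothetical interface name
          type: ethernet
          state: up
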
15500 1727096225.25186: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15500 1727096225.25195: in run() - task 0afff68d-5257-877d-2da0-000000000044 15500 1727096225.25199: variable 'ansible_search_path' from source: unknown 15500 1727096225.25202: variable 'ansible_search_path' from source: unknown 15500 1727096225.25573: calling self._execute() 15500 1727096225.25577: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096225.25581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096225.25584: variable 'omit' from source: magic vars 15500 1727096225.26726: variable 'ansible_distribution_major_version' from source: facts 15500 1727096225.27374: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096225.27377: variable 'network_state' from source: role '' defaults 15500 1727096225.27380: Evaluated conditional (network_state != {}): False 15500 1727096225.27383: when evaluation is False, skipping this task 15500 1727096225.27386: _execute() done 15500 1727096225.27388: dumping result to json 15500 1727096225.27390: done dumping result, returning 15500 1727096225.27392: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-877d-2da0-000000000044] 15500 1727096225.27395: sending task result for task 0afff68d-5257-877d-2da0-000000000044 15500 1727096225.28049: done sending task result for task 0afff68d-5257-877d-2da0-000000000044 15500 1727096225.28053: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15500 1727096225.28102: no more pending results, returning what we have 15500 1727096225.28106: results queue empty 15500 1727096225.28107: checking for any_errors_fatal 15500 1727096225.28115: done checking for any_errors_fatal 15500 1727096225.28116: checking for max_fail_percentage 15500 1727096225.28117: done checking for max_fail_percentage 15500 1727096225.28118: checking to see if all hosts have failed and the running result is not ok 15500 1727096225.28119: done checking to see if all hosts have failed 15500 1727096225.28120: getting the remaining hosts for this loop 15500 1727096225.28121: done getting the remaining hosts for this loop 15500 1727096225.28128: getting the next task for host managed_node1 15500 1727096225.28134: done getting next task for host managed_node1 15500 1727096225.28138: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15500 1727096225.28140: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096225.28157: getting variables 15500 1727096225.28159: in VariableManager get_vars() 15500 1727096225.28202: Calling all_inventory to load vars for managed_node1 15500 1727096225.28205: Calling groups_inventory to load vars for managed_node1 15500 1727096225.28208: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096225.28219: Calling all_plugins_play to load vars for managed_node1 15500 1727096225.28222: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096225.28225: Calling groups_plugins_play to load vars for managed_node1 15500 1727096225.31405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096225.34843: done with get_vars() 15500 1727096225.34877: done getting variables 15500 1727096225.35048: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:57:05 -0400 (0:00:00.117) 0:00:25.393 ****** 15500 1727096225.35084: entering _queue_task() for managed_node1/service 15500 1727096225.35777: worker is 1 (out of 1 available) 15500 1727096225.35788: exiting _queue_task() for managed_node1/service 15500 1727096225.36021: done queuing things up, now waiting for results queue to drain 15500 1727096225.36022: waiting for pending results... 
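The service action queued above for roles/network/tasks/main.yml:109 is skipped a few entries later by the same wireless/team guard as the earlier consent task. A minimal sketch of such a conditional restart, with the service name taken from the task title and the module arguments assumed:

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager                # assumption: service name inferred from the task title
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
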
15500 1727096225.36342: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15500 1727096225.36554: in run() - task 0afff68d-5257-877d-2da0-000000000045 15500 1727096225.36683: variable 'ansible_search_path' from source: unknown 15500 1727096225.36687: variable 'ansible_search_path' from source: unknown 15500 1727096225.36722: calling self._execute() 15500 1727096225.36934: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096225.36940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096225.36951: variable 'omit' from source: magic vars 15500 1727096225.37832: variable 'ansible_distribution_major_version' from source: facts 15500 1727096225.37837: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096225.38173: variable '__network_wireless_connections_defined' from source: role '' defaults 15500 1727096225.38525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096225.43175: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096225.43247: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096225.43361: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096225.43392: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096225.43419: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096225.43615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096225.43644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096225.44272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096225.44276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096225.44278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096225.44281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096225.44283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096225.44285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 15500 1727096225.44287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096225.44673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096225.44677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096225.44679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096225.44685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096225.44725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096225.44738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096225.45232: variable 'network_connections' from source: play vars 15500 1727096225.45244: variable 'profile' from source: play vars 15500 1727096225.45521: variable 'profile' from source: play vars 15500 1727096225.45524: variable 'interface' from source: set_fact 15500 1727096225.45591: variable 'interface' from source: set_fact 15500 1727096225.45658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096225.45963: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096225.46107: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096225.46136: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096225.46166: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096225.46212: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096225.46234: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096225.46259: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096225.46492: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096225.46540: variable '__network_team_connections_defined' from source: role '' defaults 15500 
1727096225.47197: variable 'network_connections' from source: play vars 15500 1727096225.47200: variable 'profile' from source: play vars 15500 1727096225.47272: variable 'profile' from source: play vars 15500 1727096225.47479: variable 'interface' from source: set_fact 15500 1727096225.47539: variable 'interface' from source: set_fact 15500 1727096225.47570: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15500 1727096225.47574: when evaluation is False, skipping this task 15500 1727096225.47576: _execute() done 15500 1727096225.47579: dumping result to json 15500 1727096225.47581: done dumping result, returning 15500 1727096225.47592: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-877d-2da0-000000000045] 15500 1727096225.47603: sending task result for task 0afff68d-5257-877d-2da0-000000000045 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15500 1727096225.47845: no more pending results, returning what we have 15500 1727096225.47848: results queue empty 15500 1727096225.47849: checking for any_errors_fatal 15500 1727096225.47856: done checking for any_errors_fatal 15500 1727096225.47856: checking for max_fail_percentage 15500 1727096225.47858: done checking for max_fail_percentage 15500 1727096225.47859: checking to see if all hosts have failed and the running result is not ok 15500 1727096225.47859: done checking to see if all hosts have failed 15500 1727096225.47860: getting the remaining hosts for this loop 15500 1727096225.47862: done getting the remaining hosts for this loop 15500 1727096225.47866: getting the next task for host managed_node1 15500 1727096225.47874: done getting next task for host managed_node1 15500 1727096225.47878: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15500 1727096225.47880: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096225.47893: getting variables 15500 1727096225.47894: in VariableManager get_vars() 15500 1727096225.47931: Calling all_inventory to load vars for managed_node1 15500 1727096225.47933: Calling groups_inventory to load vars for managed_node1 15500 1727096225.47935: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096225.47944: Calling all_plugins_play to load vars for managed_node1 15500 1727096225.47946: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096225.47949: Calling groups_plugins_play to load vars for managed_node1 15500 1727096225.48776: done sending task result for task 0afff68d-5257-877d-2da0-000000000045 15500 1727096225.48780: WORKER PROCESS EXITING 15500 1727096225.51928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096225.57788: done with get_vars() 15500 1727096225.57826: done getting variables 15500 1727096225.57891: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:57:05 -0400 (0:00:00.228) 0:00:25.622 ****** 15500 1727096225.57924: entering _queue_task() for managed_node1/service 15500 1727096225.59212: worker is 1 (out of 1 available) 15500 1727096225.59225: exiting _queue_task() for managed_node1/service 15500 1727096225.59239: done queuing things up, now waiting for results queue to drain 15500 1727096225.59240: waiting for pending results... 
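Unlike the preceding tasks, the guard on the task at roles/network/tasks/main.yml:122 evaluates to True below (network_state is empty, so network_provider must be "nm" on this host), and the entries that follow show network_service_name being resolved from the role defaults before the service module runs. A rough sketch under those assumptions (the exact arguments are not visible in the log):

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"  # resolved from role defaults in the entries below; assumed to expand to NetworkManager
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
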
15500 1727096225.59694: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15500 1727096225.59959: in run() - task 0afff68d-5257-877d-2da0-000000000046 15500 1727096225.60031: variable 'ansible_search_path' from source: unknown 15500 1727096225.60035: variable 'ansible_search_path' from source: unknown 15500 1727096225.60038: calling self._execute() 15500 1727096225.60139: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096225.60143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096225.60148: variable 'omit' from source: magic vars 15500 1727096225.60975: variable 'ansible_distribution_major_version' from source: facts 15500 1727096225.60979: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096225.61313: variable 'network_provider' from source: set_fact 15500 1727096225.61318: variable 'network_state' from source: role '' defaults 15500 1727096225.61329: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15500 1727096225.61335: variable 'omit' from source: magic vars 15500 1727096225.61764: variable 'omit' from source: magic vars 15500 1727096225.61770: variable 'network_service_name' from source: role '' defaults 15500 1727096225.61773: variable 'network_service_name' from source: role '' defaults 15500 1727096225.61997: variable '__network_provider_setup' from source: role '' defaults 15500 1727096225.62001: variable '__network_service_name_default_nm' from source: role '' defaults 15500 1727096225.62066: variable '__network_service_name_default_nm' from source: role '' defaults 15500 1727096225.62077: variable '__network_packages_default_nm' from source: role '' defaults 15500 1727096225.62135: variable '__network_packages_default_nm' from source: role '' defaults 15500 1727096225.62573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096225.66926: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096225.67151: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096225.67373: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096225.67377: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096225.67430: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096225.67773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096225.67778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096225.67780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096225.67782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 15500 1727096225.67785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096225.67984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096225.68012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096225.68040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096225.68372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096225.68376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096225.68542: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15500 1727096225.68890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096225.68920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096225.68948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096225.69014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096225.69090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096225.69365: variable 'ansible_python' from source: facts 15500 1727096225.69395: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15500 1727096225.69488: variable '__network_wpa_supplicant_required' from source: role '' defaults 15500 1727096225.69652: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15500 1727096225.70174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096225.70178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096225.70180: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096225.70209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096225.70229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096225.70288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096225.70572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096225.70576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096225.70578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096225.70589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096225.70972: variable 'network_connections' from source: play vars 15500 1727096225.70976: variable 'profile' from source: play vars 15500 1727096225.71025: variable 'profile' from source: play vars 15500 1727096225.71039: variable 'interface' from source: set_fact 15500 1727096225.71108: variable 'interface' from source: set_fact 15500 1727096225.71262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096225.71777: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096225.71826: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096225.71918: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096225.72173: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096225.72184: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096225.72218: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096225.72672: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096225.72676: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
(found_in_cache=True, class_only=False) 15500 1727096225.72679: variable '__network_wireless_connections_defined' from source: role '' defaults 15500 1727096225.73100: variable 'network_connections' from source: play vars 15500 1727096225.73113: variable 'profile' from source: play vars 15500 1727096225.73195: variable 'profile' from source: play vars 15500 1727096225.73382: variable 'interface' from source: set_fact 15500 1727096225.73446: variable 'interface' from source: set_fact 15500 1727096225.73490: variable '__network_packages_default_wireless' from source: role '' defaults 15500 1727096225.73575: variable '__network_wireless_connections_defined' from source: role '' defaults 15500 1727096225.74072: variable 'network_connections' from source: play vars 15500 1727096225.74282: variable 'profile' from source: play vars 15500 1727096225.74356: variable 'profile' from source: play vars 15500 1727096225.74673: variable 'interface' from source: set_fact 15500 1727096225.74677: variable 'interface' from source: set_fact 15500 1727096225.74692: variable '__network_packages_default_team' from source: role '' defaults 15500 1727096225.74777: variable '__network_team_connections_defined' from source: role '' defaults 15500 1727096225.75262: variable 'network_connections' from source: play vars 15500 1727096225.75483: variable 'profile' from source: play vars 15500 1727096225.75556: variable 'profile' from source: play vars 15500 1727096225.75873: variable 'interface' from source: set_fact 15500 1727096225.75877: variable 'interface' from source: set_fact 15500 1727096225.75924: variable '__network_service_name_default_initscripts' from source: role '' defaults 15500 1727096225.75987: variable '__network_service_name_default_initscripts' from source: role '' defaults 15500 1727096225.76273: variable '__network_packages_default_initscripts' from source: role '' defaults 15500 1727096225.76277: variable '__network_packages_default_initscripts' from source: role '' defaults 15500 1727096225.76675: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15500 1727096225.77875: variable 'network_connections' from source: play vars 15500 1727096225.77879: variable 'profile' from source: play vars 15500 1727096225.77896: variable 'profile' from source: play vars 15500 1727096225.77905: variable 'interface' from source: set_fact 15500 1727096225.77980: variable 'interface' from source: set_fact 15500 1727096225.78274: variable 'ansible_distribution' from source: facts 15500 1727096225.78278: variable '__network_rh_distros' from source: role '' defaults 15500 1727096225.78281: variable 'ansible_distribution_major_version' from source: facts 15500 1727096225.78283: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15500 1727096225.78386: variable 'ansible_distribution' from source: facts 15500 1727096225.78481: variable '__network_rh_distros' from source: role '' defaults 15500 1727096225.78491: variable 'ansible_distribution_major_version' from source: facts 15500 1727096225.78508: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15500 1727096225.79073: variable 'ansible_distribution' from source: facts 15500 1727096225.79077: variable '__network_rh_distros' from source: role '' defaults 15500 1727096225.79079: variable 'ansible_distribution_major_version' from source: facts 15500 1727096225.79081: variable 'network_provider' from source: set_fact 15500 1727096225.79083: variable 
'omit' from source: magic vars 15500 1727096225.79097: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096225.79131: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096225.79162: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096225.79186: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096225.79472: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096225.79476: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096225.79483: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096225.79485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096225.79543: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096225.79554: Set connection var ansible_pipelining to False 15500 1727096225.79567: Set connection var ansible_timeout to 10 15500 1727096225.79576: Set connection var ansible_shell_type to sh 15500 1727096225.79585: Set connection var ansible_shell_executable to /bin/sh 15500 1727096225.79873: Set connection var ansible_connection to ssh 15500 1727096225.79876: variable 'ansible_shell_executable' from source: unknown 15500 1727096225.79878: variable 'ansible_connection' from source: unknown 15500 1727096225.79882: variable 'ansible_module_compression' from source: unknown 15500 1727096225.79884: variable 'ansible_shell_type' from source: unknown 15500 1727096225.79886: variable 'ansible_shell_executable' from source: unknown 15500 1727096225.79887: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096225.79895: variable 'ansible_pipelining' from source: unknown 15500 1727096225.79897: variable 'ansible_timeout' from source: unknown 15500 1727096225.79899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096225.79973: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096225.79989: variable 'omit' from source: magic vars 15500 1727096225.79999: starting attempt loop 15500 1727096225.80178: running the handler 15500 1727096225.80270: variable 'ansible_facts' from source: unknown 15500 1727096225.81671: _low_level_execute_command(): starting 15500 1727096225.81884: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096225.83391: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096225.83420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096225.83448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096225.83565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096225.85281: stdout chunk (state=3): >>>/root <<< 15500 1727096225.85536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096225.85550: stdout chunk (state=3): >>><<< 15500 1727096225.85569: stderr chunk (state=3): >>><<< 15500 1727096225.85594: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096225.85613: _low_level_execute_command(): starting 15500 1727096225.85851: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096225.8560119-16552-160736636691573 `" && echo ansible-tmp-1727096225.8560119-16552-160736636691573="` echo /root/.ansible/tmp/ansible-tmp-1727096225.8560119-16552-160736636691573 `" ) && sleep 0' 15500 1727096225.86805: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096225.87076: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 
originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096225.87120: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096225.87186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096225.89167: stdout chunk (state=3): >>>ansible-tmp-1727096225.8560119-16552-160736636691573=/root/.ansible/tmp/ansible-tmp-1727096225.8560119-16552-160736636691573 <<< 15500 1727096225.89312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096225.89327: stdout chunk (state=3): >>><<< 15500 1727096225.89338: stderr chunk (state=3): >>><<< 15500 1727096225.89360: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096225.8560119-16552-160736636691573=/root/.ansible/tmp/ansible-tmp-1727096225.8560119-16552-160736636691573 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096225.89399: variable 'ansible_module_compression' from source: unknown 15500 1727096225.89623: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 15500 1727096225.89688: variable 'ansible_facts' from source: unknown 15500 1727096225.90106: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096225.8560119-16552-160736636691573/AnsiballZ_systemd.py 15500 1727096225.90705: Sending initial data 15500 1727096225.90709: Sent initial data (156 bytes) 15500 1727096225.91990: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096225.92008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 15500 1727096225.92020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096225.92126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096225.92383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096225.92413: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096225.94149: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." <<< 15500 1727096225.94211: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp8gif_h4w /root/.ansible/tmp/ansible-tmp-1727096225.8560119-16552-160736636691573/AnsiballZ_systemd.py <<< 15500 1727096225.94215: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096225.8560119-16552-160736636691573/AnsiballZ_systemd.py" <<< 15500 1727096225.94600: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp8gif_h4w" to remote "/root/.ansible/tmp/ansible-tmp-1727096225.8560119-16552-160736636691573/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096225.8560119-16552-160736636691573/AnsiballZ_systemd.py" <<< 15500 1727096225.98865: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096225.98873: stdout chunk (state=3): >>><<< 15500 1727096225.98876: stderr chunk (state=3): >>><<< 15500 1727096225.98914: done transferring module to remote 15500 1727096225.98925: _low_level_execute_command(): starting 15500 1727096225.98930: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096225.8560119-16552-160736636691573/ /root/.ansible/tmp/ansible-tmp-1727096225.8560119-16552-160736636691573/AnsiballZ_systemd.py && sleep 0' 15500 1727096226.00373: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096226.00378: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096226.00382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096226.00442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096226.00454: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096226.00556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096226.02482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096226.02486: stdout chunk (state=3): >>><<< 15500 1727096226.02494: stderr chunk (state=3): >>><<< 15500 1727096226.02520: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096226.02523: _low_level_execute_command(): starting 15500 1727096226.02529: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096225.8560119-16552-160736636691573/AnsiballZ_systemd.py && sleep 0' 15500 1727096226.03973: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
15500 1727096226.03976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096226.03979: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096226.03981: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096226.04029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096226.33644: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainStartTimestampMonotonic": "14125756", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainHandoffTimestampMonotonic": "14143412", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10649600", "MemoryPeak": "14716928", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3302531072", "EffectiveMemoryMax": "3702857728", "EffectiveMemoryHigh": "3702857728", "CPUUsageNSec": "757641000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not 
set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 15500 1727096226.33687: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", 
"TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target multi-user.target cloud-init.service NetworkManager-wait-online.service network.target", "After": "system.slice dbus-broker.service systemd-journald.socket sysinit.target network-pre.target dbus.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:06 EDT", "StateChangeTimestampMonotonic": "260104767", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:00 EDT", "InactiveExitTimestampMonotonic": "14126240", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:01 EDT", "ActiveEnterTimestampMonotonic": "14391210", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ConditionTimestampMonotonic": "14124859", "AssertTimestamp": "Mon 2024-09-23 08:51:00 EDT", "AssertTimestampMonotonic": "14124861", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "96e31adf3b0143aea7f2b03db689d56d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": 
"NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15500 1727096226.35789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 15500 1727096226.35825: stderr chunk (state=3): >>><<< 15500 1727096226.35835: stdout chunk (state=3): >>><<< 15500 1727096226.35857: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainStartTimestampMonotonic": "14125756", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainHandoffTimestampMonotonic": "14143412", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10649600", "MemoryPeak": "14716928", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3302531072", "EffectiveMemoryMax": "3702857728", "EffectiveMemoryHigh": "3702857728", "CPUUsageNSec": "757641000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", 
"StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target multi-user.target cloud-init.service NetworkManager-wait-online.service network.target", "After": "system.slice dbus-broker.service systemd-journald.socket sysinit.target network-pre.target dbus.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:06 EDT", "StateChangeTimestampMonotonic": "260104767", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:00 EDT", "InactiveExitTimestampMonotonic": "14126240", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:01 EDT", "ActiveEnterTimestampMonotonic": "14391210", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ConditionTimestampMonotonic": "14124859", "AssertTimestamp": "Mon 2024-09-23 08:51:00 EDT", "AssertTimestampMonotonic": "14124861", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "96e31adf3b0143aea7f2b03db689d56d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": 
"started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 15500 1727096226.36323: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096225.8560119-16552-160736636691573/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096226.36394: _low_level_execute_command(): starting 15500 1727096226.36403: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096225.8560119-16552-160736636691573/ > /dev/null 2>&1 && sleep 0' 15500 1727096226.38052: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096226.38056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 15500 1727096226.38058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15500 1727096226.38060: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096226.38063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096226.38282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096226.38314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096226.38442: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096226.40400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096226.40404: stdout chunk (state=3): >>><<< 15500 1727096226.40405: stderr chunk (state=3): >>><<< 15500 1727096226.40431: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096226.40445: handler run complete 15500 1727096226.40886: attempt loop complete, returning result 15500 1727096226.40890: _execute() done 15500 1727096226.40892: dumping result to json 15500 1727096226.40894: done dumping result, returning 15500 1727096226.40898: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-877d-2da0-000000000046] 15500 1727096226.40900: sending task result for task 0afff68d-5257-877d-2da0-000000000046 15500 1727096226.41462: done sending task result for task 0afff68d-5257-877d-2da0-000000000046 15500 1727096226.41466: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15500 1727096226.41886: no more pending results, returning what we have 15500 1727096226.41890: results queue empty 15500 1727096226.41891: checking for any_errors_fatal 15500 1727096226.41896: done checking for any_errors_fatal 15500 1727096226.41897: checking for max_fail_percentage 15500 1727096226.41899: done checking for max_fail_percentage 15500 1727096226.41900: checking to see if all hosts have failed and the running result is not ok 15500 1727096226.41901: done checking to see if all hosts have failed 15500 1727096226.41901: getting the remaining hosts for this loop 15500 1727096226.41903: done getting the remaining hosts for this loop 15500 1727096226.41907: getting the next task for host managed_node1 15500 1727096226.41912: done getting next task for host managed_node1 15500 1727096226.41916: ^ 
task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15500 1727096226.41918: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096226.41928: getting variables 15500 1727096226.41929: in VariableManager get_vars() 15500 1727096226.41960: Calling all_inventory to load vars for managed_node1 15500 1727096226.41963: Calling groups_inventory to load vars for managed_node1 15500 1727096226.41965: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096226.41976: Calling all_plugins_play to load vars for managed_node1 15500 1727096226.41978: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096226.41981: Calling groups_plugins_play to load vars for managed_node1 15500 1727096226.44709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096226.48151: done with get_vars() 15500 1727096226.48290: done getting variables 15500 1727096226.48355: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:57:06 -0400 (0:00:00.904) 0:00:26.527 ****** 15500 1727096226.48390: entering _queue_task() for managed_node1/service 15500 1727096226.49265: worker is 1 (out of 1 available) 15500 1727096226.49281: exiting _queue_task() for managed_node1/service 15500 1727096226.49411: done queuing things up, now waiting for results queue to drain 15500 1727096226.49413: waiting for pending results... 
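The task queued here, 'Enable and start wpa_supplicant' (roles/network/tasks/main.yml:133), is dispatched through the same 'service' action plugin as the NetworkManager task whose censored (no_log: true) result appears above. As a rough, purely illustrative sketch of what a task producing this kind of trace could look like (not the actual role source; the service name and option values are assumptions, and only the two conditionals are visible in the log):

    - name: Enable and start wpa_supplicant (illustrative sketch)
      ansible.builtin.service:
        name: wpa_supplicant
        state: started
        enabled: true
      when:
        - network_provider == "nm"
        - __network_wpa_supplicant_required | bool
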
15500 1727096226.49921: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15500 1727096226.50278: in run() - task 0afff68d-5257-877d-2da0-000000000047 15500 1727096226.50283: variable 'ansible_search_path' from source: unknown 15500 1727096226.50286: variable 'ansible_search_path' from source: unknown 15500 1727096226.50289: calling self._execute() 15500 1727096226.50542: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096226.50556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096226.50576: variable 'omit' from source: magic vars 15500 1727096226.51491: variable 'ansible_distribution_major_version' from source: facts 15500 1727096226.51495: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096226.51725: variable 'network_provider' from source: set_fact 15500 1727096226.51816: Evaluated conditional (network_provider == "nm"): True 15500 1727096226.52115: variable '__network_wpa_supplicant_required' from source: role '' defaults 15500 1727096226.52118: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15500 1727096226.52581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096226.57886: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096226.58004: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096226.58164: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096226.58261: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096226.58371: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096226.58434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096226.58664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096226.58668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096226.58672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096226.58797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096226.58835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096226.58911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 15500 1727096226.58941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096226.59376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096226.59380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096226.59382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096226.59483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096226.59486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096226.59488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096226.59490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096226.59844: variable 'network_connections' from source: play vars 15500 1727096226.59927: variable 'profile' from source: play vars 15500 1727096226.60064: variable 'profile' from source: play vars 15500 1727096226.60112: variable 'interface' from source: set_fact 15500 1727096226.60243: variable 'interface' from source: set_fact 15500 1727096226.60397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096226.60811: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096226.60906: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096226.61006: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096226.61039: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096226.61129: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096226.61218: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096226.61248: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096226.61519: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096226.61522: variable '__network_wireless_connections_defined' from source: role '' defaults 15500 1727096226.61922: variable 'network_connections' from source: play vars 15500 1727096226.62133: variable 'profile' from source: play vars 15500 1727096226.62383: variable 'profile' from source: play vars 15500 1727096226.62387: variable 'interface' from source: set_fact 15500 1727096226.62388: variable 'interface' from source: set_fact 15500 1727096226.62524: Evaluated conditional (__network_wpa_supplicant_required): False 15500 1727096226.62573: when evaluation is False, skipping this task 15500 1727096226.62576: _execute() done 15500 1727096226.62587: dumping result to json 15500 1727096226.62589: done dumping result, returning 15500 1727096226.62592: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-877d-2da0-000000000047] 15500 1727096226.62593: sending task result for task 0afff68d-5257-877d-2da0-000000000047 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15500 1727096226.62712: no more pending results, returning what we have 15500 1727096226.62715: results queue empty 15500 1727096226.62716: checking for any_errors_fatal 15500 1727096226.62733: done checking for any_errors_fatal 15500 1727096226.62734: checking for max_fail_percentage 15500 1727096226.62736: done checking for max_fail_percentage 15500 1727096226.62737: checking to see if all hosts have failed and the running result is not ok 15500 1727096226.62737: done checking to see if all hosts have failed 15500 1727096226.62738: getting the remaining hosts for this loop 15500 1727096226.62740: done getting the remaining hosts for this loop 15500 1727096226.62744: getting the next task for host managed_node1 15500 1727096226.62750: done getting next task for host managed_node1 15500 1727096226.62754: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15500 1727096226.62756: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096226.62771: getting variables 15500 1727096226.62773: in VariableManager get_vars() 15500 1727096226.62814: Calling all_inventory to load vars for managed_node1 15500 1727096226.62817: Calling groups_inventory to load vars for managed_node1 15500 1727096226.62820: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096226.62830: Calling all_plugins_play to load vars for managed_node1 15500 1727096226.62833: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096226.62836: Calling groups_plugins_play to load vars for managed_node1 15500 1727096226.64053: done sending task result for task 0afff68d-5257-877d-2da0-000000000047 15500 1727096226.64056: WORKER PROCESS EXITING 15500 1727096226.66390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096226.68063: done with get_vars() 15500 1727096226.68088: done getting variables 15500 1727096226.68145: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:57:06 -0400 (0:00:00.200) 0:00:26.728 ****** 15500 1727096226.68491: entering _queue_task() for managed_node1/service 15500 1727096226.69101: worker is 1 (out of 1 available) 15500 1727096226.69114: exiting _queue_task() for managed_node1/service 15500 1727096226.69126: done queuing things up, now waiting for results queue to drain 15500 1727096226.69246: waiting for pending results... 
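The wpa_supplicant task above skips because __network_wpa_supplicant_required evaluates to False (the log resolves it alongside the 802.1x and wireless connection flags), while the next two tasks ('Enable network service' at main.yml:142 and 'Ensure initscripts network file dependency is present' at main.yml:150) both skip later in this run because network_provider, set earlier via set_fact, resolves to "nm" rather than "initscripts". A minimal, hypothetical stand-in for that provider selection (the real role derives the value from facts and role defaults rather than hard-coding it):

    - name: Select the network provider (illustrative sketch)
      ansible.builtin.set_fact:
        network_provider: nm   # resolved value observed in this run
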
15500 1727096226.69458: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 15500 1727096226.69583: in run() - task 0afff68d-5257-877d-2da0-000000000048 15500 1727096226.69613: variable 'ansible_search_path' from source: unknown 15500 1727096226.69622: variable 'ansible_search_path' from source: unknown 15500 1727096226.69664: calling self._execute() 15500 1727096226.69775: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096226.69795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096226.69814: variable 'omit' from source: magic vars 15500 1727096226.70213: variable 'ansible_distribution_major_version' from source: facts 15500 1727096226.70236: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096226.70672: variable 'network_provider' from source: set_fact 15500 1727096226.70677: Evaluated conditional (network_provider == "initscripts"): False 15500 1727096226.70679: when evaluation is False, skipping this task 15500 1727096226.70681: _execute() done 15500 1727096226.70684: dumping result to json 15500 1727096226.70686: done dumping result, returning 15500 1727096226.70688: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-877d-2da0-000000000048] 15500 1727096226.70690: sending task result for task 0afff68d-5257-877d-2da0-000000000048 15500 1727096226.70767: done sending task result for task 0afff68d-5257-877d-2da0-000000000048 15500 1727096226.70875: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15500 1727096226.70925: no more pending results, returning what we have 15500 1727096226.70930: results queue empty 15500 1727096226.70931: checking for any_errors_fatal 15500 1727096226.70942: done checking for any_errors_fatal 15500 1727096226.70943: checking for max_fail_percentage 15500 1727096226.70945: done checking for max_fail_percentage 15500 1727096226.70946: checking to see if all hosts have failed and the running result is not ok 15500 1727096226.70946: done checking to see if all hosts have failed 15500 1727096226.70947: getting the remaining hosts for this loop 15500 1727096226.70950: done getting the remaining hosts for this loop 15500 1727096226.70954: getting the next task for host managed_node1 15500 1727096226.70961: done getting next task for host managed_node1 15500 1727096226.70965: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15500 1727096226.70970: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096226.70985: getting variables 15500 1727096226.70987: in VariableManager get_vars() 15500 1727096226.71026: Calling all_inventory to load vars for managed_node1 15500 1727096226.71029: Calling groups_inventory to load vars for managed_node1 15500 1727096226.71031: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096226.71044: Calling all_plugins_play to load vars for managed_node1 15500 1727096226.71047: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096226.71050: Calling groups_plugins_play to load vars for managed_node1 15500 1727096226.74065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096226.77028: done with get_vars() 15500 1727096226.77171: done getting variables 15500 1727096226.77230: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 08:57:06 -0400 (0:00:00.087) 0:00:26.815 ****** 15500 1727096226.77264: entering _queue_task() for managed_node1/copy 15500 1727096226.78015: worker is 1 (out of 1 available) 15500 1727096226.78027: exiting _queue_task() for managed_node1/copy 15500 1727096226.78153: done queuing things up, now waiting for results queue to drain 15500 1727096226.78155: waiting for pending results... 15500 1727096226.79262: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15500 1727096226.79585: in run() - task 0afff68d-5257-877d-2da0-000000000049 15500 1727096226.79589: variable 'ansible_search_path' from source: unknown 15500 1727096226.79592: variable 'ansible_search_path' from source: unknown 15500 1727096226.79690: calling self._execute() 15500 1727096226.80016: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096226.80020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096226.80024: variable 'omit' from source: magic vars 15500 1727096226.81132: variable 'ansible_distribution_major_version' from source: facts 15500 1727096226.81257: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096226.81531: variable 'network_provider' from source: set_fact 15500 1727096226.81535: Evaluated conditional (network_provider == "initscripts"): False 15500 1727096226.81540: when evaluation is False, skipping this task 15500 1727096226.81542: _execute() done 15500 1727096226.81544: dumping result to json 15500 1727096226.81550: done dumping result, returning 15500 1727096226.81777: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-877d-2da0-000000000049] 15500 1727096226.81780: sending task result for task 0afff68d-5257-877d-2da0-000000000049 15500 1727096226.81854: done sending task result for task 0afff68d-5257-877d-2da0-000000000049 15500 1727096226.81861: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 15500 1727096226.81923: no more pending results, returning what we have 15500 1727096226.81926: results queue empty 15500 1727096226.81927: checking for any_errors_fatal 15500 1727096226.81934: done checking for any_errors_fatal 15500 1727096226.81934: checking for max_fail_percentage 15500 1727096226.81936: done checking for max_fail_percentage 15500 1727096226.81937: checking to see if all hosts have failed and the running result is not ok 15500 1727096226.81938: done checking to see if all hosts have failed 15500 1727096226.81938: getting the remaining hosts for this loop 15500 1727096226.81940: done getting the remaining hosts for this loop 15500 1727096226.81943: getting the next task for host managed_node1 15500 1727096226.81949: done getting next task for host managed_node1 15500 1727096226.81952: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15500 1727096226.81954: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096226.82160: getting variables 15500 1727096226.82163: in VariableManager get_vars() 15500 1727096226.82201: Calling all_inventory to load vars for managed_node1 15500 1727096226.82204: Calling groups_inventory to load vars for managed_node1 15500 1727096226.82206: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096226.82214: Calling all_plugins_play to load vars for managed_node1 15500 1727096226.82217: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096226.82219: Calling groups_plugins_play to load vars for managed_node1 15500 1727096226.86212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096226.89297: done with get_vars() 15500 1727096226.89328: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 08:57:06 -0400 (0:00:00.121) 0:00:26.937 ****** 15500 1727096226.89432: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 15500 1727096226.89931: worker is 1 (out of 1 available) 15500 1727096226.89943: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 15500 1727096226.89954: done queuing things up, now waiting for results queue to drain 15500 1727096226.89955: waiting for pending results... 
15500 1727096226.90269: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15500 1727096226.90279: in run() - task 0afff68d-5257-877d-2da0-00000000004a 15500 1727096226.90302: variable 'ansible_search_path' from source: unknown 15500 1727096226.90309: variable 'ansible_search_path' from source: unknown 15500 1727096226.90347: calling self._execute() 15500 1727096226.90474: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096226.90488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096226.90508: variable 'omit' from source: magic vars 15500 1727096226.90942: variable 'ansible_distribution_major_version' from source: facts 15500 1727096226.90946: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096226.90949: variable 'omit' from source: magic vars 15500 1727096226.90987: variable 'omit' from source: magic vars 15500 1727096226.91171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096226.95184: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096226.95246: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096226.95318: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096226.95502: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096226.95506: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096226.95566: variable 'network_provider' from source: set_fact 15500 1727096226.95808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096226.95872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096226.95911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096226.95970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096226.95992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096226.96081: variable 'omit' from source: magic vars 15500 1727096226.96229: variable 'omit' from source: magic vars 15500 1727096226.96395: variable 'network_connections' from source: play vars 15500 1727096226.96420: variable 'profile' from source: play vars 15500 1727096226.96542: variable 'profile' from source: play vars 15500 1727096226.96594: variable 'interface' from source: set_fact 15500 1727096226.96623: variable 'interface' from source: set_fact 15500 1727096226.96847: variable 'omit' from source: magic vars 15500 1727096226.96876: 
variable '__lsr_ansible_managed' from source: task vars 15500 1727096226.96947: variable '__lsr_ansible_managed' from source: task vars 15500 1727096226.97144: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15500 1727096226.97718: Loaded config def from plugin (lookup/template) 15500 1727096226.97772: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15500 1727096226.97776: File lookup term: get_ansible_managed.j2 15500 1727096226.97778: variable 'ansible_search_path' from source: unknown 15500 1727096226.97780: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15500 1727096226.97801: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15500 1727096226.97825: variable 'ansible_search_path' from source: unknown 15500 1727096227.11491: variable 'ansible_managed' from source: unknown 15500 1727096227.11769: variable 'omit' from source: magic vars 15500 1727096227.11836: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096227.11880: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096227.11903: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096227.11928: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096227.11952: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096227.12064: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096227.12069: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096227.12072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096227.12144: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096227.12147: Set connection var ansible_pipelining to False 15500 1727096227.12154: Set connection var ansible_timeout to 10 15500 1727096227.12164: Set connection var ansible_shell_type to sh 15500 1727096227.12491: Set connection var ansible_shell_executable to /bin/sh 15500 1727096227.12494: Set connection var ansible_connection to ssh 15500 1727096227.12496: variable 'ansible_shell_executable' from source: unknown 15500 
1727096227.12498: variable 'ansible_connection' from source: unknown 15500 1727096227.12500: variable 'ansible_module_compression' from source: unknown 15500 1727096227.12503: variable 'ansible_shell_type' from source: unknown 15500 1727096227.12505: variable 'ansible_shell_executable' from source: unknown 15500 1727096227.12506: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096227.12508: variable 'ansible_pipelining' from source: unknown 15500 1727096227.12510: variable 'ansible_timeout' from source: unknown 15500 1727096227.12512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096227.12514: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15500 1727096227.12524: variable 'omit' from source: magic vars 15500 1727096227.12526: starting attempt loop 15500 1727096227.12531: running the handler 15500 1727096227.12534: _low_level_execute_command(): starting 15500 1727096227.12536: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096227.13346: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096227.13352: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096227.13356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096227.13361: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096227.13405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096227.15121: stdout chunk (state=3): >>>/root <<< 15500 1727096227.15272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096227.15281: stdout chunk (state=3): >>><<< 15500 1727096227.15289: stderr chunk (state=3): >>><<< 15500 1727096227.15346: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096227.15360: _low_level_execute_command(): starting 15500 1727096227.15370: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096227.1534646-16598-125351137724747 `" && echo ansible-tmp-1727096227.1534646-16598-125351137724747="` echo /root/.ansible/tmp/ansible-tmp-1727096227.1534646-16598-125351137724747 `" ) && sleep 0' 15500 1727096227.16174: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096227.16178: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096227.16180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096227.16195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096227.16206: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096227.16213: stderr chunk (state=3): >>>debug2: match not found <<< 15500 1727096227.16287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096227.16399: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096227.16512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096227.16639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096227.18528: stdout chunk (state=3): >>>ansible-tmp-1727096227.1534646-16598-125351137724747=/root/.ansible/tmp/ansible-tmp-1727096227.1534646-16598-125351137724747 <<< 15500 1727096227.18665: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096227.18671: stdout chunk (state=3): >>><<< 15500 1727096227.18684: stderr chunk (state=3): >>><<< 15500 1727096227.18726: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096227.1534646-16598-125351137724747=/root/.ansible/tmp/ansible-tmp-1727096227.1534646-16598-125351137724747 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096227.18740: variable 'ansible_module_compression' from source: unknown 15500 1727096227.18783: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 15500 1727096227.18809: variable 'ansible_facts' from source: unknown 15500 1727096227.18880: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096227.1534646-16598-125351137724747/AnsiballZ_network_connections.py 15500 1727096227.19059: Sending initial data 15500 1727096227.19066: Sent initial data (168 bytes) 15500 1727096227.19539: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096227.19543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096227.19576: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096227.19620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096227.19624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096227.19629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096227.19705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096227.21361: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: 
Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096227.21407: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15500 1727096227.21475: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpoi1sfx8i /root/.ansible/tmp/ansible-tmp-1727096227.1534646-16598-125351137724747/AnsiballZ_network_connections.py <<< 15500 1727096227.21478: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096227.1534646-16598-125351137724747/AnsiballZ_network_connections.py" <<< 15500 1727096227.21539: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpoi1sfx8i" to remote "/root/.ansible/tmp/ansible-tmp-1727096227.1534646-16598-125351137724747/AnsiballZ_network_connections.py" <<< 15500 1727096227.21543: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096227.1534646-16598-125351137724747/AnsiballZ_network_connections.py" <<< 15500 1727096227.22499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096227.22574: stderr chunk (state=3): >>><<< 15500 1727096227.22578: stdout chunk (state=3): >>><<< 15500 1727096227.22606: done transferring module to remote 15500 1727096227.22609: _low_level_execute_command(): starting 15500 1727096227.22620: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096227.1534646-16598-125351137724747/ /root/.ansible/tmp/ansible-tmp-1727096227.1534646-16598-125351137724747/AnsiballZ_network_connections.py && sleep 0' 15500 1727096227.23397: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096227.23431: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096227.23454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096227.23461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096227.23559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096227.25422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096227.25450: stderr 
chunk (state=3): >>><<< 15500 1727096227.25452: stdout chunk (state=3): >>><<< 15500 1727096227.25471: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096227.25477: _low_level_execute_command(): starting 15500 1727096227.25483: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096227.1534646-16598-125351137724747/AnsiballZ_network_connections.py && sleep 0' 15500 1727096227.26035: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096227.26061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096227.26106: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096227.26164: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096227.57602: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": 
""}}} <<< 15500 1727096227.59928: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 15500 1727096227.60053: stderr chunk (state=3): >>><<< 15500 1727096227.60056: stdout chunk (state=3): >>><<< 15500 1727096227.60058: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
15500 1727096227.60061: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096227.1534646-16598-125351137724747/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096227.60063: _low_level_execute_command(): starting 15500 1727096227.60065: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096227.1534646-16598-125351137724747/ > /dev/null 2>&1 && sleep 0' 15500 1727096227.61474: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096227.61478: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096227.61480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096227.61483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096227.61489: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096227.61496: stderr chunk (state=3): >>>debug2: match not found <<< 15500 1727096227.61507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096227.61519: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15500 1727096227.61527: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 15500 1727096227.61533: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15500 1727096227.61541: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096227.61551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096227.61574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096227.61582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096227.61588: stderr chunk (state=3): >>>debug2: match found <<< 15500 1727096227.61599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096227.61770: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096227.61794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096227.63809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096227.63813: stdout chunk (state=3): >>><<< 15500 1727096227.63818: stderr chunk (state=3): >>><<< 15500 1727096227.63875: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096227.63881: handler run complete 15500 1727096227.63908: attempt loop complete, returning result 15500 1727096227.63912: _execute() done 15500 1727096227.63914: dumping result to json 15500 1727096227.63917: done dumping result, returning 15500 1727096227.63927: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-877d-2da0-00000000004a] 15500 1727096227.63930: sending task result for task 0afff68d-5257-877d-2da0-00000000004a 15500 1727096227.64151: done sending task result for task 0afff68d-5257-877d-2da0-00000000004a 15500 1727096227.64161: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 15500 1727096227.64274: no more pending results, returning what we have 15500 1727096227.64278: results queue empty 15500 1727096227.64278: checking for any_errors_fatal 15500 1727096227.64372: done checking for any_errors_fatal 15500 1727096227.64374: checking for max_fail_percentage 15500 1727096227.64376: done checking for max_fail_percentage 15500 1727096227.64377: checking to see if all hosts have failed and the running result is not ok 15500 1727096227.64378: done checking to see if all hosts have failed 15500 1727096227.64378: getting the remaining hosts for this loop 15500 1727096227.64380: done getting the remaining hosts for this loop 15500 1727096227.64384: getting the next task for host managed_node1 15500 1727096227.64390: done getting next task for host managed_node1 15500 1727096227.64512: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15500 1727096227.64514: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096227.64525: getting variables 15500 1727096227.64526: in VariableManager get_vars() 15500 1727096227.64563: Calling all_inventory to load vars for managed_node1 15500 1727096227.64566: Calling groups_inventory to load vars for managed_node1 15500 1727096227.64677: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096227.64687: Calling all_plugins_play to load vars for managed_node1 15500 1727096227.64690: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096227.64693: Calling groups_plugins_play to load vars for managed_node1 15500 1727096227.68124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096227.71569: done with get_vars() 15500 1727096227.71687: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 08:57:07 -0400 (0:00:00.824) 0:00:27.761 ****** 15500 1727096227.71871: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 15500 1727096227.72653: worker is 1 (out of 1 available) 15500 1727096227.72666: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 15500 1727096227.72731: done queuing things up, now waiting for results queue to drain 15500 1727096227.72733: waiting for pending results... 15500 1727096227.73285: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 15500 1727096227.73473: in run() - task 0afff68d-5257-877d-2da0-00000000004b 15500 1727096227.73673: variable 'ansible_search_path' from source: unknown 15500 1727096227.73677: variable 'ansible_search_path' from source: unknown 15500 1727096227.73686: calling self._execute() 15500 1727096227.73689: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096227.73692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096227.73694: variable 'omit' from source: magic vars 15500 1727096227.74425: variable 'ansible_distribution_major_version' from source: facts 15500 1727096227.74873: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096227.74877: variable 'network_state' from source: role '' defaults 15500 1727096227.74880: Evaluated conditional (network_state != {}): False 15500 1727096227.74882: when evaluation is False, skipping this task 15500 1727096227.74885: _execute() done 15500 1727096227.74887: dumping result to json 15500 1727096227.74889: done dumping result, returning 15500 1727096227.74891: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-877d-2da0-00000000004b] 15500 1727096227.74895: sending task result for task 0afff68d-5257-877d-2da0-00000000004b skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15500 1727096227.75052: no more pending results, returning what we have 15500 1727096227.75056: results queue empty 15500 1727096227.75057: checking for any_errors_fatal 15500 1727096227.75074: done checking for any_errors_fatal 15500 1727096227.75075: checking for max_fail_percentage 15500 1727096227.75076: done checking for max_fail_percentage 15500 1727096227.75077: checking to see if all hosts have failed and the running result is 
not ok 15500 1727096227.75078: done checking to see if all hosts have failed 15500 1727096227.75079: getting the remaining hosts for this loop 15500 1727096227.75081: done getting the remaining hosts for this loop 15500 1727096227.75085: getting the next task for host managed_node1 15500 1727096227.75093: done getting next task for host managed_node1 15500 1727096227.75099: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15500 1727096227.75109: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096227.75125: getting variables 15500 1727096227.75127: in VariableManager get_vars() 15500 1727096227.75218: Calling all_inventory to load vars for managed_node1 15500 1727096227.75222: Calling groups_inventory to load vars for managed_node1 15500 1727096227.75224: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096227.75236: Calling all_plugins_play to load vars for managed_node1 15500 1727096227.75239: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096227.75241: Calling groups_plugins_play to load vars for managed_node1 15500 1727096227.75898: done sending task result for task 0afff68d-5257-877d-2da0-00000000004b 15500 1727096227.75902: WORKER PROCESS EXITING 15500 1727096227.79225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096227.83115: done with get_vars() 15500 1727096227.83146: done getting variables 15500 1727096227.83324: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 08:57:07 -0400 (0:00:00.114) 0:00:27.876 ****** 15500 1727096227.83356: entering _queue_task() for managed_node1/debug 15500 1727096227.84108: worker is 1 (out of 1 available) 15500 1727096227.84122: exiting _queue_task() for managed_node1/debug 15500 1727096227.84235: done queuing things up, now waiting for results queue to drain 15500 1727096227.84237: waiting for pending results... 
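For readers following along, the module arguments logged just above (connections: LSR-TST-br31 with state: down, provider: nm) imply a role invocation along the lines of the minimal playbook below. This is an illustrative reconstruction, not the test playbook that actually produced this run; the variable names network_connections and network_provider are inferred from the logged arguments and are assumptions about how the fedora.linux_system_roles.network role was fed its input.

- name: Bring the LSR-TST-br31 profile down via the network role
  hosts: managed_node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        # "provider": "nm" appears in the logged module args; the role
        # variable name used here is an assumption for illustration
        network_provider: nm
        network_connections:
          - name: LSR-TST-br31
            state: down

With input of this shape, the role calls fedora.linux_system_roles.network_connections once with the __header comment seen above, which is the "changed" task result reported in the log.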
15500 1727096227.84535: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15500 1727096227.84651: in run() - task 0afff68d-5257-877d-2da0-00000000004c 15500 1727096227.84674: variable 'ansible_search_path' from source: unknown 15500 1727096227.84688: variable 'ansible_search_path' from source: unknown 15500 1727096227.84730: calling self._execute() 15500 1727096227.84835: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096227.84847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096227.84859: variable 'omit' from source: magic vars 15500 1727096227.85253: variable 'ansible_distribution_major_version' from source: facts 15500 1727096227.85272: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096227.85283: variable 'omit' from source: magic vars 15500 1727096227.85322: variable 'omit' from source: magic vars 15500 1727096227.85370: variable 'omit' from source: magic vars 15500 1727096227.85416: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096227.85463: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096227.85489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096227.85511: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096227.85525: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096227.85566: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096227.85577: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096227.85664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096227.85699: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096227.85710: Set connection var ansible_pipelining to False 15500 1727096227.85719: Set connection var ansible_timeout to 10 15500 1727096227.85726: Set connection var ansible_shell_type to sh 15500 1727096227.85736: Set connection var ansible_shell_executable to /bin/sh 15500 1727096227.85744: Set connection var ansible_connection to ssh 15500 1727096227.85779: variable 'ansible_shell_executable' from source: unknown 15500 1727096227.85787: variable 'ansible_connection' from source: unknown 15500 1727096227.85795: variable 'ansible_module_compression' from source: unknown 15500 1727096227.85801: variable 'ansible_shell_type' from source: unknown 15500 1727096227.85806: variable 'ansible_shell_executable' from source: unknown 15500 1727096227.85813: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096227.85820: variable 'ansible_pipelining' from source: unknown 15500 1727096227.85827: variable 'ansible_timeout' from source: unknown 15500 1727096227.85836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096227.86037: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 
1727096227.86053: variable 'omit' from source: magic vars 15500 1727096227.86062: starting attempt loop 15500 1727096227.86100: running the handler 15500 1727096227.86290: variable '__network_connections_result' from source: set_fact 15500 1727096227.86753: handler run complete 15500 1727096227.86756: attempt loop complete, returning result 15500 1727096227.86758: _execute() done 15500 1727096227.86760: dumping result to json 15500 1727096227.86762: done dumping result, returning 15500 1727096227.86765: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-877d-2da0-00000000004c] 15500 1727096227.86767: sending task result for task 0afff68d-5257-877d-2da0-00000000004c 15500 1727096227.86831: done sending task result for task 0afff68d-5257-877d-2da0-00000000004c 15500 1727096227.86834: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 15500 1727096227.86917: no more pending results, returning what we have 15500 1727096227.86921: results queue empty 15500 1727096227.86922: checking for any_errors_fatal 15500 1727096227.86928: done checking for any_errors_fatal 15500 1727096227.86929: checking for max_fail_percentage 15500 1727096227.86930: done checking for max_fail_percentage 15500 1727096227.86931: checking to see if all hosts have failed and the running result is not ok 15500 1727096227.86932: done checking to see if all hosts have failed 15500 1727096227.86933: getting the remaining hosts for this loop 15500 1727096227.86935: done getting the remaining hosts for this loop 15500 1727096227.86939: getting the next task for host managed_node1 15500 1727096227.86945: done getting next task for host managed_node1 15500 1727096227.86949: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15500 1727096227.86951: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096227.86961: getting variables 15500 1727096227.86963: in VariableManager get_vars() 15500 1727096227.87005: Calling all_inventory to load vars for managed_node1 15500 1727096227.87008: Calling groups_inventory to load vars for managed_node1 15500 1727096227.87010: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096227.87020: Calling all_plugins_play to load vars for managed_node1 15500 1727096227.87024: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096227.87026: Calling groups_plugins_play to load vars for managed_node1 15500 1727096227.91613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096227.96487: done with get_vars() 15500 1727096227.96522: done getting variables 15500 1727096227.96590: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:57:07 -0400 (0:00:00.132) 0:00:28.009 ****** 15500 1727096227.96620: entering _queue_task() for managed_node1/debug 15500 1727096227.97387: worker is 1 (out of 1 available) 15500 1727096227.97400: exiting _queue_task() for managed_node1/debug 15500 1727096227.97411: done queuing things up, now waiting for results queue to drain 15500 1727096227.97412: waiting for pending results... 15500 1727096227.98199: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15500 1727096227.98204: in run() - task 0afff68d-5257-877d-2da0-00000000004d 15500 1727096227.98208: variable 'ansible_search_path' from source: unknown 15500 1727096227.98210: variable 'ansible_search_path' from source: unknown 15500 1727096227.98213: calling self._execute() 15500 1727096227.98323: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096227.98327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096227.98337: variable 'omit' from source: magic vars 15500 1727096227.98748: variable 'ansible_distribution_major_version' from source: facts 15500 1727096227.98758: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096227.98770: variable 'omit' from source: magic vars 15500 1727096227.98819: variable 'omit' from source: magic vars 15500 1727096227.98852: variable 'omit' from source: magic vars 15500 1727096227.98900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096227.99073: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096227.99076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096227.99079: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096227.99082: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096227.99084: variable 
'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096227.99086: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096227.99088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096227.99145: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096227.99150: Set connection var ansible_pipelining to False 15500 1727096227.99156: Set connection var ansible_timeout to 10 15500 1727096227.99159: Set connection var ansible_shell_type to sh 15500 1727096227.99167: Set connection var ansible_shell_executable to /bin/sh 15500 1727096227.99173: Set connection var ansible_connection to ssh 15500 1727096227.99291: variable 'ansible_shell_executable' from source: unknown 15500 1727096227.99294: variable 'ansible_connection' from source: unknown 15500 1727096227.99297: variable 'ansible_module_compression' from source: unknown 15500 1727096227.99299: variable 'ansible_shell_type' from source: unknown 15500 1727096227.99301: variable 'ansible_shell_executable' from source: unknown 15500 1727096227.99303: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096227.99305: variable 'ansible_pipelining' from source: unknown 15500 1727096227.99307: variable 'ansible_timeout' from source: unknown 15500 1727096227.99309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096227.99603: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096227.99613: variable 'omit' from source: magic vars 15500 1727096227.99620: starting attempt loop 15500 1727096227.99623: running the handler 15500 1727096227.99798: variable '__network_connections_result' from source: set_fact 15500 1727096227.99885: variable '__network_connections_result' from source: set_fact 15500 1727096228.00181: handler run complete 15500 1727096228.00206: attempt loop complete, returning result 15500 1727096228.00209: _execute() done 15500 1727096228.00212: dumping result to json 15500 1727096228.00214: done dumping result, returning 15500 1727096228.00311: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-877d-2da0-00000000004d] 15500 1727096228.00357: sending task result for task 0afff68d-5257-877d-2da0-00000000004d 15500 1727096228.00673: done sending task result for task 0afff68d-5257-877d-2da0-00000000004d 15500 1727096228.00677: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 15500 1727096228.00758: no more pending results, returning what we have 15500 1727096228.00761: results queue empty 15500 1727096228.00762: checking for any_errors_fatal 15500 1727096228.00771: done checking for any_errors_fatal 15500 1727096228.00772: checking for max_fail_percentage 15500 1727096228.00774: done checking for max_fail_percentage 15500 1727096228.00775: checking to see if all 
hosts have failed and the running result is not ok 15500 1727096228.00775: done checking to see if all hosts have failed 15500 1727096228.00776: getting the remaining hosts for this loop 15500 1727096228.00778: done getting the remaining hosts for this loop 15500 1727096228.00781: getting the next task for host managed_node1 15500 1727096228.00788: done getting next task for host managed_node1 15500 1727096228.00791: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15500 1727096228.00793: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096228.00803: getting variables 15500 1727096228.00805: in VariableManager get_vars() 15500 1727096228.00839: Calling all_inventory to load vars for managed_node1 15500 1727096228.00842: Calling groups_inventory to load vars for managed_node1 15500 1727096228.00844: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096228.00853: Calling all_plugins_play to load vars for managed_node1 15500 1727096228.00855: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096228.00858: Calling groups_plugins_play to load vars for managed_node1 15500 1727096228.03593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096228.06502: done with get_vars() 15500 1727096228.06538: done getting variables 15500 1727096228.06608: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:57:08 -0400 (0:00:00.100) 0:00:28.109 ****** 15500 1727096228.06653: entering _queue_task() for managed_node1/debug 15500 1727096228.07107: worker is 1 (out of 1 available) 15500 1727096228.07119: exiting _queue_task() for managed_node1/debug 15500 1727096228.07130: done queuing things up, now waiting for results queue to drain 15500 1727096228.07131: waiting for pending results... 
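The role tasks that appear in this stretch of the log correspond roughly to the sketch below: tasks/main.yml:171 ("Configure networking state") and :186 ("Show debug messages for the network_state") are both skipped on the conditional network_state != {}, :177 and :181 are the debug tasks whose output is shown above, and :192 is the connectivity re-test that follows. The sketch is reconstructed only from the task names, conditionals, and variables visible in the log, not copied from the role source; in particular the network_state module's argument name is an assumption.

- name: Configure networking state
  fedora.linux_system_roles.network_state:
    # argument name assumed for illustration; the task hands the role's
    # network_state variable to the provider
    desired_state: "{{ network_state }}"
  when: network_state != {}    # False in this run: network_state is the role default {}

- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result

- name: Re-test connectivity
  ansible.builtin.ping:

Only the two debug tasks and the ping produce output in this run; the network_state tasks are skipped because network_state was never set, and the ping result ({"ping": "pong"}) confirms the managed host is still reachable after the profile was taken down.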
15500 1727096228.07392: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15500 1727096228.07619: in run() - task 0afff68d-5257-877d-2da0-00000000004e 15500 1727096228.07623: variable 'ansible_search_path' from source: unknown 15500 1727096228.07626: variable 'ansible_search_path' from source: unknown 15500 1727096228.07628: calling self._execute() 15500 1727096228.07653: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096228.07660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096228.07781: variable 'omit' from source: magic vars 15500 1727096228.08084: variable 'ansible_distribution_major_version' from source: facts 15500 1727096228.08096: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096228.08228: variable 'network_state' from source: role '' defaults 15500 1727096228.08238: Evaluated conditional (network_state != {}): False 15500 1727096228.08241: when evaluation is False, skipping this task 15500 1727096228.08244: _execute() done 15500 1727096228.08247: dumping result to json 15500 1727096228.08249: done dumping result, returning 15500 1727096228.08265: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-877d-2da0-00000000004e] 15500 1727096228.08272: sending task result for task 0afff68d-5257-877d-2da0-00000000004e 15500 1727096228.08397: done sending task result for task 0afff68d-5257-877d-2da0-00000000004e 15500 1727096228.08401: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 15500 1727096228.08450: no more pending results, returning what we have 15500 1727096228.08461: results queue empty 15500 1727096228.08462: checking for any_errors_fatal 15500 1727096228.08507: done checking for any_errors_fatal 15500 1727096228.08509: checking for max_fail_percentage 15500 1727096228.08511: done checking for max_fail_percentage 15500 1727096228.08511: checking to see if all hosts have failed and the running result is not ok 15500 1727096228.08512: done checking to see if all hosts have failed 15500 1727096228.08513: getting the remaining hosts for this loop 15500 1727096228.08515: done getting the remaining hosts for this loop 15500 1727096228.08519: getting the next task for host managed_node1 15500 1727096228.08526: done getting next task for host managed_node1 15500 1727096228.08531: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15500 1727096228.08534: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096228.08549: getting variables 15500 1727096228.08551: in VariableManager get_vars() 15500 1727096228.08599: Calling all_inventory to load vars for managed_node1 15500 1727096228.08601: Calling groups_inventory to load vars for managed_node1 15500 1727096228.08604: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096228.08789: Calling all_plugins_play to load vars for managed_node1 15500 1727096228.08793: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096228.08797: Calling groups_plugins_play to load vars for managed_node1 15500 1727096228.10387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096228.12102: done with get_vars() 15500 1727096228.12128: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:57:08 -0400 (0:00:00.055) 0:00:28.165 ****** 15500 1727096228.12236: entering _queue_task() for managed_node1/ping 15500 1727096228.12600: worker is 1 (out of 1 available) 15500 1727096228.12620: exiting _queue_task() for managed_node1/ping 15500 1727096228.12634: done queuing things up, now waiting for results queue to drain 15500 1727096228.12636: waiting for pending results... 15500 1727096228.12899: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 15500 1727096228.13086: in run() - task 0afff68d-5257-877d-2da0-00000000004f 15500 1727096228.13090: variable 'ansible_search_path' from source: unknown 15500 1727096228.13093: variable 'ansible_search_path' from source: unknown 15500 1727096228.13096: calling self._execute() 15500 1727096228.13171: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096228.13174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096228.13186: variable 'omit' from source: magic vars 15500 1727096228.13698: variable 'ansible_distribution_major_version' from source: facts 15500 1727096228.13702: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096228.13705: variable 'omit' from source: magic vars 15500 1727096228.13707: variable 'omit' from source: magic vars 15500 1727096228.13709: variable 'omit' from source: magic vars 15500 1727096228.13772: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096228.13776: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096228.13779: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096228.13799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096228.13808: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096228.13839: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096228.13842: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096228.13844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096228.13963: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096228.14225: Set 
connection var ansible_pipelining to False 15500 1727096228.14229: Set connection var ansible_timeout to 10 15500 1727096228.14235: Set connection var ansible_shell_type to sh 15500 1727096228.14237: Set connection var ansible_shell_executable to /bin/sh 15500 1727096228.14239: Set connection var ansible_connection to ssh 15500 1727096228.14241: variable 'ansible_shell_executable' from source: unknown 15500 1727096228.14243: variable 'ansible_connection' from source: unknown 15500 1727096228.14245: variable 'ansible_module_compression' from source: unknown 15500 1727096228.14247: variable 'ansible_shell_type' from source: unknown 15500 1727096228.14249: variable 'ansible_shell_executable' from source: unknown 15500 1727096228.14251: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096228.14253: variable 'ansible_pipelining' from source: unknown 15500 1727096228.14254: variable 'ansible_timeout' from source: unknown 15500 1727096228.14256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096228.14686: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15500 1727096228.14691: variable 'omit' from source: magic vars 15500 1727096228.14694: starting attempt loop 15500 1727096228.14697: running the handler 15500 1727096228.14699: _low_level_execute_command(): starting 15500 1727096228.14701: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096228.15223: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096228.15286: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096228.15337: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096228.15441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096228.17147: stdout chunk (state=3): >>>/root <<< 15500 1727096228.17241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096228.17277: stderr chunk (state=3): >>><<< 15500 1727096228.17281: stdout chunk (state=3): >>><<< 15500 1727096228.17304: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096228.17320: _low_level_execute_command(): starting 15500 1727096228.17328: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096228.1730604-16650-243203668118702 `" && echo ansible-tmp-1727096228.1730604-16650-243203668118702="` echo /root/.ansible/tmp/ansible-tmp-1727096228.1730604-16650-243203668118702 `" ) && sleep 0' 15500 1727096228.17941: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096228.18000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096228.18013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096228.18035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096228.18047: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096228.18266: stderr chunk (state=3): >>>debug2: match not found <<< 15500 1727096228.18271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096228.18287: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15500 1727096228.18292: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 15500 1727096228.18295: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15500 1727096228.18296: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096228.18298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096228.18301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096228.18492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096228.18569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096228.20603: stdout chunk (state=3): 
>>>ansible-tmp-1727096228.1730604-16650-243203668118702=/root/.ansible/tmp/ansible-tmp-1727096228.1730604-16650-243203668118702 <<< 15500 1727096228.21081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096228.21085: stdout chunk (state=3): >>><<< 15500 1727096228.21088: stderr chunk (state=3): >>><<< 15500 1727096228.21090: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096228.1730604-16650-243203668118702=/root/.ansible/tmp/ansible-tmp-1727096228.1730604-16650-243203668118702 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096228.21093: variable 'ansible_module_compression' from source: unknown 15500 1727096228.21095: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 15500 1727096228.21307: variable 'ansible_facts' from source: unknown 15500 1727096228.21584: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096228.1730604-16650-243203668118702/AnsiballZ_ping.py 15500 1727096228.21953: Sending initial data 15500 1727096228.21956: Sent initial data (153 bytes) 15500 1727096228.22752: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096228.22814: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096228.22885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096228.22905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096228.23088: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096228.23212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096228.24906: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096228.25044: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15500 1727096228.25048: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096228.1730604-16650-243203668118702/AnsiballZ_ping.py" <<< 15500 1727096228.25051: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp8usp44l7 /root/.ansible/tmp/ansible-tmp-1727096228.1730604-16650-243203668118702/AnsiballZ_ping.py <<< 15500 1727096228.25274: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp8usp44l7" to remote "/root/.ansible/tmp/ansible-tmp-1727096228.1730604-16650-243203668118702/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096228.1730604-16650-243203668118702/AnsiballZ_ping.py" <<< 15500 1727096228.26111: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096228.26219: stderr chunk (state=3): >>><<< 15500 1727096228.26222: stdout chunk (state=3): >>><<< 15500 1727096228.26224: done transferring module to remote 15500 1727096228.26227: _low_level_execute_command(): starting 15500 1727096228.26229: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096228.1730604-16650-243203668118702/ /root/.ansible/tmp/ansible-tmp-1727096228.1730604-16650-243203668118702/AnsiballZ_ping.py && sleep 0' 15500 1727096228.27104: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096228.27171: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/8e98a30b23' <<< 15500 1727096228.27221: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096228.27554: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096228.29427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096228.29437: stdout chunk (state=3): >>><<< 15500 1727096228.29449: stderr chunk (state=3): >>><<< 15500 1727096228.29480: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096228.29488: _low_level_execute_command(): starting 15500 1727096228.29497: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096228.1730604-16650-243203668118702/AnsiballZ_ping.py && sleep 0' 15500 1727096228.30075: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096228.30110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096228.30195: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096228.30204: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096228.30284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096228.45818: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15500 
1727096228.47348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 15500 1727096228.47352: stdout chunk (state=3): >>><<< 15500 1727096228.47354: stderr chunk (state=3): >>><<< 15500 1727096228.47711: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 15500 1727096228.47719: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096228.1730604-16650-243203668118702/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096228.47726: _low_level_execute_command(): starting 15500 1727096228.47728: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096228.1730604-16650-243203668118702/ > /dev/null 2>&1 && sleep 0' 15500 1727096228.48366: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096228.48479: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096228.48719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096228.48844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096228.50691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096228.50747: stderr chunk (state=3): >>><<< 15500 1727096228.50750: stdout chunk (state=3): >>><<< 15500 1727096228.50773: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096228.50782: handler run complete 15500 1727096228.50798: attempt loop complete, returning result 15500 1727096228.50801: _execute() done 15500 1727096228.50805: dumping result to json 15500 1727096228.50807: done dumping result, returning 15500 1727096228.50825: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-877d-2da0-00000000004f] 15500 1727096228.50828: sending task result for task 0afff68d-5257-877d-2da0-00000000004f ok: [managed_node1] => { "changed": false, "ping": "pong" } 15500 1727096228.50995: done sending task result for task 0afff68d-5257-877d-2da0-00000000004f 15500 1727096228.51017: WORKER PROCESS EXITING 15500 1727096228.51078: no more pending results, returning what we have 15500 1727096228.51084: results queue empty 15500 1727096228.51085: checking for any_errors_fatal 15500 1727096228.51091: done checking for any_errors_fatal 15500 1727096228.51091: checking for max_fail_percentage 15500 1727096228.51093: done checking for max_fail_percentage 15500 1727096228.51094: checking to see if all hosts have failed and the running result is not ok 15500 1727096228.51094: done checking to see if all hosts have failed 15500 1727096228.51095: getting the remaining hosts for this loop 15500 1727096228.51097: done getting the remaining hosts for this loop 15500 1727096228.51102: getting the next task for host managed_node1 15500 1727096228.51111: done getting next task for host managed_node1 15500 1727096228.51114: ^ task is: TASK: meta (role_complete) 15500 1727096228.51116: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096228.51125: getting variables 15500 1727096228.51127: in VariableManager get_vars() 15500 1727096228.51173: Calling all_inventory to load vars for managed_node1 15500 1727096228.51176: Calling groups_inventory to load vars for managed_node1 15500 1727096228.51178: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096228.51188: Calling all_plugins_play to load vars for managed_node1 15500 1727096228.51190: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096228.51193: Calling groups_plugins_play to load vars for managed_node1 15500 1727096228.52584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096228.54275: done with get_vars() 15500 1727096228.54296: done getting variables 15500 1727096228.54355: done queuing things up, now waiting for results queue to drain 15500 1727096228.54359: results queue empty 15500 1727096228.54360: checking for any_errors_fatal 15500 1727096228.54362: done checking for any_errors_fatal 15500 1727096228.54362: checking for max_fail_percentage 15500 1727096228.54363: done checking for max_fail_percentage 15500 1727096228.54364: checking to see if all hosts have failed and the running result is not ok 15500 1727096228.54364: done checking to see if all hosts have failed 15500 1727096228.54365: getting the remaining hosts for this loop 15500 1727096228.54365: done getting the remaining hosts for this loop 15500 1727096228.54369: getting the next task for host managed_node1 15500 1727096228.54372: done getting next task for host managed_node1 15500 1727096228.54373: ^ task is: TASK: meta (flush_handlers) 15500 1727096228.54374: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096228.54376: getting variables 15500 1727096228.54377: in VariableManager get_vars() 15500 1727096228.54386: Calling all_inventory to load vars for managed_node1 15500 1727096228.54388: Calling groups_inventory to load vars for managed_node1 15500 1727096228.54389: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096228.54394: Calling all_plugins_play to load vars for managed_node1 15500 1727096228.54396: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096228.54398: Calling groups_plugins_play to load vars for managed_node1 15500 1727096228.55478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096228.57303: done with get_vars() 15500 1727096228.57336: done getting variables 15500 1727096228.57403: in VariableManager get_vars() 15500 1727096228.57419: Calling all_inventory to load vars for managed_node1 15500 1727096228.57422: Calling groups_inventory to load vars for managed_node1 15500 1727096228.57424: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096228.57430: Calling all_plugins_play to load vars for managed_node1 15500 1727096228.57432: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096228.57436: Calling groups_plugins_play to load vars for managed_node1 15500 1727096228.58573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096228.59808: done with get_vars() 15500 1727096228.59830: done queuing things up, now waiting for results queue to drain 15500 1727096228.59832: results queue empty 15500 1727096228.59833: checking for any_errors_fatal 15500 1727096228.59835: done checking for any_errors_fatal 15500 1727096228.59835: checking for max_fail_percentage 15500 1727096228.59836: done checking for max_fail_percentage 15500 1727096228.59836: checking to see if all hosts have failed and the running result is not ok 15500 1727096228.59837: done checking to see if all hosts have failed 15500 1727096228.59837: getting the remaining hosts for this loop 15500 1727096228.59838: done getting the remaining hosts for this loop 15500 1727096228.59840: getting the next task for host managed_node1 15500 1727096228.59843: done getting next task for host managed_node1 15500 1727096228.59847: ^ task is: TASK: meta (flush_handlers) 15500 1727096228.59849: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096228.59852: getting variables 15500 1727096228.59853: in VariableManager get_vars() 15500 1727096228.59866: Calling all_inventory to load vars for managed_node1 15500 1727096228.59870: Calling groups_inventory to load vars for managed_node1 15500 1727096228.59872: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096228.59877: Calling all_plugins_play to load vars for managed_node1 15500 1727096228.59879: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096228.59882: Calling groups_plugins_play to load vars for managed_node1 15500 1727096228.60784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096228.61874: done with get_vars() 15500 1727096228.61892: done getting variables 15500 1727096228.61942: in VariableManager get_vars() 15500 1727096228.61951: Calling all_inventory to load vars for managed_node1 15500 1727096228.61953: Calling groups_inventory to load vars for managed_node1 15500 1727096228.61954: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096228.61957: Calling all_plugins_play to load vars for managed_node1 15500 1727096228.61961: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096228.61963: Calling groups_plugins_play to load vars for managed_node1 15500 1727096228.62977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096228.64011: done with get_vars() 15500 1727096228.64032: done queuing things up, now waiting for results queue to drain 15500 1727096228.64033: results queue empty 15500 1727096228.64034: checking for any_errors_fatal 15500 1727096228.64035: done checking for any_errors_fatal 15500 1727096228.64036: checking for max_fail_percentage 15500 1727096228.64037: done checking for max_fail_percentage 15500 1727096228.64037: checking to see if all hosts have failed and the running result is not ok 15500 1727096228.64038: done checking to see if all hosts have failed 15500 1727096228.64039: getting the remaining hosts for this loop 15500 1727096228.64040: done getting the remaining hosts for this loop 15500 1727096228.64042: getting the next task for host managed_node1 15500 1727096228.64044: done getting next task for host managed_node1 15500 1727096228.64044: ^ task is: None 15500 1727096228.64045: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096228.64046: done queuing things up, now waiting for results queue to drain 15500 1727096228.64047: results queue empty 15500 1727096228.64047: checking for any_errors_fatal 15500 1727096228.64048: done checking for any_errors_fatal 15500 1727096228.64048: checking for max_fail_percentage 15500 1727096228.64049: done checking for max_fail_percentage 15500 1727096228.64049: checking to see if all hosts have failed and the running result is not ok 15500 1727096228.64049: done checking to see if all hosts have failed 15500 1727096228.64050: getting the next task for host managed_node1 15500 1727096228.64052: done getting next task for host managed_node1 15500 1727096228.64053: ^ task is: None 15500 1727096228.64054: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096228.64093: in VariableManager get_vars() 15500 1727096228.64106: done with get_vars() 15500 1727096228.64110: in VariableManager get_vars() 15500 1727096228.64116: done with get_vars() 15500 1727096228.64118: variable 'omit' from source: magic vars 15500 1727096228.64140: in VariableManager get_vars() 15500 1727096228.64147: done with get_vars() 15500 1727096228.64163: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 15500 1727096228.64288: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15500 1727096228.64309: getting the remaining hosts for this loop 15500 1727096228.64310: done getting the remaining hosts for this loop 15500 1727096228.64311: getting the next task for host managed_node1 15500 1727096228.64313: done getting next task for host managed_node1 15500 1727096228.64315: ^ task is: TASK: Gathering Facts 15500 1727096228.64316: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096228.64317: getting variables 15500 1727096228.64318: in VariableManager get_vars() 15500 1727096228.64323: Calling all_inventory to load vars for managed_node1 15500 1727096228.64325: Calling groups_inventory to load vars for managed_node1 15500 1727096228.64326: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096228.64330: Calling all_plugins_play to load vars for managed_node1 15500 1727096228.64332: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096228.64333: Calling groups_plugins_play to load vars for managed_node1 15500 1727096228.65066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096228.70795: done with get_vars() 15500 1727096228.70815: done getting variables 15500 1727096228.70849: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Monday 23 September 2024 08:57:08 -0400 (0:00:00.586) 0:00:28.751 ****** 15500 1727096228.70872: entering _queue_task() for managed_node1/gather_facts 15500 1727096228.71139: worker is 1 (out of 1 available) 15500 1727096228.71150: exiting _queue_task() for managed_node1/gather_facts 15500 1727096228.71166: done queuing things up, now waiting for results queue to drain 15500 1727096228.71169: waiting for pending results... 
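
A note on reading the entries in this trace: each one follows the pattern "<worker pid> <epoch timestamp>: <message>", and the task banner above also reports per-task timing (0:00:00.586) and cumulative runtime (0:00:28.751). The short Python sketch below shows one way to pull timings out of such a capture. It is illustrative only; parse_entry and elapsed_between are hypothetical helper names, not part of ansible-core, and the two sample strings are entries copied from this section.

import re

# Assumed layout of one verbose entry: "<pid> <epoch seconds>: <message>"
ENTRY = re.compile(r"^(?P<pid>\d+) (?P<epoch>\d+\.\d+): (?P<msg>.*)$")

def parse_entry(line):
    """Split one verbose log entry into (pid, epoch_seconds, message), or None."""
    m = ENTRY.match(line)
    if m is None:
        return None
    return int(m.group("pid")), float(m.group("epoch")), m.group("msg")

def elapsed_between(first, second):
    """Seconds elapsed between two entries, e.g. queueing a task and finishing it."""
    _, t0, _ = parse_entry(first)
    _, t1, _ = parse_entry(second)
    return t1 - t0

start = "15500 1727096228.70872: entering _queue_task() for managed_node1/gather_facts"
done = "15500 1727096229.58300: done running TaskExecutor() for managed_node1/TASK: Gathering Facts"
print(round(elapsed_between(start, done), 3))  # roughly 0.874 seconds for this fact gather

Applied across a whole capture like this one, the same pattern makes it easy to spot which tasks (here, Gathering Facts for the "Delete the interface" play) dominate the wall-clock time shown in the banners.
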
15500 1727096228.71340: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15500 1727096228.71422: in run() - task 0afff68d-5257-877d-2da0-000000000382 15500 1727096228.71432: variable 'ansible_search_path' from source: unknown 15500 1727096228.71464: calling self._execute() 15500 1727096228.71533: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096228.71540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096228.71548: variable 'omit' from source: magic vars 15500 1727096228.71845: variable 'ansible_distribution_major_version' from source: facts 15500 1727096228.71855: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096228.71863: variable 'omit' from source: magic vars 15500 1727096228.71885: variable 'omit' from source: magic vars 15500 1727096228.71911: variable 'omit' from source: magic vars 15500 1727096228.71945: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096228.71975: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096228.71992: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096228.72005: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096228.72013: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096228.72036: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096228.72039: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096228.72043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096228.72119: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096228.72122: Set connection var ansible_pipelining to False 15500 1727096228.72128: Set connection var ansible_timeout to 10 15500 1727096228.72131: Set connection var ansible_shell_type to sh 15500 1727096228.72135: Set connection var ansible_shell_executable to /bin/sh 15500 1727096228.72140: Set connection var ansible_connection to ssh 15500 1727096228.72162: variable 'ansible_shell_executable' from source: unknown 15500 1727096228.72165: variable 'ansible_connection' from source: unknown 15500 1727096228.72169: variable 'ansible_module_compression' from source: unknown 15500 1727096228.72173: variable 'ansible_shell_type' from source: unknown 15500 1727096228.72175: variable 'ansible_shell_executable' from source: unknown 15500 1727096228.72177: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096228.72180: variable 'ansible_pipelining' from source: unknown 15500 1727096228.72186: variable 'ansible_timeout' from source: unknown 15500 1727096228.72188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096228.72321: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096228.72330: variable 'omit' from source: magic vars 15500 1727096228.72334: starting attempt loop 15500 1727096228.72338: running the 
handler 15500 1727096228.72350: variable 'ansible_facts' from source: unknown 15500 1727096228.72372: _low_level_execute_command(): starting 15500 1727096228.72445: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096228.73231: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096228.73245: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096228.73257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096228.73278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096228.73379: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096228.73400: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096228.73414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096228.73564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096228.75294: stdout chunk (state=3): >>>/root <<< 15500 1727096228.75460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096228.75476: stdout chunk (state=3): >>><<< 15500 1727096228.75490: stderr chunk (state=3): >>><<< 15500 1727096228.75520: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096228.75586: _low_level_execute_command(): starting 15500 1727096228.75607: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727096228.7557123-16680-255238443337954 `" && echo ansible-tmp-1727096228.7557123-16680-255238443337954="` echo /root/.ansible/tmp/ansible-tmp-1727096228.7557123-16680-255238443337954 `" ) && sleep 0' 15500 1727096228.76284: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096228.76332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096228.76347: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096228.76377: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096228.76503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096228.78483: stdout chunk (state=3): >>>ansible-tmp-1727096228.7557123-16680-255238443337954=/root/.ansible/tmp/ansible-tmp-1727096228.7557123-16680-255238443337954 <<< 15500 1727096228.78591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096228.78618: stderr chunk (state=3): >>><<< 15500 1727096228.78622: stdout chunk (state=3): >>><<< 15500 1727096228.78641: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096228.7557123-16680-255238443337954=/root/.ansible/tmp/ansible-tmp-1727096228.7557123-16680-255238443337954 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096228.78673: variable 'ansible_module_compression' from source: unknown 15500 1727096228.78715: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15500 1727096228.78765: variable 'ansible_facts' from source: unknown 15500 1727096228.78898: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096228.7557123-16680-255238443337954/AnsiballZ_setup.py 15500 1727096228.79021: Sending initial data 15500 1727096228.79024: Sent initial data (154 bytes) 15500 1727096228.79720: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096228.79797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096228.79827: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096228.79842: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096228.79938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096228.81565: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096228.81623: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15500 1727096228.81690: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpy554vy1s /root/.ansible/tmp/ansible-tmp-1727096228.7557123-16680-255238443337954/AnsiballZ_setup.py <<< 15500 1727096228.81694: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096228.7557123-16680-255238443337954/AnsiballZ_setup.py" <<< 15500 1727096228.81758: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpy554vy1s" to remote "/root/.ansible/tmp/ansible-tmp-1727096228.7557123-16680-255238443337954/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096228.7557123-16680-255238443337954/AnsiballZ_setup.py" <<< 15500 1727096228.82934: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096228.83101: stderr chunk (state=3): >>><<< 15500 1727096228.83104: stdout chunk (state=3): >>><<< 15500 1727096228.83107: done transferring module to remote 15500 1727096228.83109: _low_level_execute_command(): starting 15500 1727096228.83111: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096228.7557123-16680-255238443337954/ /root/.ansible/tmp/ansible-tmp-1727096228.7557123-16680-255238443337954/AnsiballZ_setup.py && sleep 0' 15500 1727096228.83687: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096228.83703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096228.83725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096228.83745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096228.83762: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096228.83839: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096228.83875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096228.83891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096228.83915: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096228.84020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096228.85986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096228.85990: stdout chunk (state=3): >>><<< 15500 1727096228.85992: stderr chunk (state=3): >>><<< 15500 1727096228.86098: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096228.86105: _low_level_execute_command(): starting 15500 1727096228.86108: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096228.7557123-16680-255238443337954/AnsiballZ_setup.py && sleep 0' 15500 1727096228.86718: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096228.86737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096228.86752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096228.86787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096228.86895: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096228.86917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096228.87041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096229.51736: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", 
"ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fi<<< 15500 1727096229.51806: stdout chunk (state=3): >>>xed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": 
"off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", 
"mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2944, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 587, "free": 2944}, "nocache": {"free": 3281, "used": 250}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 382, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797625856, "block_size": 4096, "block_total": 65519099, "block_available": 63915436, "block_used": 1603663, "inode_total": 131070960, "inode_available": 131029099, "inode_used": 41861, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], 
"ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "57", "second": "09", "epoch": "1727096229", "epoch_int": "1727096229", "date": "2024-09-23", "time": "08:57:09", "iso8601_micro": "2024-09-23T12:57:09.512978Z", "iso8601": "2024-09-23T12:57:09Z", "iso8601_basic": "20240923T085709512978", "iso8601_basic_short": "20240923T085709", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.4580078125, "5m": 0.330078125, "15m": 0.1552734375}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15500 1727096229.53858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 15500 1727096229.53863: stdout chunk (state=3): >>><<< 15500 1727096229.53866: stderr chunk (state=3): >>><<< 15500 1727096229.54074: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": 
"on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off 
[fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2944, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 587, "free": 2944}, "nocache": {"free": 3281, "used": 250}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 382, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": 
"rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797625856, "block_size": 4096, "block_total": 65519099, "block_available": 63915436, "block_used": 1603663, "inode_total": 131070960, "inode_available": 131029099, "inode_used": 41861, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "57", "second": "09", "epoch": "1727096229", "epoch_int": "1727096229", "date": "2024-09-23", "time": "08:57:09", "iso8601_micro": "2024-09-23T12:57:09.512978Z", "iso8601": "2024-09-23T12:57:09Z", "iso8601_basic": "20240923T085709512978", "iso8601_basic_short": "20240923T085709", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.4580078125, "5m": 0.330078125, "15m": 0.1552734375}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, 
"invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 15500 1727096229.54266: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096228.7557123-16680-255238443337954/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096229.54301: _low_level_execute_command(): starting 15500 1727096229.54311: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096228.7557123-16680-255238443337954/ > /dev/null 2>&1 && sleep 0' 15500 1727096229.54924: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096229.54943: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096229.54958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096229.54981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096229.55000: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096229.55048: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096229.55107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096229.55122: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096229.55270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096229.55369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096229.57314: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096229.57341: stdout chunk (state=3): >>><<< 15500 1727096229.57352: stderr chunk (state=3): >>><<< 15500 1727096229.57376: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096229.57390: handler run complete 15500 1727096229.57531: variable 'ansible_facts' from source: unknown 15500 1727096229.57647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096229.57954: variable 'ansible_facts' from source: unknown 15500 1727096229.58036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096229.58190: attempt loop complete, returning result 15500 1727096229.58193: _execute() done 15500 1727096229.58195: dumping result to json 15500 1727096229.58297: done dumping result, returning 15500 1727096229.58300: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-877d-2da0-000000000382] 15500 1727096229.58303: sending task result for task 0afff68d-5257-877d-2da0-000000000382 15500 1727096229.58869: done sending task result for task 0afff68d-5257-877d-2da0-000000000382 15500 1727096229.58873: WORKER PROCESS EXITING ok: [managed_node1] 15500 1727096229.59247: no more pending results, returning what we have 15500 1727096229.59250: results queue empty 15500 1727096229.59251: checking for any_errors_fatal 15500 1727096229.59253: done checking for any_errors_fatal 15500 1727096229.59253: checking for max_fail_percentage 15500 1727096229.59255: done checking for max_fail_percentage 15500 1727096229.59256: checking to see if all hosts have failed and the running result is not ok 15500 1727096229.59257: done checking to see if all hosts have failed 15500 1727096229.59258: getting the remaining hosts 
for this loop 15500 1727096229.59259: done getting the remaining hosts for this loop 15500 1727096229.59263: getting the next task for host managed_node1 15500 1727096229.59270: done getting next task for host managed_node1 15500 1727096229.59272: ^ task is: TASK: meta (flush_handlers) 15500 1727096229.59274: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096229.59351: getting variables 15500 1727096229.59353: in VariableManager get_vars() 15500 1727096229.59378: Calling all_inventory to load vars for managed_node1 15500 1727096229.59382: Calling groups_inventory to load vars for managed_node1 15500 1727096229.59392: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096229.59403: Calling all_plugins_play to load vars for managed_node1 15500 1727096229.59406: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096229.59409: Calling groups_plugins_play to load vars for managed_node1 15500 1727096229.60751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096229.62391: done with get_vars() 15500 1727096229.62422: done getting variables 15500 1727096229.62508: in VariableManager get_vars() 15500 1727096229.62520: Calling all_inventory to load vars for managed_node1 15500 1727096229.62523: Calling groups_inventory to load vars for managed_node1 15500 1727096229.62525: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096229.62531: Calling all_plugins_play to load vars for managed_node1 15500 1727096229.62533: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096229.62543: Calling groups_plugins_play to load vars for managed_node1 15500 1727096229.63786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096229.65460: done with get_vars() 15500 1727096229.65504: done queuing things up, now waiting for results queue to drain 15500 1727096229.65507: results queue empty 15500 1727096229.65507: checking for any_errors_fatal 15500 1727096229.65517: done checking for any_errors_fatal 15500 1727096229.65518: checking for max_fail_percentage 15500 1727096229.65519: done checking for max_fail_percentage 15500 1727096229.65520: checking to see if all hosts have failed and the running result is not ok 15500 1727096229.65521: done checking to see if all hosts have failed 15500 1727096229.65521: getting the remaining hosts for this loop 15500 1727096229.65522: done getting the remaining hosts for this loop 15500 1727096229.65526: getting the next task for host managed_node1 15500 1727096229.65531: done getting next task for host managed_node1 15500 1727096229.65533: ^ task is: TASK: Include the task 'delete_interface.yml' 15500 1727096229.65535: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096229.65537: getting variables 15500 1727096229.65538: in VariableManager get_vars() 15500 1727096229.65548: Calling all_inventory to load vars for managed_node1 15500 1727096229.65550: Calling groups_inventory to load vars for managed_node1 15500 1727096229.65553: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096229.65558: Calling all_plugins_play to load vars for managed_node1 15500 1727096229.65561: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096229.65564: Calling groups_plugins_play to load vars for managed_node1 15500 1727096229.66752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096229.68417: done with get_vars() 15500 1727096229.68439: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Monday 23 September 2024 08:57:09 -0400 (0:00:00.976) 0:00:29.728 ****** 15500 1727096229.68528: entering _queue_task() for managed_node1/include_tasks 15500 1727096229.68966: worker is 1 (out of 1 available) 15500 1727096229.69096: exiting _queue_task() for managed_node1/include_tasks 15500 1727096229.69111: done queuing things up, now waiting for results queue to drain 15500 1727096229.69112: waiting for pending results... 15500 1727096229.69352: running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' 15500 1727096229.69532: in run() - task 0afff68d-5257-877d-2da0-000000000052 15500 1727096229.69537: variable 'ansible_search_path' from source: unknown 15500 1727096229.69540: calling self._execute() 15500 1727096229.69650: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096229.69673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096229.69690: variable 'omit' from source: magic vars 15500 1727096229.70138: variable 'ansible_distribution_major_version' from source: facts 15500 1727096229.70156: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096229.70170: _execute() done 15500 1727096229.70294: dumping result to json 15500 1727096229.70297: done dumping result, returning 15500 1727096229.70299: done running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' [0afff68d-5257-877d-2da0-000000000052] 15500 1727096229.70302: sending task result for task 0afff68d-5257-877d-2da0-000000000052 15500 1727096229.70386: done sending task result for task 0afff68d-5257-877d-2da0-000000000052 15500 1727096229.70389: WORKER PROCESS EXITING 15500 1727096229.70425: no more pending results, returning what we have 15500 1727096229.70430: in VariableManager get_vars() 15500 1727096229.70470: Calling all_inventory to load vars for managed_node1 15500 1727096229.70474: Calling groups_inventory to load vars for managed_node1 15500 1727096229.70478: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096229.70493: Calling all_plugins_play to load vars for managed_node1 15500 1727096229.70496: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096229.70500: Calling groups_plugins_play to load vars for managed_node1 15500 1727096229.72149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096229.73788: done with get_vars() 15500 
1727096229.73818: variable 'ansible_search_path' from source: unknown 15500 1727096229.73835: we have included files to process 15500 1727096229.73836: generating all_blocks data 15500 1727096229.73837: done generating all_blocks data 15500 1727096229.73838: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15500 1727096229.73839: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15500 1727096229.73841: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15500 1727096229.74085: done processing included file 15500 1727096229.74087: iterating over new_blocks loaded from include file 15500 1727096229.74088: in VariableManager get_vars() 15500 1727096229.74100: done with get_vars() 15500 1727096229.74102: filtering new block on tags 15500 1727096229.74116: done filtering new block on tags 15500 1727096229.74118: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node1 15500 1727096229.74123: extending task lists for all hosts with included blocks 15500 1727096229.74160: done extending task lists 15500 1727096229.74162: done processing included files 15500 1727096229.74162: results queue empty 15500 1727096229.74163: checking for any_errors_fatal 15500 1727096229.74165: done checking for any_errors_fatal 15500 1727096229.74165: checking for max_fail_percentage 15500 1727096229.74166: done checking for max_fail_percentage 15500 1727096229.74169: checking to see if all hosts have failed and the running result is not ok 15500 1727096229.74170: done checking to see if all hosts have failed 15500 1727096229.74171: getting the remaining hosts for this loop 15500 1727096229.74172: done getting the remaining hosts for this loop 15500 1727096229.74174: getting the next task for host managed_node1 15500 1727096229.74178: done getting next task for host managed_node1 15500 1727096229.74181: ^ task is: TASK: Remove test interface if necessary 15500 1727096229.74183: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096229.74185: getting variables 15500 1727096229.74186: in VariableManager get_vars() 15500 1727096229.74195: Calling all_inventory to load vars for managed_node1 15500 1727096229.74198: Calling groups_inventory to load vars for managed_node1 15500 1727096229.74200: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096229.74205: Calling all_plugins_play to load vars for managed_node1 15500 1727096229.74208: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096229.74210: Calling groups_plugins_play to load vars for managed_node1 15500 1727096229.75527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096229.77176: done with get_vars() 15500 1727096229.77209: done getting variables 15500 1727096229.77250: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Monday 23 September 2024 08:57:09 -0400 (0:00:00.087) 0:00:29.815 ****** 15500 1727096229.77280: entering _queue_task() for managed_node1/command 15500 1727096229.77650: worker is 1 (out of 1 available) 15500 1727096229.77664: exiting _queue_task() for managed_node1/command 15500 1727096229.77679: done queuing things up, now waiting for results queue to drain 15500 1727096229.77680: waiting for pending results... 15500 1727096229.78080: running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary 15500 1727096229.78085: in run() - task 0afff68d-5257-877d-2da0-000000000393 15500 1727096229.78089: variable 'ansible_search_path' from source: unknown 15500 1727096229.78092: variable 'ansible_search_path' from source: unknown 15500 1727096229.78126: calling self._execute() 15500 1727096229.78229: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096229.78243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096229.78258: variable 'omit' from source: magic vars 15500 1727096229.78679: variable 'ansible_distribution_major_version' from source: facts 15500 1727096229.78696: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096229.78720: variable 'omit' from source: magic vars 15500 1727096229.78763: variable 'omit' from source: magic vars 15500 1727096229.78931: variable 'interface' from source: set_fact 15500 1727096229.78935: variable 'omit' from source: magic vars 15500 1727096229.78945: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096229.78983: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096229.79006: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096229.79024: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096229.79047: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 
1727096229.79080: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096229.79088: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096229.79094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096229.79201: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096229.79212: Set connection var ansible_pipelining to False 15500 1727096229.79222: Set connection var ansible_timeout to 10 15500 1727096229.79258: Set connection var ansible_shell_type to sh 15500 1727096229.79261: Set connection var ansible_shell_executable to /bin/sh 15500 1727096229.79264: Set connection var ansible_connection to ssh 15500 1727096229.79282: variable 'ansible_shell_executable' from source: unknown 15500 1727096229.79290: variable 'ansible_connection' from source: unknown 15500 1727096229.79297: variable 'ansible_module_compression' from source: unknown 15500 1727096229.79369: variable 'ansible_shell_type' from source: unknown 15500 1727096229.79372: variable 'ansible_shell_executable' from source: unknown 15500 1727096229.79374: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096229.79376: variable 'ansible_pipelining' from source: unknown 15500 1727096229.79378: variable 'ansible_timeout' from source: unknown 15500 1727096229.79380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096229.79483: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096229.79498: variable 'omit' from source: magic vars 15500 1727096229.79507: starting attempt loop 15500 1727096229.79514: running the handler 15500 1727096229.79532: _low_level_execute_command(): starting 15500 1727096229.79544: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096229.80288: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096229.80376: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096229.80428: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096229.80447: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096229.80487: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096229.80613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
<<< 15500 1727096229.82373: stdout chunk (state=3): >>>/root <<< 15500 1727096229.82544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096229.82550: stdout chunk (state=3): >>><<< 15500 1727096229.82554: stderr chunk (state=3): >>><<< 15500 1727096229.82683: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096229.82687: _low_level_execute_command(): starting 15500 1727096229.82690: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096229.825872-16718-130827100329782 `" && echo ansible-tmp-1727096229.825872-16718-130827100329782="` echo /root/.ansible/tmp/ansible-tmp-1727096229.825872-16718-130827100329782 `" ) && sleep 0' 15500 1727096229.83274: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096229.83288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096229.83311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096229.83428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096229.83458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096229.83571: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096229.85582: stdout chunk (state=3): 
>>>ansible-tmp-1727096229.825872-16718-130827100329782=/root/.ansible/tmp/ansible-tmp-1727096229.825872-16718-130827100329782 <<< 15500 1727096229.85674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096229.85881: stderr chunk (state=3): >>><<< 15500 1727096229.85885: stdout chunk (state=3): >>><<< 15500 1727096229.85887: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096229.825872-16718-130827100329782=/root/.ansible/tmp/ansible-tmp-1727096229.825872-16718-130827100329782 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096229.85890: variable 'ansible_module_compression' from source: unknown 15500 1727096229.85892: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15500 1727096229.85894: variable 'ansible_facts' from source: unknown 15500 1727096229.85941: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096229.825872-16718-130827100329782/AnsiballZ_command.py 15500 1727096229.86173: Sending initial data 15500 1727096229.86179: Sent initial data (155 bytes) 15500 1727096229.86671: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096229.86680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096229.86690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096229.86789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096229.86793: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096229.86805: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096229.86825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096229.86918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096229.88655: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096229.88762: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15500 1727096229.88859: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpk05c6n42 /root/.ansible/tmp/ansible-tmp-1727096229.825872-16718-130827100329782/AnsiballZ_command.py <<< 15500 1727096229.88863: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096229.825872-16718-130827100329782/AnsiballZ_command.py" <<< 15500 1727096229.88962: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpk05c6n42" to remote "/root/.ansible/tmp/ansible-tmp-1727096229.825872-16718-130827100329782/AnsiballZ_command.py" <<< 15500 1727096229.88989: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096229.825872-16718-130827100329782/AnsiballZ_command.py" <<< 15500 1727096229.89886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096229.89920: stderr chunk (state=3): >>><<< 15500 1727096229.90044: stdout chunk (state=3): >>><<< 15500 1727096229.90047: done transferring module to remote 15500 1727096229.90050: _low_level_execute_command(): starting 15500 1727096229.90052: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096229.825872-16718-130827100329782/ /root/.ansible/tmp/ansible-tmp-1727096229.825872-16718-130827100329782/AnsiballZ_command.py && sleep 0' 15500 1727096229.90714: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096229.90783: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 
originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096229.90841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096229.90856: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096229.90876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096229.90971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096229.92909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096229.92938: stdout chunk (state=3): >>><<< 15500 1727096229.92949: stderr chunk (state=3): >>><<< 15500 1727096229.93063: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096229.93067: _low_level_execute_command(): starting 15500 1727096229.93081: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096229.825872-16718-130827100329782/AnsiballZ_command.py && sleep 0' 15500 1727096229.93714: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096229.93730: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 15500 1727096229.93832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096229.93853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096229.93884: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096229.93991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096230.10537: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"LSR-TST-br31\"", "rc": 1, "cmd": ["ip", "link", "del", "LSR-TST-br31"], "start": "2024-09-23 08:57:10.096266", "end": "2024-09-23 08:57:10.103840", "delta": "0:00:00.007574", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del LSR-TST-br31", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15500 1727096230.12355: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.125 closed. <<< 15500 1727096230.12551: stderr chunk (state=3): >>><<< 15500 1727096230.12554: stdout chunk (state=3): >>><<< 15500 1727096230.12557: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"LSR-TST-br31\"", "rc": 1, "cmd": ["ip", "link", "del", "LSR-TST-br31"], "start": "2024-09-23 08:57:10.096266", "end": "2024-09-23 08:57:10.103840", "delta": "0:00:00.007574", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del LSR-TST-br31", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.11.125 closed. 
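The command result dumped just above belongs to the task "Remove test interface if necessary", which was loaded from tasks/delete_interface.yml earlier in this log. A plausible reconstruction of that task, inferred only from the module arguments ("ip link del LSR-TST-br31"), the 'interface' variable resolved from set_fact, and the "...ignoring" outcome shown below, is sketched here; the real file may differ:

    # tasks/delete_interface.yml -- hypothetical sketch based on this trace
    - name: Remove test interface if necessary
      command: ip link del {{ interface }}   # renders to "ip link del LSR-TST-br31" in this run
      ignore_errors: true                    # rc=1 ("Cannot find device") is reported but the play continues
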
15500 1727096230.12562: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096229.825872-16718-130827100329782/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096230.12564: _low_level_execute_command(): starting 15500 1727096230.12566: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096229.825872-16718-130827100329782/ > /dev/null 2>&1 && sleep 0' 15500 1727096230.13424: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096230.13429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096230.13433: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096230.13436: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096230.13594: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096230.13614: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096230.13907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096230.15834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096230.15839: stdout chunk (state=3): >>><<< 15500 1727096230.15844: stderr chunk (state=3): >>><<< 15500 1727096230.15864: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096230.15871: handler run complete 15500 1727096230.15909: Evaluated conditional (False): False 15500 1727096230.15913: attempt loop complete, returning result 15500 1727096230.15915: _execute() done 15500 1727096230.15917: dumping result to json 15500 1727096230.15919: done dumping result, returning 15500 1727096230.15972: done running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary [0afff68d-5257-877d-2da0-000000000393] 15500 1727096230.15975: sending task result for task 0afff68d-5257-877d-2da0-000000000393 fatal: [managed_node1]: FAILED! => { "changed": false, "cmd": [ "ip", "link", "del", "LSR-TST-br31" ], "delta": "0:00:00.007574", "end": "2024-09-23 08:57:10.103840", "rc": 1, "start": "2024-09-23 08:57:10.096266" } STDERR: Cannot find device "LSR-TST-br31" MSG: non-zero return code ...ignoring 15500 1727096230.16208: no more pending results, returning what we have 15500 1727096230.16212: results queue empty 15500 1727096230.16212: checking for any_errors_fatal 15500 1727096230.16215: done checking for any_errors_fatal 15500 1727096230.16215: checking for max_fail_percentage 15500 1727096230.16217: done checking for max_fail_percentage 15500 1727096230.16218: checking to see if all hosts have failed and the running result is not ok 15500 1727096230.16219: done checking to see if all hosts have failed 15500 1727096230.16219: getting the remaining hosts for this loop 15500 1727096230.16273: done getting the remaining hosts for this loop 15500 1727096230.16278: getting the next task for host managed_node1 15500 1727096230.16285: done getting next task for host managed_node1 15500 1727096230.16287: ^ task is: TASK: meta (flush_handlers) 15500 1727096230.16290: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096230.16294: getting variables 15500 1727096230.16296: in VariableManager get_vars() 15500 1727096230.16324: Calling all_inventory to load vars for managed_node1 15500 1727096230.16327: Calling groups_inventory to load vars for managed_node1 15500 1727096230.16445: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096230.16460: Calling all_plugins_play to load vars for managed_node1 15500 1727096230.16465: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096230.16470: Calling groups_plugins_play to load vars for managed_node1 15500 1727096230.17044: done sending task result for task 0afff68d-5257-877d-2da0-000000000393 15500 1727096230.17048: WORKER PROCESS EXITING 15500 1727096230.19374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096230.22971: done with get_vars() 15500 1727096230.23006: done getting variables 15500 1727096230.23202: in VariableManager get_vars() 15500 1727096230.23214: Calling all_inventory to load vars for managed_node1 15500 1727096230.23216: Calling groups_inventory to load vars for managed_node1 15500 1727096230.23218: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096230.23224: Calling all_plugins_play to load vars for managed_node1 15500 1727096230.23226: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096230.23229: Calling groups_plugins_play to load vars for managed_node1 15500 1727096230.26294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096230.28720: done with get_vars() 15500 1727096230.28763: done queuing things up, now waiting for results queue to drain 15500 1727096230.28765: results queue empty 15500 1727096230.28766: checking for any_errors_fatal 15500 1727096230.28772: done checking for any_errors_fatal 15500 1727096230.28773: checking for max_fail_percentage 15500 1727096230.28774: done checking for max_fail_percentage 15500 1727096230.28775: checking to see if all hosts have failed and the running result is not ok 15500 1727096230.28775: done checking to see if all hosts have failed 15500 1727096230.28776: getting the remaining hosts for this loop 15500 1727096230.28777: done getting the remaining hosts for this loop 15500 1727096230.28780: getting the next task for host managed_node1 15500 1727096230.28784: done getting next task for host managed_node1 15500 1727096230.28865: ^ task is: TASK: meta (flush_handlers) 15500 1727096230.28870: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096230.28874: getting variables 15500 1727096230.28875: in VariableManager get_vars() 15500 1727096230.28886: Calling all_inventory to load vars for managed_node1 15500 1727096230.28888: Calling groups_inventory to load vars for managed_node1 15500 1727096230.28890: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096230.28918: Calling all_plugins_play to load vars for managed_node1 15500 1727096230.28922: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096230.28925: Calling groups_plugins_play to load vars for managed_node1 15500 1727096230.30205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096230.33366: done with get_vars() 15500 1727096230.33397: done getting variables 15500 1727096230.33579: in VariableManager get_vars() 15500 1727096230.33591: Calling all_inventory to load vars for managed_node1 15500 1727096230.33593: Calling groups_inventory to load vars for managed_node1 15500 1727096230.33596: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096230.33601: Calling all_plugins_play to load vars for managed_node1 15500 1727096230.33603: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096230.33606: Calling groups_plugins_play to load vars for managed_node1 15500 1727096230.36828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096230.38810: done with get_vars() 15500 1727096230.38845: done queuing things up, now waiting for results queue to drain 15500 1727096230.38848: results queue empty 15500 1727096230.38848: checking for any_errors_fatal 15500 1727096230.38850: done checking for any_errors_fatal 15500 1727096230.38851: checking for max_fail_percentage 15500 1727096230.38852: done checking for max_fail_percentage 15500 1727096230.38852: checking to see if all hosts have failed and the running result is not ok 15500 1727096230.38853: done checking to see if all hosts have failed 15500 1727096230.38854: getting the remaining hosts for this loop 15500 1727096230.38855: done getting the remaining hosts for this loop 15500 1727096230.38857: getting the next task for host managed_node1 15500 1727096230.38861: done getting next task for host managed_node1 15500 1727096230.38862: ^ task is: None 15500 1727096230.38863: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096230.38864: done queuing things up, now waiting for results queue to drain 15500 1727096230.38865: results queue empty 15500 1727096230.38866: checking for any_errors_fatal 15500 1727096230.38866: done checking for any_errors_fatal 15500 1727096230.38872: checking for max_fail_percentage 15500 1727096230.38873: done checking for max_fail_percentage 15500 1727096230.38873: checking to see if all hosts have failed and the running result is not ok 15500 1727096230.38874: done checking to see if all hosts have failed 15500 1727096230.38875: getting the next task for host managed_node1 15500 1727096230.38878: done getting next task for host managed_node1 15500 1727096230.38879: ^ task is: None 15500 1727096230.38880: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096230.38989: in VariableManager get_vars() 15500 1727096230.39013: done with get_vars() 15500 1727096230.39020: in VariableManager get_vars() 15500 1727096230.39033: done with get_vars() 15500 1727096230.39039: variable 'omit' from source: magic vars 15500 1727096230.39384: variable 'profile' from source: play vars 15500 1727096230.39574: in VariableManager get_vars() 15500 1727096230.39589: done with get_vars() 15500 1727096230.39616: variable 'omit' from source: magic vars 15500 1727096230.39785: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 15500 1727096230.41216: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15500 1727096230.41477: getting the remaining hosts for this loop 15500 1727096230.41479: done getting the remaining hosts for this loop 15500 1727096230.41482: getting the next task for host managed_node1 15500 1727096230.41485: done getting next task for host managed_node1 15500 1727096230.41487: ^ task is: TASK: Gathering Facts 15500 1727096230.41489: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096230.41491: getting variables 15500 1727096230.41492: in VariableManager get_vars() 15500 1727096230.41505: Calling all_inventory to load vars for managed_node1 15500 1727096230.41508: Calling groups_inventory to load vars for managed_node1 15500 1727096230.41510: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096230.41516: Calling all_plugins_play to load vars for managed_node1 15500 1727096230.41518: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096230.41521: Calling groups_plugins_play to load vars for managed_node1 15500 1727096230.44040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096230.47215: done with get_vars() 15500 1727096230.47241: done getting variables 15500 1727096230.47292: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Monday 23 September 2024 08:57:10 -0400 (0:00:00.700) 0:00:30.516 ****** 15500 1727096230.47318: entering _queue_task() for managed_node1/gather_facts 15500 1727096230.48302: worker is 1 (out of 1 available) 15500 1727096230.48315: exiting _queue_task() for managed_node1/gather_facts 15500 1727096230.48328: done queuing things up, now waiting for results queue to drain 15500 1727096230.48329: waiting for pending results... 
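The Gathering Facts task queued here runs the same ansible.legacy.setup module whose invocation was recorded earlier in this log (gather_subset: all, gather_timeout: 10, fact_path: /etc/ansible/facts.d). An explicit task equivalent to that implicit fact gathering would look roughly like the following sketch; it is not taken from remove_profile.yml itself:

    # Hypothetical explicit equivalent of the implicit fact gathering seen in this trace
    - name: Gathering Facts
      ansible.builtin.setup:
        gather_subset:
          - all
        gather_timeout: 10
        fact_path: /etc/ansible/facts.d
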
15500 1727096230.49242: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15500 1727096230.49326: in run() - task 0afff68d-5257-877d-2da0-0000000003a1 15500 1727096230.49475: variable 'ansible_search_path' from source: unknown 15500 1727096230.49525: calling self._execute() 15500 1727096230.49876: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096230.49882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096230.49885: variable 'omit' from source: magic vars 15500 1727096230.51013: variable 'ansible_distribution_major_version' from source: facts 15500 1727096230.51173: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096230.51179: variable 'omit' from source: magic vars 15500 1727096230.51182: variable 'omit' from source: magic vars 15500 1727096230.51323: variable 'omit' from source: magic vars 15500 1727096230.51625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096230.51633: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096230.51635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096230.51754: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096230.51767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096230.51871: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096230.51875: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096230.51878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096230.52211: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096230.52216: Set connection var ansible_pipelining to False 15500 1727096230.52221: Set connection var ansible_timeout to 10 15500 1727096230.52224: Set connection var ansible_shell_type to sh 15500 1727096230.52229: Set connection var ansible_shell_executable to /bin/sh 15500 1727096230.52235: Set connection var ansible_connection to ssh 15500 1727096230.52255: variable 'ansible_shell_executable' from source: unknown 15500 1727096230.52261: variable 'ansible_connection' from source: unknown 15500 1727096230.52263: variable 'ansible_module_compression' from source: unknown 15500 1727096230.52266: variable 'ansible_shell_type' from source: unknown 15500 1727096230.52270: variable 'ansible_shell_executable' from source: unknown 15500 1727096230.52272: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096230.52683: variable 'ansible_pipelining' from source: unknown 15500 1727096230.52686: variable 'ansible_timeout' from source: unknown 15500 1727096230.52689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096230.53123: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096230.53135: variable 'omit' from source: magic vars 15500 1727096230.53140: starting attempt loop 15500 1727096230.53143: running the 
handler 15500 1727096230.53163: variable 'ansible_facts' from source: unknown 15500 1727096230.53185: _low_level_execute_command(): starting 15500 1727096230.53193: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096230.55076: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096230.55082: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096230.55187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096230.55381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096230.57145: stdout chunk (state=3): >>>/root <<< 15500 1727096230.57489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096230.57524: stderr chunk (state=3): >>><<< 15500 1727096230.57601: stdout chunk (state=3): >>><<< 15500 1727096230.57635: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096230.57694: _low_level_execute_command(): starting 15500 1727096230.57704: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096230.5768054-16752-20418769652228 `" && echo ansible-tmp-1727096230.5768054-16752-20418769652228="` echo /root/.ansible/tmp/ansible-tmp-1727096230.5768054-16752-20418769652228 `" ) && sleep 0' 
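Annotation on the two commands above and the chunks that follow: this is the remote sequence visible in this part of the log for running the setup module over ssh: resolve the remote home directory with "echo ~", create a private temp directory under umask 77, upload AnsiballZ_setup.py over the sftp subsystem, mark it executable with chmod u+x, run it with /usr/bin/python3.12, and finally remove the temp directory. The Python sketch below reproduces that sequence over a plain ssh binary purely to illustrate the mechanics; it is not ansible-core's implementation, and the host alias, the local payload path, and the scp stand-in for the sftp put are placeholders.

# Sketch (not part of the captured run): replay the remote command sequence
# seen in the log with a plain ssh/scp pair. NOT ansible-core's code;
# "managed_node1" and the payload path are placeholders.
import shlex
import subprocess
import time

HOST = "managed_node1"                # placeholder ssh alias
PAYLOAD = "/tmp/AnsiballZ_setup.py"   # placeholder local module payload

def ssh(command: str) -> str:
    """Run one remote shell command via /bin/sh -c, as the log lines show."""
    return subprocess.run(
        ["ssh", HOST, "/bin/sh -c " + shlex.quote(command)],
        check=True, capture_output=True, text=True,
    ).stdout

# 1. Resolve the remote home directory ("echo ~" in the log).
home = ssh("echo ~ && sleep 0").strip()

# 2. Create a unique, private remote temp directory (umask 77 in the log;
#    the real name also carries a random suffix, omitted here).
tmpdir = f"{home}/.ansible/tmp/ansible-tmp-{time.time()}"
ssh(f'umask 77 && mkdir -p "{home}/.ansible/tmp" && mkdir "{tmpdir}" && sleep 0')

# 3. Transfer the module payload (the log uses an sftp put; scp is a stand-in).
subprocess.run(["scp", PAYLOAD, f"{HOST}:{tmpdir}/AnsiballZ_setup.py"], check=True)

# 4. Make the directory and payload executable, then run it with the remote Python.
ssh(f"chmod u+x {tmpdir}/ {tmpdir}/AnsiballZ_setup.py && sleep 0")
facts_json = ssh(f"/usr/bin/python3.12 {tmpdir}/AnsiballZ_setup.py && sleep 0")

# 5. Clean up the remote temp directory, mirroring the final "rm -f -r".
ssh(f"rm -f -r {tmpdir}/ > /dev/null 2>&1 && sleep 0")

print(facts_json[:200], "...")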
15500 1727096230.58521: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096230.58556: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 15500 1727096230.58610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096230.58709: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096230.58712: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096230.58732: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096230.58822: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096230.60997: stdout chunk (state=3): >>>ansible-tmp-1727096230.5768054-16752-20418769652228=/root/.ansible/tmp/ansible-tmp-1727096230.5768054-16752-20418769652228 <<< 15500 1727096230.61001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096230.61018: stderr chunk (state=3): >>><<< 15500 1727096230.61032: stdout chunk (state=3): >>><<< 15500 1727096230.61091: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096230.5768054-16752-20418769652228=/root/.ansible/tmp/ansible-tmp-1727096230.5768054-16752-20418769652228 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096230.61212: variable 'ansible_module_compression' from source: unknown 15500 1727096230.61295: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15500 1727096230.61425: variable 
'ansible_facts' from source: unknown 15500 1727096230.62004: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096230.5768054-16752-20418769652228/AnsiballZ_setup.py 15500 1727096230.62231: Sending initial data 15500 1727096230.62243: Sent initial data (153 bytes) 15500 1727096230.62900: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096230.62995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096230.63030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096230.63047: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096230.63064: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096230.63184: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096230.65182: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096230.65678: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15500 1727096230.65717: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp9suvb6xm /root/.ansible/tmp/ansible-tmp-1727096230.5768054-16752-20418769652228/AnsiballZ_setup.py <<< 15500 1727096230.65720: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096230.5768054-16752-20418769652228/AnsiballZ_setup.py" <<< 15500 1727096230.65776: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp9suvb6xm" to remote "/root/.ansible/tmp/ansible-tmp-1727096230.5768054-16752-20418769652228/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096230.5768054-16752-20418769652228/AnsiballZ_setup.py" <<< 15500 1727096230.68623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096230.68710: stderr chunk (state=3): >>><<< 15500 1727096230.68724: stdout chunk (state=3): >>><<< 15500 1727096230.68750: done transferring module to remote 15500 1727096230.68770: _low_level_execute_command(): starting 15500 1727096230.68788: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096230.5768054-16752-20418769652228/ /root/.ansible/tmp/ansible-tmp-1727096230.5768054-16752-20418769652228/AnsiballZ_setup.py && sleep 0' 15500 1727096230.69810: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096230.69814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096230.69980: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096230.70029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096230.70046: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096230.70090: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096230.70259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096230.72373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096230.72380: stdout chunk (state=3): >>><<< 15500 1727096230.72382: stderr chunk (state=3): >>><<< 15500 1727096230.72384: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096230.72387: _low_level_execute_command(): starting 15500 1727096230.72389: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096230.5768054-16752-20418769652228/AnsiballZ_setup.py && sleep 0' 15500 1727096230.73238: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096230.73242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096230.73244: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096230.73246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096230.73320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096230.73334: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096230.73449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096231.39064: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "57", "second": "11", "epoch": "1727096231", "epoch_int": "1727096231", "date": "2024-09-23", "time": "08:57:11", "iso8601_micro": "2024-09-23T12:57:11.007464Z", "iso8601": "2024-09-23T12:57:11Z", "iso8601_basic": "20240923T085711007464", "iso8601_basic_short": "20240923T085711", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": 
"ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, 
"ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2956, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 575, "free": 2956}, "nocache": {"free": 3293, "used": 238}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 384, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797625856, "block_size": 4096, "block_total": 65519099, "block_available": 63915436, "block_used": 1603663, "inode_total": 131070960, "inode_available": 131029099, "inode_used": 41861, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_loadavg": {"1m": 0.4580078125, "5m": 0.330078125, "15m": 
0.1552734375}, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", 
"tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15500 1727096231.41145: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 15500 1727096231.41148: stdout chunk (state=3): >>><<< 15500 1727096231.41150: stderr chunk (state=3): >>><<< 15500 1727096231.41174: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "57", "second": "11", "epoch": "1727096231", "epoch_int": "1727096231", "date": "2024-09-23", "time": "08:57:11", "iso8601_micro": "2024-09-23T12:57:11.007464Z", "iso8601": "2024-09-23T12:57:11Z", "iso8601_basic": "20240923T085711007464", "iso8601_basic_short": "20240923T085711", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": 
"", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2956, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 575, "free": 2956}, "nocache": {"free": 3293, "used": 238}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 384, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797625856, "block_size": 4096, "block_total": 65519099, "block_available": 63915436, "block_used": 1603663, "inode_total": 131070960, "inode_available": 131029099, "inode_used": 41861, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, 
"ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_loadavg": {"1m": 0.4580078125, "5m": 0.330078125, "15m": 0.1552734375}, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": 
"off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": 
{"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 15500 1727096231.41905: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096230.5768054-16752-20418769652228/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096231.42008: _low_level_execute_command(): starting 15500 1727096231.42011: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096230.5768054-16752-20418769652228/ > /dev/null 2>&1 && sleep 0' 15500 1727096231.43325: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096231.43370: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096231.43391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096231.43412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096231.43634: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096231.43647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096231.43973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096231.45914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096231.46175: stderr chunk (state=3): >>><<< 15500 1727096231.46178: stdout chunk (state=3): >>><<< 15500 1727096231.46181: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096231.46184: handler run complete 15500 1727096231.46421: variable 'ansible_facts' from source: unknown 15500 1727096231.46648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096231.47271: variable 'ansible_facts' from source: unknown 15500 1727096231.47437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096231.47848: attempt loop complete, returning result 15500 1727096231.47851: _execute() done 15500 1727096231.47854: dumping result to json 15500 1727096231.47856: done dumping result, returning 15500 1727096231.47860: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-877d-2da0-0000000003a1] 15500 1727096231.47863: sending task result for task 0afff68d-5257-877d-2da0-0000000003a1 15500 1727096231.49173: done sending task result for task 0afff68d-5257-877d-2da0-0000000003a1 15500 1727096231.49178: WORKER PROCESS EXITING ok: [managed_node1] 15500 1727096231.49636: no more pending results, returning what we have 15500 1727096231.49639: results queue empty 15500 1727096231.49640: checking for any_errors_fatal 15500 1727096231.49642: done checking for any_errors_fatal 15500 1727096231.49642: checking for max_fail_percentage 15500 1727096231.49644: done checking for max_fail_percentage 15500 1727096231.49645: checking to see if all hosts have failed and the running result is not ok 15500 1727096231.49646: done checking to see if all hosts have failed 15500 1727096231.49646: getting the remaining hosts for this loop 15500 1727096231.49648: done getting the remaining hosts for this loop 15500 1727096231.49652: getting the next task for host managed_node1 15500 1727096231.49657: 
done getting next task for host managed_node1 15500 1727096231.49660: ^ task is: TASK: meta (flush_handlers) 15500 1727096231.49662: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096231.49666: getting variables 15500 1727096231.49669: in VariableManager get_vars() 15500 1727096231.49699: Calling all_inventory to load vars for managed_node1 15500 1727096231.49701: Calling groups_inventory to load vars for managed_node1 15500 1727096231.49704: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096231.49713: Calling all_plugins_play to load vars for managed_node1 15500 1727096231.49716: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096231.49719: Calling groups_plugins_play to load vars for managed_node1 15500 1727096231.52617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096231.55007: done with get_vars() 15500 1727096231.55042: done getting variables 15500 1727096231.55125: in VariableManager get_vars() 15500 1727096231.55140: Calling all_inventory to load vars for managed_node1 15500 1727096231.55143: Calling groups_inventory to load vars for managed_node1 15500 1727096231.55145: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096231.55150: Calling all_plugins_play to load vars for managed_node1 15500 1727096231.55152: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096231.55155: Calling groups_plugins_play to load vars for managed_node1 15500 1727096231.56566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096231.58273: done with get_vars() 15500 1727096231.58304: done queuing things up, now waiting for results queue to drain 15500 1727096231.58307: results queue empty 15500 1727096231.58307: checking for any_errors_fatal 15500 1727096231.58311: done checking for any_errors_fatal 15500 1727096231.58312: checking for max_fail_percentage 15500 1727096231.58313: done checking for max_fail_percentage 15500 1727096231.58318: checking to see if all hosts have failed and the running result is not ok 15500 1727096231.58319: done checking to see if all hosts have failed 15500 1727096231.58319: getting the remaining hosts for this loop 15500 1727096231.58320: done getting the remaining hosts for this loop 15500 1727096231.58323: getting the next task for host managed_node1 15500 1727096231.58327: done getting next task for host managed_node1 15500 1727096231.58330: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15500 1727096231.58332: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096231.58341: getting variables 15500 1727096231.58343: in VariableManager get_vars() 15500 1727096231.58364: Calling all_inventory to load vars for managed_node1 15500 1727096231.58366: Calling groups_inventory to load vars for managed_node1 15500 1727096231.58370: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096231.58376: Calling all_plugins_play to load vars for managed_node1 15500 1727096231.58378: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096231.58381: Calling groups_plugins_play to load vars for managed_node1 15500 1727096231.59827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096231.62896: done with get_vars() 15500 1727096231.62921: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 08:57:11 -0400 (0:00:01.156) 0:00:31.673 ****** 15500 1727096231.63002: entering _queue_task() for managed_node1/include_tasks 15500 1727096231.63506: worker is 1 (out of 1 available) 15500 1727096231.63519: exiting _queue_task() for managed_node1/include_tasks 15500 1727096231.63530: done queuing things up, now waiting for results queue to drain 15500 1727096231.63531: waiting for pending results... 15500 1727096231.63765: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15500 1727096231.63896: in run() - task 0afff68d-5257-877d-2da0-00000000005a 15500 1727096231.63917: variable 'ansible_search_path' from source: unknown 15500 1727096231.63924: variable 'ansible_search_path' from source: unknown 15500 1727096231.63966: calling self._execute() 15500 1727096231.64177: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096231.64189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096231.64208: variable 'omit' from source: magic vars 15500 1727096231.64708: variable 'ansible_distribution_major_version' from source: facts 15500 1727096231.64725: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096231.64738: _execute() done 15500 1727096231.64747: dumping result to json 15500 1727096231.65021: done dumping result, returning 15500 1727096231.65025: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-877d-2da0-00000000005a] 15500 1727096231.65028: sending task result for task 0afff68d-5257-877d-2da0-00000000005a 15500 1727096231.65164: no more pending results, returning what we have 15500 1727096231.65172: in VariableManager get_vars() 15500 1727096231.65219: Calling all_inventory to load vars for managed_node1 15500 1727096231.65222: Calling groups_inventory to load vars for managed_node1 15500 1727096231.65225: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096231.65237: Calling all_plugins_play to load vars for managed_node1 15500 1727096231.65241: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096231.65243: Calling groups_plugins_play to load vars for managed_node1 15500 1727096231.67275: done sending task result for task 0afff68d-5257-877d-2da0-00000000005a 15500 1727096231.67280: WORKER PROCESS EXITING 15500 1727096231.68811: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096231.73574: done with get_vars() 15500 1727096231.73605: variable 'ansible_search_path' from source: unknown 15500 1727096231.73607: variable 'ansible_search_path' from source: unknown 15500 1727096231.73637: we have included files to process 15500 1727096231.73638: generating all_blocks data 15500 1727096231.73639: done generating all_blocks data 15500 1727096231.73640: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15500 1727096231.73641: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15500 1727096231.73644: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15500 1727096231.75529: done processing included file 15500 1727096231.75532: iterating over new_blocks loaded from include file 15500 1727096231.75533: in VariableManager get_vars() 15500 1727096231.75557: done with get_vars() 15500 1727096231.75559: filtering new block on tags 15500 1727096231.75880: done filtering new block on tags 15500 1727096231.75884: in VariableManager get_vars() 15500 1727096231.75906: done with get_vars() 15500 1727096231.75908: filtering new block on tags 15500 1727096231.75927: done filtering new block on tags 15500 1727096231.75929: in VariableManager get_vars() 15500 1727096231.75949: done with get_vars() 15500 1727096231.75950: filtering new block on tags 15500 1727096231.75969: done filtering new block on tags 15500 1727096231.75971: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 15500 1727096231.75977: extending task lists for all hosts with included blocks 15500 1727096231.77163: done extending task lists 15500 1727096231.77164: done processing included files 15500 1727096231.77165: results queue empty 15500 1727096231.77166: checking for any_errors_fatal 15500 1727096231.77169: done checking for any_errors_fatal 15500 1727096231.77170: checking for max_fail_percentage 15500 1727096231.77171: done checking for max_fail_percentage 15500 1727096231.77171: checking to see if all hosts have failed and the running result is not ok 15500 1727096231.77172: done checking to see if all hosts have failed 15500 1727096231.77173: getting the remaining hosts for this loop 15500 1727096231.77174: done getting the remaining hosts for this loop 15500 1727096231.77177: getting the next task for host managed_node1 15500 1727096231.77181: done getting next task for host managed_node1 15500 1727096231.77184: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15500 1727096231.77186: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096231.77195: getting variables 15500 1727096231.77196: in VariableManager get_vars() 15500 1727096231.77211: Calling all_inventory to load vars for managed_node1 15500 1727096231.77214: Calling groups_inventory to load vars for managed_node1 15500 1727096231.77216: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096231.77222: Calling all_plugins_play to load vars for managed_node1 15500 1727096231.77224: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096231.77227: Calling groups_plugins_play to load vars for managed_node1 15500 1727096231.81423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096231.85087: done with get_vars() 15500 1727096231.85113: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 08:57:11 -0400 (0:00:00.221) 0:00:31.894 ****** 15500 1727096231.85184: entering _queue_task() for managed_node1/setup 15500 1727096231.86402: worker is 1 (out of 1 available) 15500 1727096231.86411: exiting _queue_task() for managed_node1/setup 15500 1727096231.86421: done queuing things up, now waiting for results queue to drain 15500 1727096231.86422: waiting for pending results... 15500 1727096231.86545: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15500 1727096231.87074: in run() - task 0afff68d-5257-877d-2da0-0000000003e2 15500 1727096231.87078: variable 'ansible_search_path' from source: unknown 15500 1727096231.87081: variable 'ansible_search_path' from source: unknown 15500 1727096231.87084: calling self._execute() 15500 1727096231.87087: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096231.87090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096231.87093: variable 'omit' from source: magic vars 15500 1727096231.87807: variable 'ansible_distribution_major_version' from source: facts 15500 1727096231.88073: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096231.88202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096231.92876: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096231.92880: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096231.92884: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096231.92886: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096231.92889: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096231.93143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096231.93183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 15500 1727096231.93242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096231.93487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096231.93517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096231.93579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096231.93611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096231.93650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096231.93817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096231.93849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096231.94125: variable '__network_required_facts' from source: role '' defaults 15500 1727096231.94472: variable 'ansible_facts' from source: unknown 15500 1727096231.95755: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15500 1727096231.95770: when evaluation is False, skipping this task 15500 1727096231.95778: _execute() done 15500 1727096231.95785: dumping result to json 15500 1727096231.95792: done dumping result, returning 15500 1727096231.95806: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-877d-2da0-0000000003e2] 15500 1727096231.95819: sending task result for task 0afff68d-5257-877d-2da0-0000000003e2 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15500 1727096231.95969: no more pending results, returning what we have 15500 1727096231.95976: results queue empty 15500 1727096231.95977: checking for any_errors_fatal 15500 1727096231.95979: done checking for any_errors_fatal 15500 1727096231.95980: checking for max_fail_percentage 15500 1727096231.95982: done checking for max_fail_percentage 15500 1727096231.95983: checking to see if all hosts have failed and the running result is not ok 15500 1727096231.95984: done checking to see if all hosts have failed 15500 1727096231.95985: getting the remaining hosts for this loop 15500 1727096231.95986: done getting the remaining hosts for this loop 15500 1727096231.95990: getting the next task for host managed_node1 15500 1727096231.95999: done getting next task for host 
managed_node1 15500 1727096231.96004: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15500 1727096231.96007: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096231.96020: getting variables 15500 1727096231.96023: in VariableManager get_vars() 15500 1727096231.96068: Calling all_inventory to load vars for managed_node1 15500 1727096231.96072: Calling groups_inventory to load vars for managed_node1 15500 1727096231.96075: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096231.96086: Calling all_plugins_play to load vars for managed_node1 15500 1727096231.96090: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096231.96094: Calling groups_plugins_play to load vars for managed_node1 15500 1727096231.97074: done sending task result for task 0afff68d-5257-877d-2da0-0000000003e2 15500 1727096231.97077: WORKER PROCESS EXITING 15500 1727096231.99228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096232.02431: done with get_vars() 15500 1727096232.02465: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 08:57:12 -0400 (0:00:00.173) 0:00:32.068 ****** 15500 1727096232.02769: entering _queue_task() for managed_node1/stat 15500 1727096232.03535: worker is 1 (out of 1 available) 15500 1727096232.03549: exiting _queue_task() for managed_node1/stat 15500 1727096232.03565: done queuing things up, now waiting for results queue to drain 15500 1727096232.03567: waiting for pending results... 
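
The skip recorded above for set_facts.yml:3 comes from the guard the log evaluates: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0. A minimal sketch of a setup task gated that way is shown below for orientation only; it is not the literal contents of set_facts.yml, and the gather_subset value is an assumption.

# Illustrative sketch, not the role's actual source; gather_subset is assumed.
- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

Because every key listed in __network_required_facts is already present in ansible_facts on managed_node1, the difference is empty and the conditional resolves to False, so no extra fact gathering is run.
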
15500 1727096232.04049: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 15500 1727096232.04484: in run() - task 0afff68d-5257-877d-2da0-0000000003e4 15500 1727096232.04488: variable 'ansible_search_path' from source: unknown 15500 1727096232.04491: variable 'ansible_search_path' from source: unknown 15500 1727096232.04493: calling self._execute() 15500 1727096232.04650: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096232.04664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096232.04683: variable 'omit' from source: magic vars 15500 1727096232.05408: variable 'ansible_distribution_major_version' from source: facts 15500 1727096232.05423: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096232.05779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096232.06319: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096232.06417: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096232.06663: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096232.06667: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096232.06797: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096232.06825: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096232.06911: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096232.06941: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096232.07172: variable '__network_is_ostree' from source: set_fact 15500 1727096232.07176: Evaluated conditional (not __network_is_ostree is defined): False 15500 1727096232.07180: when evaluation is False, skipping this task 15500 1727096232.07213: _execute() done 15500 1727096232.07220: dumping result to json 15500 1727096232.07228: done dumping result, returning 15500 1727096232.07239: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-877d-2da0-0000000003e4] 15500 1727096232.07324: sending task result for task 0afff68d-5257-877d-2da0-0000000003e4 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15500 1727096232.07478: no more pending results, returning what we have 15500 1727096232.07482: results queue empty 15500 1727096232.07483: checking for any_errors_fatal 15500 1727096232.07490: done checking for any_errors_fatal 15500 1727096232.07491: checking for max_fail_percentage 15500 1727096232.07493: done checking for max_fail_percentage 15500 1727096232.07494: checking to see if all hosts have 
failed and the running result is not ok 15500 1727096232.07495: done checking to see if all hosts have failed 15500 1727096232.07496: getting the remaining hosts for this loop 15500 1727096232.07497: done getting the remaining hosts for this loop 15500 1727096232.07501: getting the next task for host managed_node1 15500 1727096232.07507: done getting next task for host managed_node1 15500 1727096232.07512: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15500 1727096232.07515: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096232.07530: getting variables 15500 1727096232.07532: in VariableManager get_vars() 15500 1727096232.07578: Calling all_inventory to load vars for managed_node1 15500 1727096232.07581: Calling groups_inventory to load vars for managed_node1 15500 1727096232.07584: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096232.07594: Calling all_plugins_play to load vars for managed_node1 15500 1727096232.07598: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096232.07601: Calling groups_plugins_play to load vars for managed_node1 15500 1727096232.08516: done sending task result for task 0afff68d-5257-877d-2da0-0000000003e4 15500 1727096232.08519: WORKER PROCESS EXITING 15500 1727096232.11196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096232.15827: done with get_vars() 15500 1727096232.15868: done getting variables 15500 1727096232.15931: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 08:57:12 -0400 (0:00:00.134) 0:00:32.202 ****** 15500 1727096232.16211: entering _queue_task() for managed_node1/set_fact 15500 1727096232.17102: worker is 1 (out of 1 available) 15500 1727096232.17112: exiting _queue_task() for managed_node1/set_fact 15500 1727096232.17122: done queuing things up, now waiting for results queue to drain 15500 1727096232.17124: waiting for pending results... 
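
The ostree check at set_facts.yml:12 was just skipped because __network_is_ostree is already defined (the log shows it coming "from source: set_fact"), and the set_fact task queued above at set_facts.yml:17 carries the same guard. A hedged sketch of that stat-then-set_fact pattern follows; the marker path and register name are assumptions for illustration, not the role's source.

# Illustrative only -- the path and register name are assumptions.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted          # assumed ostree marker path
  register: __ostree_stat             # hypothetical register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_stat.stat.exists }}"
  when: not __network_is_ostree is defined

With __network_is_ostree already set, both conditionals evaluate to False and both tasks are skipped, as the result blocks around this point show.
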
15500 1727096232.17464: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15500 1727096232.17713: in run() - task 0afff68d-5257-877d-2da0-0000000003e5 15500 1727096232.17799: variable 'ansible_search_path' from source: unknown 15500 1727096232.17807: variable 'ansible_search_path' from source: unknown 15500 1727096232.17849: calling self._execute() 15500 1727096232.18095: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096232.18112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096232.18325: variable 'omit' from source: magic vars 15500 1727096232.18944: variable 'ansible_distribution_major_version' from source: facts 15500 1727096232.19019: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096232.19573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096232.20122: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096232.20226: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096232.20264: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096232.20434: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096232.20603: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096232.20639: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096232.20674: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096232.20752: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096232.21183: variable '__network_is_ostree' from source: set_fact 15500 1727096232.21187: Evaluated conditional (not __network_is_ostree is defined): False 15500 1727096232.21189: when evaluation is False, skipping this task 15500 1727096232.21191: _execute() done 15500 1727096232.21194: dumping result to json 15500 1727096232.21201: done dumping result, returning 15500 1727096232.21204: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-877d-2da0-0000000003e5] 15500 1727096232.21206: sending task result for task 0afff68d-5257-877d-2da0-0000000003e5 15500 1727096232.21283: done sending task result for task 0afff68d-5257-877d-2da0-0000000003e5 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15500 1727096232.21348: no more pending results, returning what we have 15500 1727096232.21352: results queue empty 15500 1727096232.21353: checking for any_errors_fatal 15500 1727096232.21362: done checking for any_errors_fatal 15500 1727096232.21363: checking for max_fail_percentage 15500 
1727096232.21365: done checking for max_fail_percentage 15500 1727096232.21366: checking to see if all hosts have failed and the running result is not ok 15500 1727096232.21369: done checking to see if all hosts have failed 15500 1727096232.21370: getting the remaining hosts for this loop 15500 1727096232.21371: done getting the remaining hosts for this loop 15500 1727096232.21375: getting the next task for host managed_node1 15500 1727096232.21385: done getting next task for host managed_node1 15500 1727096232.21389: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15500 1727096232.21392: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096232.21409: getting variables 15500 1727096232.21412: in VariableManager get_vars() 15500 1727096232.21456: Calling all_inventory to load vars for managed_node1 15500 1727096232.21461: Calling groups_inventory to load vars for managed_node1 15500 1727096232.21464: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096232.21679: Calling all_plugins_play to load vars for managed_node1 15500 1727096232.21683: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096232.21686: Calling groups_plugins_play to load vars for managed_node1 15500 1727096232.22344: WORKER PROCESS EXITING 15500 1727096232.24506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096232.27872: done with get_vars() 15500 1727096232.28023: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 08:57:12 -0400 (0:00:00.122) 0:00:32.325 ****** 15500 1727096232.28241: entering _queue_task() for managed_node1/service_facts 15500 1727096232.29057: worker is 1 (out of 1 available) 15500 1727096232.29072: exiting _queue_task() for managed_node1/service_facts 15500 1727096232.29127: done queuing things up, now waiting for results queue to drain 15500 1727096232.29129: waiting for pending results... 
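
The task queued here runs the service_facts module (the log loads ansible.modules.service_facts); the JSON returned further down populates ansible_facts.services with one entry per unit, each carrying name, state, status, and source. A minimal sketch of how that structure is usually declared and then read follows; the debug task is an assumption added for illustration and is not part of this run.

# Illustrative sketch; the debug task is assumed, not part of this playbook.
- name: Check which services are running
  service_facts:

- name: Report NetworkManager state              # hypothetical follow-up
  debug:
    msg: "NetworkManager is {{ ansible_facts.services['NetworkManager.service'].state }}"

In the output captured below, NetworkManager.service reports state "running" and status "enabled", which is the kind of lookup the role's later conditionals rely on.
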
15500 1727096232.29850: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 15500 1727096232.30210: in run() - task 0afff68d-5257-877d-2da0-0000000003e7 15500 1727096232.30674: variable 'ansible_search_path' from source: unknown 15500 1727096232.30678: variable 'ansible_search_path' from source: unknown 15500 1727096232.30681: calling self._execute() 15500 1727096232.30684: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096232.30687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096232.30689: variable 'omit' from source: magic vars 15500 1727096232.31873: variable 'ansible_distribution_major_version' from source: facts 15500 1727096232.32098: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096232.32112: variable 'omit' from source: magic vars 15500 1727096232.32178: variable 'omit' from source: magic vars 15500 1727096232.32406: variable 'omit' from source: magic vars 15500 1727096232.32446: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096232.32873: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096232.32876: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096232.32879: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096232.32882: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096232.32884: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096232.32886: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096232.32888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096232.33472: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096232.33475: Set connection var ansible_pipelining to False 15500 1727096232.33478: Set connection var ansible_timeout to 10 15500 1727096232.33481: Set connection var ansible_shell_type to sh 15500 1727096232.33483: Set connection var ansible_shell_executable to /bin/sh 15500 1727096232.33485: Set connection var ansible_connection to ssh 15500 1727096232.33487: variable 'ansible_shell_executable' from source: unknown 15500 1727096232.33489: variable 'ansible_connection' from source: unknown 15500 1727096232.33491: variable 'ansible_module_compression' from source: unknown 15500 1727096232.33493: variable 'ansible_shell_type' from source: unknown 15500 1727096232.33495: variable 'ansible_shell_executable' from source: unknown 15500 1727096232.33497: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096232.33498: variable 'ansible_pipelining' from source: unknown 15500 1727096232.33500: variable 'ansible_timeout' from source: unknown 15500 1727096232.33502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096232.33952: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15500 1727096232.33974: variable 'omit' from source: magic vars 15500 
1727096232.34180: starting attempt loop 15500 1727096232.34190: running the handler 15500 1727096232.34202: _low_level_execute_command(): starting 15500 1727096232.34213: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096232.35535: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096232.35878: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096232.35897: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096232.36001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096232.37725: stdout chunk (state=3): >>>/root <<< 15500 1727096232.37883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096232.38025: stderr chunk (state=3): >>><<< 15500 1727096232.38036: stdout chunk (state=3): >>><<< 15500 1727096232.38070: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096232.38374: _low_level_execute_command(): starting 15500 1727096232.38385: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096232.3827536-16862-234632883669033 `" && echo ansible-tmp-1727096232.3827536-16862-234632883669033="` echo /root/.ansible/tmp/ansible-tmp-1727096232.3827536-16862-234632883669033 `" ) && 
sleep 0' 15500 1727096232.39310: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096232.39576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096232.39784: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096232.39985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096232.42030: stdout chunk (state=3): >>>ansible-tmp-1727096232.3827536-16862-234632883669033=/root/.ansible/tmp/ansible-tmp-1727096232.3827536-16862-234632883669033 <<< 15500 1727096232.42173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096232.42185: stdout chunk (state=3): >>><<< 15500 1727096232.42197: stderr chunk (state=3): >>><<< 15500 1727096232.42220: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096232.3827536-16862-234632883669033=/root/.ansible/tmp/ansible-tmp-1727096232.3827536-16862-234632883669033 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096232.42280: variable 'ansible_module_compression' from source: unknown 15500 1727096232.42418: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 15500 1727096232.42618: variable 'ansible_facts' from source: unknown 15500 1727096232.42715: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727096232.3827536-16862-234632883669033/AnsiballZ_service_facts.py 15500 1727096232.43194: Sending initial data 15500 1727096232.43197: Sent initial data (162 bytes) 15500 1727096232.44129: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096232.44181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096232.44437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096232.44462: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096232.44551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096232.46315: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096232.46480: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15500 1727096232.46549: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpr48lcyaz /root/.ansible/tmp/ansible-tmp-1727096232.3827536-16862-234632883669033/AnsiballZ_service_facts.py <<< 15500 1727096232.46563: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096232.3827536-16862-234632883669033/AnsiballZ_service_facts.py" <<< 15500 1727096232.46616: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpr48lcyaz" to remote "/root/.ansible/tmp/ansible-tmp-1727096232.3827536-16862-234632883669033/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096232.3827536-16862-234632883669033/AnsiballZ_service_facts.py" <<< 15500 1727096232.48878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096232.48882: stdout chunk (state=3): >>><<< 15500 1727096232.48885: stderr chunk (state=3): >>><<< 15500 1727096232.48887: done transferring module to remote 15500 1727096232.48889: _low_level_execute_command(): starting 15500 1727096232.48891: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096232.3827536-16862-234632883669033/ /root/.ansible/tmp/ansible-tmp-1727096232.3827536-16862-234632883669033/AnsiballZ_service_facts.py && sleep 0' 15500 1727096232.49948: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096232.50280: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096232.50298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096232.50565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096232.52491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096232.52512: stderr chunk (state=3): >>><<< 15500 1727096232.52523: stdout chunk (state=3): >>><<< 15500 1727096232.52548: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096232.52861: _low_level_execute_command(): starting 15500 1727096232.52865: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096232.3827536-16862-234632883669033/AnsiballZ_service_facts.py && sleep 0' 15500 1727096232.53800: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096232.53983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096232.54288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096232.54397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096234.15006: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 15500 1727096234.15091: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": 
"rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": 
"alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 15500 1727096234.15127: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": 
"systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": 
"systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15500 1727096234.16722: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 15500 1727096234.16727: stdout chunk (state=3): >>><<< 15500 1727096234.16729: stderr chunk (state=3): >>><<< 15500 1727096234.16756: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", 
"source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": 
"user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
15500 1727096234.28614: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096232.3827536-16862-234632883669033/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096234.28619: _low_level_execute_command(): starting 15500 1727096234.28720: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096232.3827536-16862-234632883669033/ > /dev/null 2>&1 && sleep 0' 15500 1727096234.30125: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096234.30129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096234.30135: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096234.30138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096234.30583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096234.30619: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096234.30782: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096234.30802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096234.30957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096234.33175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096234.33179: stdout chunk (state=3): >>><<< 15500 1727096234.33182: stderr chunk (state=3): >>><<< 15500 1727096234.33184: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096234.33187: handler run complete 15500 1727096234.33280: variable 'ansible_facts' from source: unknown 15500 1727096234.33439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096234.33981: variable 'ansible_facts' from source: unknown 15500 1727096234.34130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096234.34340: attempt loop complete, returning result 15500 1727096234.34351: _execute() done 15500 1727096234.34361: dumping result to json 15500 1727096234.34429: done dumping result, returning 15500 1727096234.34442: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-877d-2da0-0000000003e7] 15500 1727096234.34450: sending task result for task 0afff68d-5257-877d-2da0-0000000003e7 15500 1727096234.42399: done sending task result for task 0afff68d-5257-877d-2da0-0000000003e7 15500 1727096234.42403: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15500 1727096234.42500: no more pending results, returning what we have 15500 1727096234.42502: results queue empty 15500 1727096234.42503: checking for any_errors_fatal 15500 1727096234.42506: done checking for any_errors_fatal 15500 1727096234.42507: checking for max_fail_percentage 15500 1727096234.42508: done checking for max_fail_percentage 15500 1727096234.42509: checking to see if all hosts have failed and the running result is not ok 15500 1727096234.42510: done checking to see if all hosts have failed 15500 1727096234.42510: getting the remaining hosts for this loop 15500 1727096234.42511: done getting the remaining hosts for this loop 15500 1727096234.42514: getting the next task for host managed_node1 15500 1727096234.42518: done getting next task for host managed_node1 15500 1727096234.42520: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15500 1727096234.42523: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096234.42530: getting variables 15500 1727096234.42532: in VariableManager get_vars() 15500 1727096234.42553: Calling all_inventory to load vars for managed_node1 15500 1727096234.42555: Calling groups_inventory to load vars for managed_node1 15500 1727096234.42560: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096234.42567: Calling all_plugins_play to load vars for managed_node1 15500 1727096234.42572: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096234.42575: Calling groups_plugins_play to load vars for managed_node1 15500 1727096234.43779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096234.46566: done with get_vars() 15500 1727096234.46598: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 08:57:14 -0400 (0:00:02.184) 0:00:34.510 ****** 15500 1727096234.46696: entering _queue_task() for managed_node1/package_facts 15500 1727096234.47178: worker is 1 (out of 1 available) 15500 1727096234.47203: exiting _queue_task() for managed_node1/package_facts 15500 1727096234.47214: done queuing things up, now waiting for results queue to drain 15500 1727096234.47216: waiting for pending results... 15500 1727096234.47485: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 15500 1727096234.47583: in run() - task 0afff68d-5257-877d-2da0-0000000003e8 15500 1727096234.47599: variable 'ansible_search_path' from source: unknown 15500 1727096234.47603: variable 'ansible_search_path' from source: unknown 15500 1727096234.47639: calling self._execute() 15500 1727096234.47711: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096234.47719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096234.47745: variable 'omit' from source: magic vars 15500 1727096234.48056: variable 'ansible_distribution_major_version' from source: facts 15500 1727096234.48070: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096234.48076: variable 'omit' from source: magic vars 15500 1727096234.48112: variable 'omit' from source: magic vars 15500 1727096234.48137: variable 'omit' from source: magic vars 15500 1727096234.48175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096234.48243: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096234.48247: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096234.48260: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096234.48328: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096234.48331: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096234.48334: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096234.48336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096234.48413: Set connection var ansible_module_compression to ZIP_DEFLATED 
15500 1727096234.48417: Set connection var ansible_pipelining to False 15500 1727096234.48422: Set connection var ansible_timeout to 10 15500 1727096234.48432: Set connection var ansible_shell_type to sh 15500 1727096234.48435: Set connection var ansible_shell_executable to /bin/sh 15500 1727096234.48438: Set connection var ansible_connection to ssh 15500 1727096234.48455: variable 'ansible_shell_executable' from source: unknown 15500 1727096234.48458: variable 'ansible_connection' from source: unknown 15500 1727096234.48461: variable 'ansible_module_compression' from source: unknown 15500 1727096234.48465: variable 'ansible_shell_type' from source: unknown 15500 1727096234.48470: variable 'ansible_shell_executable' from source: unknown 15500 1727096234.48472: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096234.48476: variable 'ansible_pipelining' from source: unknown 15500 1727096234.48479: variable 'ansible_timeout' from source: unknown 15500 1727096234.48483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096234.48631: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15500 1727096234.48642: variable 'omit' from source: magic vars 15500 1727096234.48645: starting attempt loop 15500 1727096234.48648: running the handler 15500 1727096234.48665: _low_level_execute_command(): starting 15500 1727096234.48672: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096234.49386: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096234.49606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096234.49627: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096234.49730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096234.51465: stdout chunk (state=3): >>>/root <<< 15500 1727096234.51591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096234.51633: stderr chunk (state=3): >>><<< 15500 1727096234.51672: stdout chunk (state=3): >>><<< 15500 1727096234.51827: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096234.51834: _low_level_execute_command(): starting 15500 1727096234.51838: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096234.5170658-16939-104538520934436 `" && echo ansible-tmp-1727096234.5170658-16939-104538520934436="` echo /root/.ansible/tmp/ansible-tmp-1727096234.5170658-16939-104538520934436 `" ) && sleep 0' 15500 1727096234.52791: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096234.52797: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096234.52801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 15500 1727096234.52960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096234.53051: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096234.53086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096234.53345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096234.55370: stdout chunk (state=3): >>>ansible-tmp-1727096234.5170658-16939-104538520934436=/root/.ansible/tmp/ansible-tmp-1727096234.5170658-16939-104538520934436 <<< 15500 1727096234.55495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096234.55546: stderr chunk (state=3): >>><<< 15500 1727096234.55549: stdout chunk (state=3): >>><<< 15500 1727096234.55570: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096234.5170658-16939-104538520934436=/root/.ansible/tmp/ansible-tmp-1727096234.5170658-16939-104538520934436 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096234.55781: variable 'ansible_module_compression' from source: unknown 15500 1727096234.55785: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 15500 1727096234.55787: variable 'ansible_facts' from source: unknown 15500 1727096234.55962: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096234.5170658-16939-104538520934436/AnsiballZ_package_facts.py 15500 1727096234.56242: Sending initial data 15500 1727096234.56246: Sent initial data (162 bytes) 15500 1727096234.56959: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096234.57095: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096234.57252: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096234.57283: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096234.57455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096234.59143: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" 
revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096234.59207: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15500 1727096234.59279: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpw99hi5fu /root/.ansible/tmp/ansible-tmp-1727096234.5170658-16939-104538520934436/AnsiballZ_package_facts.py <<< 15500 1727096234.59284: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096234.5170658-16939-104538520934436/AnsiballZ_package_facts.py" <<< 15500 1727096234.59484: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpw99hi5fu" to remote "/root/.ansible/tmp/ansible-tmp-1727096234.5170658-16939-104538520934436/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096234.5170658-16939-104538520934436/AnsiballZ_package_facts.py" <<< 15500 1727096234.62897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096234.63055: stderr chunk (state=3): >>><<< 15500 1727096234.63059: stdout chunk (state=3): >>><<< 15500 1727096234.63061: done transferring module to remote 15500 1727096234.63063: _low_level_execute_command(): starting 15500 1727096234.63066: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096234.5170658-16939-104538520934436/ /root/.ansible/tmp/ansible-tmp-1727096234.5170658-16939-104538520934436/AnsiballZ_package_facts.py && sleep 0' 15500 1727096234.63601: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096234.63614: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096234.63629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096234.63648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096234.63672: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096234.63871: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096234.63896: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096234.64197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096234.66164: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096234.66177: stdout chunk (state=3): >>><<< 15500 1727096234.66180: stderr chunk (state=3): >>><<< 15500 1727096234.66195: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096234.66202: _low_level_execute_command(): starting 15500 1727096234.66212: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096234.5170658-16939-104538520934436/AnsiballZ_package_facts.py && sleep 0' 15500 1727096234.67406: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096234.67424: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096234.67450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096234.67472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096234.67489: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096234.67585: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096234.67787: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096234.67903: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096234.68029: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096234.68110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096235.13146: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": 
"nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": 
"2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": 
"libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": 
"127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", 
"release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": 
"3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": 
"510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": 
"2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": 
"python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growp<<< 15500 1727096235.13200: stdout chunk (state=3): >>>art": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15500 1727096235.14784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 15500 1727096235.14795: stdout chunk (state=3): >>><<< 15500 1727096235.14807: stderr chunk (state=3): >>><<< 15500 1727096235.14842: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 15500 1727096235.17175: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096234.5170658-16939-104538520934436/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096235.17180: _low_level_execute_command(): starting 15500 1727096235.17183: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096234.5170658-16939-104538520934436/ > /dev/null 2>&1 && sleep 0' 15500 1727096235.17886: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096235.17917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096235.17935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096235.17953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096235.17985: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096235.18090: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096235.18102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096235.18128: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096235.18237: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096235.20179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096235.20318: stderr chunk (state=3): >>><<< 15500 1727096235.20326: stdout chunk (state=3): >>><<< 15500 1727096235.20347: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096235.20358: handler run complete 15500 1727096235.21516: variable 'ansible_facts' from source: unknown 15500 1727096235.22466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096235.25672: variable 'ansible_facts' from source: unknown 15500 1727096235.26559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096235.28229: attempt loop complete, returning result 15500 1727096235.28375: _execute() done 15500 1727096235.28385: dumping result to json 15500 1727096235.28951: done dumping result, returning 15500 1727096235.29051: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-877d-2da0-0000000003e8] 15500 1727096235.29056: sending task result for task 0afff68d-5257-877d-2da0-0000000003e8 15500 1727096235.33494: done sending task result for task 0afff68d-5257-877d-2da0-0000000003e8 15500 1727096235.33499: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15500 1727096235.33610: no more pending results, returning what we have 15500 1727096235.33613: results queue empty 15500 1727096235.33614: checking for any_errors_fatal 15500 1727096235.33619: done checking for any_errors_fatal 15500 1727096235.33620: checking for max_fail_percentage 15500 1727096235.33627: done checking for max_fail_percentage 15500 1727096235.33628: checking to see if all hosts have failed and the running result is not ok 15500 1727096235.33629: done checking to see if all hosts have failed 15500 1727096235.33630: getting the remaining hosts for this loop 15500 1727096235.33631: done getting the remaining hosts for this loop 15500 1727096235.33635: getting the next task for host managed_node1 15500 1727096235.33641: done getting next task for host managed_node1 15500 1727096235.33645: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15500 1727096235.33647: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096235.33656: getting variables 15500 1727096235.33658: in VariableManager get_vars() 15500 1727096235.33695: Calling all_inventory to load vars for managed_node1 15500 1727096235.33698: Calling groups_inventory to load vars for managed_node1 15500 1727096235.33701: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096235.33710: Calling all_plugins_play to load vars for managed_node1 15500 1727096235.33713: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096235.33716: Calling groups_plugins_play to load vars for managed_node1 15500 1727096235.35941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096235.39122: done with get_vars() 15500 1727096235.39244: done getting variables 15500 1727096235.39311: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:57:15 -0400 (0:00:00.927) 0:00:35.437 ****** 15500 1727096235.39475: entering _queue_task() for managed_node1/debug 15500 1727096235.39951: worker is 1 (out of 1 available) 15500 1727096235.39966: exiting _queue_task() for managed_node1/debug 15500 1727096235.40099: done queuing things up, now waiting for results queue to drain 15500 1727096235.40100: waiting for pending results... 15500 1727096235.40321: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 15500 1727096235.40426: in run() - task 0afff68d-5257-877d-2da0-00000000005b 15500 1727096235.40525: variable 'ansible_search_path' from source: unknown 15500 1727096235.40529: variable 'ansible_search_path' from source: unknown 15500 1727096235.40532: calling self._execute() 15500 1727096235.40613: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096235.40631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096235.40652: variable 'omit' from source: magic vars 15500 1727096235.41075: variable 'ansible_distribution_major_version' from source: facts 15500 1727096235.41096: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096235.41108: variable 'omit' from source: magic vars 15500 1727096235.41150: variable 'omit' from source: magic vars 15500 1727096235.41270: variable 'network_provider' from source: set_fact 15500 1727096235.41299: variable 'omit' from source: magic vars 15500 1727096235.41374: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096235.41399: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096235.41432: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096235.41456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096235.41475: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 
1727096235.41520: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096235.41527: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096235.41532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096235.41643: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096235.41653: Set connection var ansible_pipelining to False 15500 1727096235.41704: Set connection var ansible_timeout to 10 15500 1727096235.41707: Set connection var ansible_shell_type to sh 15500 1727096235.41709: Set connection var ansible_shell_executable to /bin/sh 15500 1727096235.41715: Set connection var ansible_connection to ssh 15500 1727096235.41717: variable 'ansible_shell_executable' from source: unknown 15500 1727096235.41725: variable 'ansible_connection' from source: unknown 15500 1727096235.41730: variable 'ansible_module_compression' from source: unknown 15500 1727096235.41738: variable 'ansible_shell_type' from source: unknown 15500 1727096235.41744: variable 'ansible_shell_executable' from source: unknown 15500 1727096235.41748: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096235.41754: variable 'ansible_pipelining' from source: unknown 15500 1727096235.41762: variable 'ansible_timeout' from source: unknown 15500 1727096235.41813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096235.41932: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096235.41949: variable 'omit' from source: magic vars 15500 1727096235.41962: starting attempt loop 15500 1727096235.41971: running the handler 15500 1727096235.42018: handler run complete 15500 1727096235.42045: attempt loop complete, returning result 15500 1727096235.42051: _execute() done 15500 1727096235.42056: dumping result to json 15500 1727096235.42150: done dumping result, returning 15500 1727096235.42154: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-877d-2da0-00000000005b] 15500 1727096235.42157: sending task result for task 0afff68d-5257-877d-2da0-00000000005b 15500 1727096235.42231: done sending task result for task 0afff68d-5257-877d-2da0-00000000005b 15500 1727096235.42235: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 15500 1727096235.42304: no more pending results, returning what we have 15500 1727096235.42308: results queue empty 15500 1727096235.42309: checking for any_errors_fatal 15500 1727096235.42319: done checking for any_errors_fatal 15500 1727096235.42320: checking for max_fail_percentage 15500 1727096235.42322: done checking for max_fail_percentage 15500 1727096235.42323: checking to see if all hosts have failed and the running result is not ok 15500 1727096235.42324: done checking to see if all hosts have failed 15500 1727096235.42325: getting the remaining hosts for this loop 15500 1727096235.42326: done getting the remaining hosts for this loop 15500 1727096235.42330: getting the next task for host managed_node1 15500 1727096235.42337: done getting next task for host managed_node1 15500 1727096235.42342: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 15500 1727096235.42344: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096235.42354: getting variables 15500 1727096235.42356: in VariableManager get_vars() 15500 1727096235.42403: Calling all_inventory to load vars for managed_node1 15500 1727096235.42406: Calling groups_inventory to load vars for managed_node1 15500 1727096235.42408: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096235.42420: Calling all_plugins_play to load vars for managed_node1 15500 1727096235.42423: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096235.42427: Calling groups_plugins_play to load vars for managed_node1 15500 1727096235.45320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096235.47299: done with get_vars() 15500 1727096235.47340: done getting variables 15500 1727096235.47425: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 08:57:15 -0400 (0:00:00.080) 0:00:35.517 ****** 15500 1727096235.47460: entering _queue_task() for managed_node1/fail 15500 1727096235.47865: worker is 1 (out of 1 available) 15500 1727096235.47880: exiting _queue_task() for managed_node1/fail 15500 1727096235.47894: done queuing things up, now waiting for results queue to drain 15500 1727096235.47896: waiting for pending results... 
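
For readers following the trace: the large JSON blob earlier in this section is the raw stdout of the package_facts module run by the "Check which packages are installed" task, and the task result itself shows as "censored" because the role sets no_log: true. A minimal sketch of that gather-and-consume pattern is below; the gathering task mirrors the module arguments visible in the trace (manager auto), while the consumer task and the choice of NetworkManager as the package to inspect are illustrative assumptions, not part of the role.

- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto
  no_log: true

- name: Report an installed package version (illustrative consumer, not from the role)
  ansible.builtin.debug:
    msg: "NetworkManager {{ ansible_facts.packages['NetworkManager'][0].version }} is installed"
  when: "'NetworkManager' in ansible_facts.packages"

The facts land under ansible_facts.packages as a dict keyed by package name, each value being a list of dicts with name, version, release, epoch, arch and source, exactly as seen in the dump above.
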
15500 1727096235.48349: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15500 1727096235.48660: in run() - task 0afff68d-5257-877d-2da0-00000000005c 15500 1727096235.48665: variable 'ansible_search_path' from source: unknown 15500 1727096235.48670: variable 'ansible_search_path' from source: unknown 15500 1727096235.48875: calling self._execute() 15500 1727096235.48984: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096235.48998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096235.49013: variable 'omit' from source: magic vars 15500 1727096235.49808: variable 'ansible_distribution_major_version' from source: facts 15500 1727096235.49886: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096235.50050: variable 'network_state' from source: role '' defaults 15500 1727096235.50080: Evaluated conditional (network_state != {}): False 15500 1727096235.50088: when evaluation is False, skipping this task 15500 1727096235.50097: _execute() done 15500 1727096235.50105: dumping result to json 15500 1727096235.50114: done dumping result, returning 15500 1727096235.50128: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-877d-2da0-00000000005c] 15500 1727096235.50139: sending task result for task 0afff68d-5257-877d-2da0-00000000005c skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15500 1727096235.50305: no more pending results, returning what we have 15500 1727096235.50309: results queue empty 15500 1727096235.50310: checking for any_errors_fatal 15500 1727096235.50319: done checking for any_errors_fatal 15500 1727096235.50319: checking for max_fail_percentage 15500 1727096235.50321: done checking for max_fail_percentage 15500 1727096235.50322: checking to see if all hosts have failed and the running result is not ok 15500 1727096235.50323: done checking to see if all hosts have failed 15500 1727096235.50323: getting the remaining hosts for this loop 15500 1727096235.50325: done getting the remaining hosts for this loop 15500 1727096235.50329: getting the next task for host managed_node1 15500 1727096235.50335: done getting next task for host managed_node1 15500 1727096235.50339: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15500 1727096235.50341: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096235.50357: getting variables 15500 1727096235.50359: in VariableManager get_vars() 15500 1727096235.50402: Calling all_inventory to load vars for managed_node1 15500 1727096235.50404: Calling groups_inventory to load vars for managed_node1 15500 1727096235.50406: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096235.50419: Calling all_plugins_play to load vars for managed_node1 15500 1727096235.50422: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096235.50425: Calling groups_plugins_play to load vars for managed_node1 15500 1727096235.50993: done sending task result for task 0afff68d-5257-877d-2da0-00000000005c 15500 1727096235.50997: WORKER PROCESS EXITING 15500 1727096235.51995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096235.54101: done with get_vars() 15500 1727096235.54143: done getting variables 15500 1727096235.54273: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:57:15 -0400 (0:00:00.068) 0:00:35.586 ****** 15500 1727096235.54326: entering _queue_task() for managed_node1/fail 15500 1727096235.54838: worker is 1 (out of 1 available) 15500 1727096235.55008: exiting _queue_task() for managed_node1/fail 15500 1727096235.55018: done queuing things up, now waiting for results queue to drain 15500 1727096235.55020: waiting for pending results... 
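
The two "Abort applying the network state configuration ..." tasks in this part of the trace are skipped because the guard network_state != {} evaluates to False (network_state stays at the role's default of {}). A minimal sketch of that guard pattern, assuming a fail task of roughly this shape; the message text is a placeholder, and only the condition actually recorded in the trace is reproduced:

- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying `network_state` is not supported with the initscripts provider  # placeholder message
  when: "network_state != {}"  # the real role likely adds a provider check too; the trace only records this condition
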
15500 1727096235.55316: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15500 1727096235.55480: in run() - task 0afff68d-5257-877d-2da0-00000000005d 15500 1727096235.55506: variable 'ansible_search_path' from source: unknown 15500 1727096235.55515: variable 'ansible_search_path' from source: unknown 15500 1727096235.55564: calling self._execute() 15500 1727096235.55691: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096235.55704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096235.55775: variable 'omit' from source: magic vars 15500 1727096235.56299: variable 'ansible_distribution_major_version' from source: facts 15500 1727096235.56334: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096235.56536: variable 'network_state' from source: role '' defaults 15500 1727096235.56540: Evaluated conditional (network_state != {}): False 15500 1727096235.56543: when evaluation is False, skipping this task 15500 1727096235.56545: _execute() done 15500 1727096235.56548: dumping result to json 15500 1727096235.56550: done dumping result, returning 15500 1727096235.56554: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-877d-2da0-00000000005d] 15500 1727096235.56556: sending task result for task 0afff68d-5257-877d-2da0-00000000005d 15500 1727096235.56932: done sending task result for task 0afff68d-5257-877d-2da0-00000000005d 15500 1727096235.56935: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15500 1727096235.56997: no more pending results, returning what we have 15500 1727096235.57000: results queue empty 15500 1727096235.57001: checking for any_errors_fatal 15500 1727096235.57008: done checking for any_errors_fatal 15500 1727096235.57009: checking for max_fail_percentage 15500 1727096235.57011: done checking for max_fail_percentage 15500 1727096235.57012: checking to see if all hosts have failed and the running result is not ok 15500 1727096235.57013: done checking to see if all hosts have failed 15500 1727096235.57014: getting the remaining hosts for this loop 15500 1727096235.57015: done getting the remaining hosts for this loop 15500 1727096235.57018: getting the next task for host managed_node1 15500 1727096235.57024: done getting next task for host managed_node1 15500 1727096235.57028: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15500 1727096235.57031: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096235.57050: getting variables 15500 1727096235.57052: in VariableManager get_vars() 15500 1727096235.57098: Calling all_inventory to load vars for managed_node1 15500 1727096235.57101: Calling groups_inventory to load vars for managed_node1 15500 1727096235.57104: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096235.57115: Calling all_plugins_play to load vars for managed_node1 15500 1727096235.57118: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096235.57121: Calling groups_plugins_play to load vars for managed_node1 15500 1727096235.58789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096235.60719: done with get_vars() 15500 1727096235.60873: done getting variables 15500 1727096235.61198: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 08:57:15 -0400 (0:00:00.069) 0:00:35.655 ****** 15500 1727096235.61234: entering _queue_task() for managed_node1/fail 15500 1727096235.62374: worker is 1 (out of 1 available) 15500 1727096235.62394: exiting _queue_task() for managed_node1/fail 15500 1727096235.62408: done queuing things up, now waiting for results queue to drain 15500 1727096235.62409: waiting for pending results... 
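
The teaming-abort task queued above is evaluated a little further down in the trace: the role counts connections of type "team" in both network_connections and network_state before deciding whether to fail, using the selectattr filter chain quoted in the skip result. A standalone sketch of that expression with placeholder data follows (the actual values of the profile/interface play vars are not visible in this log); with the placeholder ethernet connection it prints False, matching the skip seen below.

- name: Evaluate the teaming guard expression (illustrative, placeholder data)
  vars:
    network_connections:
      - name: example-profile   # placeholder; the play builds this from its profile/interface vars
        type: ethernet
    network_state: {}
  ansible.builtin.debug:
    msg: >-
      {{ (network_connections | selectattr("type", "defined")
          | selectattr("type", "match", "^team$") | list | length > 0)
         or (network_state.get("interfaces", []) | selectattr("type", "defined")
             | selectattr("type", "match", "^team$") | list | length > 0) }}
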
15500 1727096235.62787: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15500 1727096235.62838: in run() - task 0afff68d-5257-877d-2da0-00000000005e 15500 1727096235.62864: variable 'ansible_search_path' from source: unknown 15500 1727096235.62875: variable 'ansible_search_path' from source: unknown 15500 1727096235.62921: calling self._execute() 15500 1727096235.63038: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096235.63051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096235.63073: variable 'omit' from source: magic vars 15500 1727096235.63673: variable 'ansible_distribution_major_version' from source: facts 15500 1727096235.63678: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096235.63707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096235.65972: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096235.66132: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096235.66174: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096235.66223: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096235.66252: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096235.66390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096235.66577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096235.66605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096235.66696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096235.66893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096235.67003: variable 'ansible_distribution_major_version' from source: facts 15500 1727096235.67019: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15500 1727096235.67172: variable 'ansible_distribution' from source: facts 15500 1727096235.67176: variable '__network_rh_distros' from source: role '' defaults 15500 1727096235.67196: Evaluated conditional (ansible_distribution in __network_rh_distros): True 15500 1727096235.67866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096235.67873: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096235.67876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096235.67924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096235.67943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096235.68092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096235.68120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096235.68197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096235.68275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096235.68291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096235.68341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096235.68371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096235.68553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096235.68661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096235.68794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096235.69507: variable 'network_connections' from source: play vars 15500 1727096235.69512: variable 'profile' from source: play vars 15500 1727096235.69514: variable 'profile' from source: play vars 15500 1727096235.69516: variable 'interface' from source: set_fact 15500 1727096235.69573: variable 'interface' from source: set_fact 15500 1727096235.69583: variable 'network_state' from source: role '' defaults 15500 
1727096235.69925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096235.70270: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096235.70475: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096235.70478: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096235.70481: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096235.70799: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096235.70823: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096235.70857: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096235.70884: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096235.71108: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 15500 1727096235.71111: when evaluation is False, skipping this task 15500 1727096235.71114: _execute() done 15500 1727096235.71117: dumping result to json 15500 1727096235.71119: done dumping result, returning 15500 1727096235.71129: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-877d-2da0-00000000005e] 15500 1727096235.71132: sending task result for task 0afff68d-5257-877d-2da0-00000000005e 15500 1727096235.71308: done sending task result for task 0afff68d-5257-877d-2da0-00000000005e 15500 1727096235.71311: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 15500 1727096235.71358: no more pending results, returning what we have 15500 1727096235.71362: results queue empty 15500 1727096235.71363: checking for any_errors_fatal 15500 1727096235.71372: done checking for any_errors_fatal 15500 1727096235.71373: checking for max_fail_percentage 15500 1727096235.71375: done checking for max_fail_percentage 15500 1727096235.71376: checking to see if all hosts have failed and the running result is not ok 15500 1727096235.71377: done checking to see if all hosts have failed 15500 1727096235.71378: getting the remaining hosts for this loop 15500 1727096235.71379: done getting the remaining hosts for this loop 15500 1727096235.71383: getting the next 
task for host managed_node1 15500 1727096235.71389: done getting next task for host managed_node1 15500 1727096235.71476: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15500 1727096235.71479: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096235.71493: getting variables 15500 1727096235.71495: in VariableManager get_vars() 15500 1727096235.71739: Calling all_inventory to load vars for managed_node1 15500 1727096235.71742: Calling groups_inventory to load vars for managed_node1 15500 1727096235.71744: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096235.71755: Calling all_plugins_play to load vars for managed_node1 15500 1727096235.71758: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096235.71761: Calling groups_plugins_play to load vars for managed_node1 15500 1727096235.74257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096235.76296: done with get_vars() 15500 1727096235.76321: done getting variables 15500 1727096235.76398: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 08:57:15 -0400 (0:00:00.151) 0:00:35.807 ****** 15500 1727096235.76429: entering _queue_task() for managed_node1/dnf 15500 1727096235.76936: worker is 1 (out of 1 available) 15500 1727096235.77018: exiting _queue_task() for managed_node1/dnf 15500 1727096235.77031: done queuing things up, now waiting for results queue to drain 15500 1727096235.77032: waiting for pending results... 
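Note: the "Abort applying teaming configuration ..." task above is skipped because neither network_connections nor network_state defines a team-type profile. As a rough, illustrative reconstruction only (not the actual role source in the role's tasks/main.yml), a task guarded by the conditionals the log evaluates could look like the sketch below; the ansible.builtin.fail module and the message text are assumptions, while the task name and the when expressions are copied from the "Evaluated conditional" lines above:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on EL10 or later   # message text is assumed
      when:
        # conditions copied from the "Evaluated conditional" log entries above
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros
        - network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0
          or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0

On this run the last condition evaluates to False, so the task is skipped rather than failing the play.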
15500 1727096235.77786: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15500 1727096235.77794: in run() - task 0afff68d-5257-877d-2da0-00000000005f 15500 1727096235.77798: variable 'ansible_search_path' from source: unknown 15500 1727096235.77801: variable 'ansible_search_path' from source: unknown 15500 1727096235.77817: calling self._execute() 15500 1727096235.77926: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096235.78174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096235.78177: variable 'omit' from source: magic vars 15500 1727096235.78775: variable 'ansible_distribution_major_version' from source: facts 15500 1727096235.78793: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096235.79002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096235.81441: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096235.81777: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096235.81868: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096235.81940: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096235.82272: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096235.82277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096235.82279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096235.82282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096235.82496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096235.82516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096235.82644: variable 'ansible_distribution' from source: facts 15500 1727096235.82766: variable 'ansible_distribution_major_version' from source: facts 15500 1727096235.82812: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15500 1727096235.83173: variable '__network_wireless_connections_defined' from source: role '' defaults 15500 1727096235.83487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096235.83518: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096235.83549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096235.83609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096235.83628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096235.83677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096235.83705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096235.83732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096235.83779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096235.83797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096235.83843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096235.83875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096235.83906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096235.83950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096235.83973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096235.84137: variable 'network_connections' from source: play vars 15500 1727096235.84155: variable 'profile' from source: play vars 15500 1727096235.84231: variable 'profile' from source: play vars 15500 1727096235.84243: variable 'interface' from source: set_fact 15500 1727096235.84311: variable 'interface' from source: set_fact 15500 1727096235.84394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 15500 1727096235.84571: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096235.84612: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096235.84647: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096235.84686: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096235.84734: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096235.84972: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096235.84984: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096235.84987: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096235.84990: variable '__network_team_connections_defined' from source: role '' defaults 15500 1727096235.85142: variable 'network_connections' from source: play vars 15500 1727096235.85153: variable 'profile' from source: play vars 15500 1727096235.85223: variable 'profile' from source: play vars 15500 1727096235.85233: variable 'interface' from source: set_fact 15500 1727096235.85299: variable 'interface' from source: set_fact 15500 1727096235.85329: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15500 1727096235.85337: when evaluation is False, skipping this task 15500 1727096235.85344: _execute() done 15500 1727096235.85350: dumping result to json 15500 1727096235.85357: done dumping result, returning 15500 1727096235.85373: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-877d-2da0-00000000005f] 15500 1727096235.85382: sending task result for task 0afff68d-5257-877d-2da0-00000000005f skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15500 1727096235.85534: no more pending results, returning what we have 15500 1727096235.85569: results queue empty 15500 1727096235.85570: checking for any_errors_fatal 15500 1727096235.85579: done checking for any_errors_fatal 15500 1727096235.85579: checking for max_fail_percentage 15500 1727096235.85581: done checking for max_fail_percentage 15500 1727096235.85582: checking to see if all hosts have failed and the running result is not ok 15500 1727096235.85583: done checking to see if all hosts have failed 15500 1727096235.85583: getting the remaining hosts for this loop 15500 1727096235.85585: done getting the remaining hosts for this loop 15500 1727096235.85589: getting the next task for host managed_node1 15500 1727096235.85595: done getting next task for host managed_node1 15500 
1727096235.85599: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15500 1727096235.85601: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096235.85618: getting variables 15500 1727096235.85619: in VariableManager get_vars() 15500 1727096235.85770: Calling all_inventory to load vars for managed_node1 15500 1727096235.85773: Calling groups_inventory to load vars for managed_node1 15500 1727096235.85776: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096235.85786: Calling all_plugins_play to load vars for managed_node1 15500 1727096235.85788: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096235.85791: Calling groups_plugins_play to load vars for managed_node1 15500 1727096235.86310: done sending task result for task 0afff68d-5257-877d-2da0-00000000005f 15500 1727096235.86315: WORKER PROCESS EXITING 15500 1727096235.88659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096235.90436: done with get_vars() 15500 1727096235.90481: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15500 1727096235.90562: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 08:57:15 -0400 (0:00:00.141) 0:00:35.949 ****** 15500 1727096235.90602: entering _queue_task() for managed_node1/yum 15500 1727096235.90994: worker is 1 (out of 1 available) 15500 1727096235.91147: exiting _queue_task() for managed_node1/yum 15500 1727096235.91159: done queuing things up, now waiting for results queue to drain 15500 1727096235.91161: waiting for pending results... 
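Note: the DNF check above short-circuits on the wireless/team guard. Only the dnf action plugin, the task name, and the two evaluated conditionals are visible in the log; the module arguments in the sketch below (name, state, check_mode) are therefore assumptions added for illustration:

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"   # assumed argument; the log only shows the dnf action plugin being loaded
        state: latest
      check_mode: true                   # assumed; an availability check rather than an install
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined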
15500 1727096235.91432: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15500 1727096235.91489: in run() - task 0afff68d-5257-877d-2da0-000000000060 15500 1727096235.91508: variable 'ansible_search_path' from source: unknown 15500 1727096235.91512: variable 'ansible_search_path' from source: unknown 15500 1727096235.91551: calling self._execute() 15500 1727096235.91848: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096235.91852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096235.91855: variable 'omit' from source: magic vars 15500 1727096235.92278: variable 'ansible_distribution_major_version' from source: facts 15500 1727096235.92282: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096235.92284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096235.94644: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096235.94720: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096235.94754: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096235.94786: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096235.94810: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096235.94896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096235.94942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096235.94969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096235.95011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096235.95024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096235.95272: variable 'ansible_distribution_major_version' from source: facts 15500 1727096235.95275: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15500 1727096235.95277: when evaluation is False, skipping this task 15500 1727096235.95279: _execute() done 15500 1727096235.95281: dumping result to json 15500 1727096235.95282: done dumping result, returning 15500 1727096235.95285: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-877d-2da0-000000000060] 15500 
1727096235.95287: sending task result for task 0afff68d-5257-877d-2da0-000000000060 15500 1727096235.95347: done sending task result for task 0afff68d-5257-877d-2da0-000000000060 15500 1727096235.95350: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15500 1727096235.95411: no more pending results, returning what we have 15500 1727096235.95415: results queue empty 15500 1727096235.95415: checking for any_errors_fatal 15500 1727096235.95423: done checking for any_errors_fatal 15500 1727096235.95424: checking for max_fail_percentage 15500 1727096235.95426: done checking for max_fail_percentage 15500 1727096235.95427: checking to see if all hosts have failed and the running result is not ok 15500 1727096235.95428: done checking to see if all hosts have failed 15500 1727096235.95429: getting the remaining hosts for this loop 15500 1727096235.95430: done getting the remaining hosts for this loop 15500 1727096235.95435: getting the next task for host managed_node1 15500 1727096235.95440: done getting next task for host managed_node1 15500 1727096235.95444: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15500 1727096235.95446: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096235.95458: getting variables 15500 1727096235.95460: in VariableManager get_vars() 15500 1727096235.95502: Calling all_inventory to load vars for managed_node1 15500 1727096235.95505: Calling groups_inventory to load vars for managed_node1 15500 1727096235.95507: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096235.95518: Calling all_plugins_play to load vars for managed_node1 15500 1727096235.95521: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096235.95524: Calling groups_plugins_play to load vars for managed_node1 15500 1727096235.97315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096235.99126: done with get_vars() 15500 1727096235.99160: done getting variables 15500 1727096235.99334: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 08:57:15 -0400 (0:00:00.087) 0:00:36.036 ****** 15500 1727096235.99368: entering _queue_task() for managed_node1/fail 15500 1727096236.00132: worker is 1 (out of 1 available) 15500 1727096236.00146: exiting _queue_task() for managed_node1/fail 15500 1727096236.00276: done queuing things up, now waiting for results queue to drain 15500 1727096236.00278: waiting for pending results... 
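Note: the YUM variant is the legacy counterpart of the DNF check; it is skipped here because ansible_distribution_major_version | int < 8 is false on this host, and the log also shows ansible-core redirecting ansible.builtin.yum to ansible.builtin.dnf. A hypothetical sketch follows, with the wireless/team guard assumed by analogy with the DNF task and the module arguments assumed as well:

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:   # redirected to ansible.builtin.dnf by ansible-core, per the log above
        name: "{{ network_packages }}"   # assumed argument
        state: latest
      check_mode: true                   # assumed
      when:
        - ansible_distribution_major_version | int < 8
        - __network_wireless_connections_defined or __network_team_connections_defined   # assumed by analogy with the DNF task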
15500 1727096236.01290: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15500 1727096236.01296: in run() - task 0afff68d-5257-877d-2da0-000000000061 15500 1727096236.01300: variable 'ansible_search_path' from source: unknown 15500 1727096236.01303: variable 'ansible_search_path' from source: unknown 15500 1727096236.01306: calling self._execute() 15500 1727096236.01309: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096236.01313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096236.01315: variable 'omit' from source: magic vars 15500 1727096236.01602: variable 'ansible_distribution_major_version' from source: facts 15500 1727096236.01762: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096236.01766: variable '__network_wireless_connections_defined' from source: role '' defaults 15500 1727096236.01934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096236.04337: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096236.04412: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096236.04449: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096236.04493: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096236.04519: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096236.04607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096236.04651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096236.04678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096236.04726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096236.04740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096236.04787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096236.04819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096236.04843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096236.04882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096236.04896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096236.05172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096236.05176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096236.05179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096236.05181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096236.05183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096236.05290: variable 'network_connections' from source: play vars 15500 1727096236.05294: variable 'profile' from source: play vars 15500 1727096236.05389: variable 'profile' from source: play vars 15500 1727096236.05393: variable 'interface' from source: set_fact 15500 1727096236.05457: variable 'interface' from source: set_fact 15500 1727096236.05649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096236.06157: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096236.06199: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096236.06231: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096236.06261: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096236.06416: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096236.06437: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096236.06463: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096236.06487: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096236.06654: 
variable '__network_team_connections_defined' from source: role '' defaults 15500 1727096236.07211: variable 'network_connections' from source: play vars 15500 1727096236.07215: variable 'profile' from source: play vars 15500 1727096236.07473: variable 'profile' from source: play vars 15500 1727096236.07477: variable 'interface' from source: set_fact 15500 1727096236.07479: variable 'interface' from source: set_fact 15500 1727096236.07494: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15500 1727096236.07497: when evaluation is False, skipping this task 15500 1727096236.07499: _execute() done 15500 1727096236.07502: dumping result to json 15500 1727096236.07504: done dumping result, returning 15500 1727096236.07516: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-877d-2da0-000000000061] 15500 1727096236.07527: sending task result for task 0afff68d-5257-877d-2da0-000000000061 15500 1727096236.07616: done sending task result for task 0afff68d-5257-877d-2da0-000000000061 15500 1727096236.07619: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15500 1727096236.07674: no more pending results, returning what we have 15500 1727096236.07677: results queue empty 15500 1727096236.07678: checking for any_errors_fatal 15500 1727096236.07689: done checking for any_errors_fatal 15500 1727096236.07690: checking for max_fail_percentage 15500 1727096236.07692: done checking for max_fail_percentage 15500 1727096236.07693: checking to see if all hosts have failed and the running result is not ok 15500 1727096236.07694: done checking to see if all hosts have failed 15500 1727096236.07694: getting the remaining hosts for this loop 15500 1727096236.07696: done getting the remaining hosts for this loop 15500 1727096236.07700: getting the next task for host managed_node1 15500 1727096236.07708: done getting next task for host managed_node1 15500 1727096236.07712: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15500 1727096236.07714: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096236.07727: getting variables 15500 1727096236.07729: in VariableManager get_vars() 15500 1727096236.07772: Calling all_inventory to load vars for managed_node1 15500 1727096236.07775: Calling groups_inventory to load vars for managed_node1 15500 1727096236.07778: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096236.07790: Calling all_plugins_play to load vars for managed_node1 15500 1727096236.07911: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096236.07917: Calling groups_plugins_play to load vars for managed_node1 15500 1727096236.09561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096236.11241: done with get_vars() 15500 1727096236.11284: done getting variables 15500 1727096236.11346: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 08:57:16 -0400 (0:00:00.120) 0:00:36.156 ****** 15500 1727096236.11388: entering _queue_task() for managed_node1/package 15500 1727096236.11772: worker is 1 (out of 1 available) 15500 1727096236.11785: exiting _queue_task() for managed_node1/package 15500 1727096236.11798: done queuing things up, now waiting for results queue to drain 15500 1727096236.11799: waiting for pending results... 15500 1727096236.12348: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 15500 1727096236.12776: in run() - task 0afff68d-5257-877d-2da0-000000000062 15500 1727096236.12781: variable 'ansible_search_path' from source: unknown 15500 1727096236.12783: variable 'ansible_search_path' from source: unknown 15500 1727096236.12787: calling self._execute() 15500 1727096236.12957: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096236.12964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096236.12976: variable 'omit' from source: magic vars 15500 1727096236.13795: variable 'ansible_distribution_major_version' from source: facts 15500 1727096236.13806: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096236.14200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096236.14774: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096236.14778: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096236.14780: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096236.14783: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096236.14923: variable 'network_packages' from source: role '' defaults 15500 1727096236.15224: variable '__network_provider_setup' from source: role '' defaults 15500 1727096236.15228: variable '__network_service_name_default_nm' from source: role '' defaults 15500 1727096236.15231: variable 
'__network_service_name_default_nm' from source: role '' defaults 15500 1727096236.15235: variable '__network_packages_default_nm' from source: role '' defaults 15500 1727096236.15237: variable '__network_packages_default_nm' from source: role '' defaults 15500 1727096236.15436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096236.18087: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096236.18146: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096236.18188: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096236.18224: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096236.18250: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096236.18338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096236.18365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096236.18397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096236.18440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096236.18455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096236.18506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096236.18534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096236.18560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096236.18596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096236.18617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096236.18862: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15500 1727096236.19196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096236.19199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096236.19202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096236.19204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096236.19206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096236.19208: variable 'ansible_python' from source: facts 15500 1727096236.19230: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15500 1727096236.19321: variable '__network_wpa_supplicant_required' from source: role '' defaults 15500 1727096236.19417: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15500 1727096236.19577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096236.19610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096236.19634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096236.19673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096236.19686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096236.19740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096236.19764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096236.19787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096236.19836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096236.19850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096236.19998: variable 'network_connections' from source: play vars 15500 1727096236.20003: variable 'profile' from source: play vars 15500 1727096236.20109: variable 'profile' from source: play vars 15500 1727096236.20116: variable 'interface' from source: set_fact 15500 1727096236.20184: variable 'interface' from source: set_fact 15500 1727096236.20264: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096236.20290: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096236.20328: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096236.20356: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096236.20401: variable '__network_wireless_connections_defined' from source: role '' defaults 15500 1727096236.20697: variable 'network_connections' from source: play vars 15500 1727096236.20701: variable 'profile' from source: play vars 15500 1727096236.20807: variable 'profile' from source: play vars 15500 1727096236.20813: variable 'interface' from source: set_fact 15500 1727096236.20887: variable 'interface' from source: set_fact 15500 1727096236.20921: variable '__network_packages_default_wireless' from source: role '' defaults 15500 1727096236.21026: variable '__network_wireless_connections_defined' from source: role '' defaults 15500 1727096236.21349: variable 'network_connections' from source: play vars 15500 1727096236.21353: variable 'profile' from source: play vars 15500 1727096236.21475: variable 'profile' from source: play vars 15500 1727096236.21478: variable 'interface' from source: set_fact 15500 1727096236.21532: variable 'interface' from source: set_fact 15500 1727096236.21561: variable '__network_packages_default_team' from source: role '' defaults 15500 1727096236.21642: variable '__network_team_connections_defined' from source: role '' defaults 15500 1727096236.21988: variable 'network_connections' from source: play vars 15500 1727096236.21991: variable 'profile' from source: play vars 15500 1727096236.22163: variable 'profile' from source: play vars 15500 1727096236.22169: variable 'interface' from source: set_fact 15500 1727096236.22391: variable 'interface' from source: set_fact 15500 1727096236.22396: variable '__network_service_name_default_initscripts' from source: role '' defaults 15500 1727096236.22399: variable '__network_service_name_default_initscripts' from source: role '' defaults 15500 1727096236.22402: variable '__network_packages_default_initscripts' from source: role '' defaults 15500 1727096236.22464: variable '__network_packages_default_initscripts' from source: role '' defaults 15500 1727096236.23209: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15500 1727096236.23963: variable 'network_connections' from source: play vars 15500 1727096236.23967: variable 'profile' from source: play vars 15500 
1727096236.24031: variable 'profile' from source: play vars 15500 1727096236.24034: variable 'interface' from source: set_fact 15500 1727096236.24107: variable 'interface' from source: set_fact 15500 1727096236.24118: variable 'ansible_distribution' from source: facts 15500 1727096236.24132: variable '__network_rh_distros' from source: role '' defaults 15500 1727096236.24138: variable 'ansible_distribution_major_version' from source: facts 15500 1727096236.24163: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15500 1727096236.24336: variable 'ansible_distribution' from source: facts 15500 1727096236.24340: variable '__network_rh_distros' from source: role '' defaults 15500 1727096236.24345: variable 'ansible_distribution_major_version' from source: facts 15500 1727096236.24361: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15500 1727096236.24533: variable 'ansible_distribution' from source: facts 15500 1727096236.24536: variable '__network_rh_distros' from source: role '' defaults 15500 1727096236.24672: variable 'ansible_distribution_major_version' from source: facts 15500 1727096236.24676: variable 'network_provider' from source: set_fact 15500 1727096236.24678: variable 'ansible_facts' from source: unknown 15500 1727096236.25315: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 15500 1727096236.25319: when evaluation is False, skipping this task 15500 1727096236.25322: _execute() done 15500 1727096236.25324: dumping result to json 15500 1727096236.25326: done dumping result, returning 15500 1727096236.25334: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-877d-2da0-000000000062] 15500 1727096236.25339: sending task result for task 0afff68d-5257-877d-2da0-000000000062 15500 1727096236.25443: done sending task result for task 0afff68d-5257-877d-2da0-000000000062 15500 1727096236.25445: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15500 1727096236.25500: no more pending results, returning what we have 15500 1727096236.25503: results queue empty 15500 1727096236.25504: checking for any_errors_fatal 15500 1727096236.25512: done checking for any_errors_fatal 15500 1727096236.25513: checking for max_fail_percentage 15500 1727096236.25515: done checking for max_fail_percentage 15500 1727096236.25516: checking to see if all hosts have failed and the running result is not ok 15500 1727096236.25517: done checking to see if all hosts have failed 15500 1727096236.25517: getting the remaining hosts for this loop 15500 1727096236.25519: done getting the remaining hosts for this loop 15500 1727096236.25523: getting the next task for host managed_node1 15500 1727096236.25529: done getting next task for host managed_node1 15500 1727096236.25533: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15500 1727096236.25535: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096236.25549: getting variables 15500 1727096236.25551: in VariableManager get_vars() 15500 1727096236.25595: Calling all_inventory to load vars for managed_node1 15500 1727096236.25598: Calling groups_inventory to load vars for managed_node1 15500 1727096236.25601: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096236.25617: Calling all_plugins_play to load vars for managed_node1 15500 1727096236.25620: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096236.25623: Calling groups_plugins_play to load vars for managed_node1 15500 1727096236.27652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096236.29354: done with get_vars() 15500 1727096236.29397: done getting variables 15500 1727096236.29463: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 08:57:16 -0400 (0:00:00.181) 0:00:36.338 ****** 15500 1727096236.29506: entering _queue_task() for managed_node1/package 15500 1727096236.29941: worker is 1 (out of 1 available) 15500 1727096236.29956: exiting _queue_task() for managed_node1/package 15500 1727096236.29971: done queuing things up, now waiting for results queue to drain 15500 1727096236.29973: waiting for pending results... 
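For readers tracing the role logic, the "Install packages" skip recorded above corresponds to a conditional package task of roughly the following shape. This is a minimal sketch reconstructed from the false_condition string shown in the skip result; the variable name network_packages and the condition come from the log, while the module choice and task layout are assumptions rather than the role's verbatim source.

# Sketch (assumed shape) of a package task gated on the package-facts subset test.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  # Skips when every entry in network_packages already appears in the package
  # facts gathered earlier; this is the condition that evaluated to False above.
  when: not network_packages is subset(ansible_facts.packages.keys())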
15500 1727096236.30238: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15500 1727096236.30675: in run() - task 0afff68d-5257-877d-2da0-000000000063 15500 1727096236.30681: variable 'ansible_search_path' from source: unknown 15500 1727096236.30685: variable 'ansible_search_path' from source: unknown 15500 1727096236.30689: calling self._execute() 15500 1727096236.30693: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096236.30695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096236.30698: variable 'omit' from source: magic vars 15500 1727096236.30983: variable 'ansible_distribution_major_version' from source: facts 15500 1727096236.30994: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096236.31473: variable 'network_state' from source: role '' defaults 15500 1727096236.31478: Evaluated conditional (network_state != {}): False 15500 1727096236.31482: when evaluation is False, skipping this task 15500 1727096236.31495: _execute() done 15500 1727096236.31499: dumping result to json 15500 1727096236.31501: done dumping result, returning 15500 1727096236.31511: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-877d-2da0-000000000063] 15500 1727096236.31514: sending task result for task 0afff68d-5257-877d-2da0-000000000063 15500 1727096236.31629: done sending task result for task 0afff68d-5257-877d-2da0-000000000063 15500 1727096236.31633: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15500 1727096236.31685: no more pending results, returning what we have 15500 1727096236.31689: results queue empty 15500 1727096236.31690: checking for any_errors_fatal 15500 1727096236.31697: done checking for any_errors_fatal 15500 1727096236.31697: checking for max_fail_percentage 15500 1727096236.31699: done checking for max_fail_percentage 15500 1727096236.31700: checking to see if all hosts have failed and the running result is not ok 15500 1727096236.31701: done checking to see if all hosts have failed 15500 1727096236.31702: getting the remaining hosts for this loop 15500 1727096236.31703: done getting the remaining hosts for this loop 15500 1727096236.31707: getting the next task for host managed_node1 15500 1727096236.31828: done getting next task for host managed_node1 15500 1727096236.31832: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15500 1727096236.31836: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096236.31851: getting variables 15500 1727096236.31853: in VariableManager get_vars() 15500 1727096236.31895: Calling all_inventory to load vars for managed_node1 15500 1727096236.31898: Calling groups_inventory to load vars for managed_node1 15500 1727096236.31901: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096236.31914: Calling all_plugins_play to load vars for managed_node1 15500 1727096236.31917: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096236.31920: Calling groups_plugins_play to load vars for managed_node1 15500 1727096236.34204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096236.37531: done with get_vars() 15500 1727096236.37775: done getting variables 15500 1727096236.37835: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 08:57:16 -0400 (0:00:00.083) 0:00:36.421 ****** 15500 1727096236.37880: entering _queue_task() for managed_node1/package 15500 1727096236.38761: worker is 1 (out of 1 available) 15500 1727096236.38775: exiting _queue_task() for managed_node1/package 15500 1727096236.38787: done queuing things up, now waiting for results queue to drain 15500 1727096236.38789: waiting for pending results... 
15500 1727096236.39241: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15500 1727096236.39490: in run() - task 0afff68d-5257-877d-2da0-000000000064 15500 1727096236.39503: variable 'ansible_search_path' from source: unknown 15500 1727096236.39580: variable 'ansible_search_path' from source: unknown 15500 1727096236.39584: calling self._execute() 15500 1727096236.40260: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096236.40265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096236.40270: variable 'omit' from source: magic vars 15500 1727096236.40749: variable 'ansible_distribution_major_version' from source: facts 15500 1727096236.40764: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096236.41111: variable 'network_state' from source: role '' defaults 15500 1727096236.41131: Evaluated conditional (network_state != {}): False 15500 1727096236.41135: when evaluation is False, skipping this task 15500 1727096236.41138: _execute() done 15500 1727096236.41141: dumping result to json 15500 1727096236.41142: done dumping result, returning 15500 1727096236.41150: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-877d-2da0-000000000064] 15500 1727096236.41351: sending task result for task 0afff68d-5257-877d-2da0-000000000064 15500 1727096236.41424: done sending task result for task 0afff68d-5257-877d-2da0-000000000064 15500 1727096236.41428: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15500 1727096236.41487: no more pending results, returning what we have 15500 1727096236.41492: results queue empty 15500 1727096236.41492: checking for any_errors_fatal 15500 1727096236.41504: done checking for any_errors_fatal 15500 1727096236.41505: checking for max_fail_percentage 15500 1727096236.41508: done checking for max_fail_percentage 15500 1727096236.41509: checking to see if all hosts have failed and the running result is not ok 15500 1727096236.41510: done checking to see if all hosts have failed 15500 1727096236.41511: getting the remaining hosts for this loop 15500 1727096236.41513: done getting the remaining hosts for this loop 15500 1727096236.41517: getting the next task for host managed_node1 15500 1727096236.41595: done getting next task for host managed_node1 15500 1727096236.41600: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15500 1727096236.41603: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096236.41619: getting variables 15500 1727096236.41622: in VariableManager get_vars() 15500 1727096236.41840: Calling all_inventory to load vars for managed_node1 15500 1727096236.41845: Calling groups_inventory to load vars for managed_node1 15500 1727096236.41907: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096236.41921: Calling all_plugins_play to load vars for managed_node1 15500 1727096236.41925: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096236.41928: Calling groups_plugins_play to load vars for managed_node1 15500 1727096236.43877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096236.46454: done with get_vars() 15500 1727096236.46494: done getting variables 15500 1727096236.46571: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:57:16 -0400 (0:00:00.087) 0:00:36.509 ****** 15500 1727096236.46604: entering _queue_task() for managed_node1/service 15500 1727096236.47084: worker is 1 (out of 1 available) 15500 1727096236.47102: exiting _queue_task() for managed_node1/service 15500 1727096236.47117: done queuing things up, now waiting for results queue to drain 15500 1727096236.47119: waiting for pending results... 
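The two install tasks above ("Install NetworkManager and nmstate when using network_state variable" at tasks/main.yml:85 and "Install python3-libnmstate when using network_state variable" at tasks/main.yml:96) are both skipped because network_state is empty in this play. A minimal sketch of that pattern, assuming ansible.builtin.package and inferring the package names from the task names only (not from the role source), looks like this:

# Sketch (assumed shape): install nmstate tooling only when network_state is used.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}

- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate
    state: present
  when: network_state != {}

Because this play drives the role through network_connections rather than network_state, both conditions evaluate to False and neither package set is requested.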
15500 1727096236.47562: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15500 1727096236.47574: in run() - task 0afff68d-5257-877d-2da0-000000000065 15500 1727096236.47581: variable 'ansible_search_path' from source: unknown 15500 1727096236.47584: variable 'ansible_search_path' from source: unknown 15500 1727096236.47587: calling self._execute() 15500 1727096236.47871: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096236.47876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096236.47880: variable 'omit' from source: magic vars 15500 1727096236.48843: variable 'ansible_distribution_major_version' from source: facts 15500 1727096236.48848: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096236.48937: variable '__network_wireless_connections_defined' from source: role '' defaults 15500 1727096236.49366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096236.52714: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096236.52908: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096236.52944: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096236.53111: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096236.53117: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096236.53209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096236.53249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096236.53302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096236.53655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096236.53661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096236.53664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096236.53669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096236.53766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 15500 1727096236.53771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096236.53974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096236.53980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096236.53983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096236.53985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096236.54199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096236.54319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096236.54606: variable 'network_connections' from source: play vars 15500 1727096236.54689: variable 'profile' from source: play vars 15500 1727096236.55073: variable 'profile' from source: play vars 15500 1727096236.55077: variable 'interface' from source: set_fact 15500 1727096236.55079: variable 'interface' from source: set_fact 15500 1727096236.55374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096236.55377: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096236.55429: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096236.55463: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096236.55490: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096236.55545: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096236.55566: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096236.55590: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096236.55617: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096236.55672: variable '__network_team_connections_defined' from source: role '' defaults 15500 
1727096236.56072: variable 'network_connections' from source: play vars 15500 1727096236.56075: variable 'profile' from source: play vars 15500 1727096236.56078: variable 'profile' from source: play vars 15500 1727096236.56081: variable 'interface' from source: set_fact 15500 1727096236.56083: variable 'interface' from source: set_fact 15500 1727096236.56099: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15500 1727096236.56102: when evaluation is False, skipping this task 15500 1727096236.56104: _execute() done 15500 1727096236.56106: dumping result to json 15500 1727096236.56108: done dumping result, returning 15500 1727096236.56113: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-877d-2da0-000000000065] 15500 1727096236.56123: sending task result for task 0afff68d-5257-877d-2da0-000000000065 15500 1727096236.56213: done sending task result for task 0afff68d-5257-877d-2da0-000000000065 15500 1727096236.56216: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15500 1727096236.56302: no more pending results, returning what we have 15500 1727096236.56305: results queue empty 15500 1727096236.56306: checking for any_errors_fatal 15500 1727096236.56311: done checking for any_errors_fatal 15500 1727096236.56312: checking for max_fail_percentage 15500 1727096236.56314: done checking for max_fail_percentage 15500 1727096236.56315: checking to see if all hosts have failed and the running result is not ok 15500 1727096236.56316: done checking to see if all hosts have failed 15500 1727096236.56316: getting the remaining hosts for this loop 15500 1727096236.56318: done getting the remaining hosts for this loop 15500 1727096236.56322: getting the next task for host managed_node1 15500 1727096236.56329: done getting next task for host managed_node1 15500 1727096236.56332: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15500 1727096236.56334: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096236.56348: getting variables 15500 1727096236.56350: in VariableManager get_vars() 15500 1727096236.56393: Calling all_inventory to load vars for managed_node1 15500 1727096236.56396: Calling groups_inventory to load vars for managed_node1 15500 1727096236.56398: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096236.56409: Calling all_plugins_play to load vars for managed_node1 15500 1727096236.56412: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096236.56414: Calling groups_plugins_play to load vars for managed_node1 15500 1727096236.59171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096236.61048: done with get_vars() 15500 1727096236.61086: done getting variables 15500 1727096236.61156: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:57:16 -0400 (0:00:00.145) 0:00:36.655 ****** 15500 1727096236.61196: entering _queue_task() for managed_node1/service 15500 1727096236.61629: worker is 1 (out of 1 available) 15500 1727096236.61643: exiting _queue_task() for managed_node1/service 15500 1727096236.61656: done queuing things up, now waiting for results queue to drain 15500 1727096236.61657: waiting for pending results... 15500 1727096236.61946: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15500 1727096236.62276: in run() - task 0afff68d-5257-877d-2da0-000000000066 15500 1727096236.62281: variable 'ansible_search_path' from source: unknown 15500 1727096236.62283: variable 'ansible_search_path' from source: unknown 15500 1727096236.62286: calling self._execute() 15500 1727096236.62289: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096236.62291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096236.62294: variable 'omit' from source: magic vars 15500 1727096236.62680: variable 'ansible_distribution_major_version' from source: facts 15500 1727096236.62773: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096236.63070: variable 'network_provider' from source: set_fact 15500 1727096236.63077: variable 'network_state' from source: role '' defaults 15500 1727096236.63112: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15500 1727096236.63120: variable 'omit' from source: magic vars 15500 1727096236.63164: variable 'omit' from source: magic vars 15500 1727096236.63302: variable 'network_service_name' from source: role '' defaults 15500 1727096236.63398: variable 'network_service_name' from source: role '' defaults 15500 1727096236.63704: variable '__network_provider_setup' from source: role '' defaults 15500 1727096236.63710: variable '__network_service_name_default_nm' from source: role '' defaults 15500 1727096236.63892: variable '__network_service_name_default_nm' from source: role '' defaults 15500 1727096236.63902: variable '__network_packages_default_nm' from source: role '' defaults 
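Two service-management decisions are visible here: "Restart NetworkManager due to wireless or team interfaces" is skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined holds for this profile, while "Enable and start NetworkManager" proceeds because the combined condition network_provider == "nm" or network_state != {} evaluates to True. A sketch of that pair, with service names and states assumed from the task names and the logged conditions rather than taken from the role source:

# Sketch (assumed shape) of the conditional service handling seen above.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}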
15500 1727096236.64078: variable '__network_packages_default_nm' from source: role '' defaults 15500 1727096236.64539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096236.67608: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096236.67685: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096236.67722: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096236.67766: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096236.67804: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096236.67894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096236.67925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096236.67949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096236.67999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096236.68014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096236.68060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096236.68081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096236.68116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096236.68173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096236.68176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096236.68573: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15500 1727096236.68783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096236.68786: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096236.68788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096236.68791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096236.68799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096236.68895: variable 'ansible_python' from source: facts 15500 1727096236.68926: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15500 1727096236.69272: variable '__network_wpa_supplicant_required' from source: role '' defaults 15500 1727096236.69278: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15500 1727096236.69280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096236.69307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096236.69330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096236.69408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096236.69426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096236.69481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096236.69514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096236.69540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096236.69581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096236.69606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096236.69753: variable 'network_connections' from 
source: play vars 15500 1727096236.69904: variable 'profile' from source: play vars 15500 1727096236.69909: variable 'profile' from source: play vars 15500 1727096236.69913: variable 'interface' from source: set_fact 15500 1727096236.70072: variable 'interface' from source: set_fact 15500 1727096236.70084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096236.70308: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096236.70410: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096236.70470: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096236.70518: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096236.70625: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096236.70628: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096236.70871: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096236.70876: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096236.70878: variable '__network_wireless_connections_defined' from source: role '' defaults 15500 1727096236.71302: variable 'network_connections' from source: play vars 15500 1727096236.71308: variable 'profile' from source: play vars 15500 1727096236.71535: variable 'profile' from source: play vars 15500 1727096236.71541: variable 'interface' from source: set_fact 15500 1727096236.71712: variable 'interface' from source: set_fact 15500 1727096236.71749: variable '__network_packages_default_wireless' from source: role '' defaults 15500 1727096236.71880: variable '__network_wireless_connections_defined' from source: role '' defaults 15500 1727096236.72666: variable 'network_connections' from source: play vars 15500 1727096236.72687: variable 'profile' from source: play vars 15500 1727096236.72776: variable 'profile' from source: play vars 15500 1727096236.72792: variable 'interface' from source: set_fact 15500 1727096236.72879: variable 'interface' from source: set_fact 15500 1727096236.72918: variable '__network_packages_default_team' from source: role '' defaults 15500 1727096236.73014: variable '__network_team_connections_defined' from source: role '' defaults 15500 1727096236.73346: variable 'network_connections' from source: play vars 15500 1727096236.73357: variable 'profile' from source: play vars 15500 1727096236.73442: variable 'profile' from source: play vars 15500 1727096236.73453: variable 'interface' from source: set_fact 15500 1727096236.73536: variable 'interface' from source: set_fact 15500 1727096236.73616: variable '__network_service_name_default_initscripts' from source: role '' defaults 15500 1727096236.73687: variable '__network_service_name_default_initscripts' from source: role '' defaults 15500 1727096236.73700: 
variable '__network_packages_default_initscripts' from source: role '' defaults 15500 1727096236.73773: variable '__network_packages_default_initscripts' from source: role '' defaults 15500 1727096236.74012: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15500 1727096236.74699: variable 'network_connections' from source: play vars 15500 1727096236.74709: variable 'profile' from source: play vars 15500 1727096236.74874: variable 'profile' from source: play vars 15500 1727096236.74877: variable 'interface' from source: set_fact 15500 1727096236.74883: variable 'interface' from source: set_fact 15500 1727096236.74886: variable 'ansible_distribution' from source: facts 15500 1727096236.74888: variable '__network_rh_distros' from source: role '' defaults 15500 1727096236.74890: variable 'ansible_distribution_major_version' from source: facts 15500 1727096236.74905: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15500 1727096236.75173: variable 'ansible_distribution' from source: facts 15500 1727096236.75281: variable '__network_rh_distros' from source: role '' defaults 15500 1727096236.75292: variable 'ansible_distribution_major_version' from source: facts 15500 1727096236.75429: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15500 1727096236.75725: variable 'ansible_distribution' from source: facts 15500 1727096236.75780: variable '__network_rh_distros' from source: role '' defaults 15500 1727096236.75791: variable 'ansible_distribution_major_version' from source: facts 15500 1727096236.75910: variable 'network_provider' from source: set_fact 15500 1727096236.75940: variable 'omit' from source: magic vars 15500 1727096236.76174: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096236.76178: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096236.76180: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096236.76182: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096236.76185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096236.76373: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096236.76376: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096236.76378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096236.76626: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096236.76629: Set connection var ansible_pipelining to False 15500 1727096236.76631: Set connection var ansible_timeout to 10 15500 1727096236.76633: Set connection var ansible_shell_type to sh 15500 1727096236.76636: Set connection var ansible_shell_executable to /bin/sh 15500 1727096236.76638: Set connection var ansible_connection to ssh 15500 1727096236.76640: variable 'ansible_shell_executable' from source: unknown 15500 1727096236.76642: variable 'ansible_connection' from source: unknown 15500 1727096236.76644: variable 'ansible_module_compression' from source: unknown 15500 1727096236.76646: variable 'ansible_shell_type' from source: unknown 15500 1727096236.76648: variable 'ansible_shell_executable' from 
source: unknown 15500 1727096236.76650: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096236.76657: variable 'ansible_pipelining' from source: unknown 15500 1727096236.76661: variable 'ansible_timeout' from source: unknown 15500 1727096236.76663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096236.76982: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096236.77014: variable 'omit' from source: magic vars 15500 1727096236.77027: starting attempt loop 15500 1727096236.77034: running the handler 15500 1727096236.77201: variable 'ansible_facts' from source: unknown 15500 1727096236.78044: _low_level_execute_command(): starting 15500 1727096236.78060: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096236.78792: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096236.78873: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096236.78889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096236.78932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096236.78952: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096236.78975: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096236.79116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096236.80852: stdout chunk (state=3): >>>/root <<< 15500 1727096236.81192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096236.81196: stderr chunk (state=3): >>><<< 15500 1727096236.81198: stdout chunk (state=3): >>><<< 15500 1727096236.81201: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096236.81204: _low_level_execute_command(): starting 15500 1727096236.81209: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096236.811545-17032-259916752195531 `" && echo ansible-tmp-1727096236.811545-17032-259916752195531="` echo /root/.ansible/tmp/ansible-tmp-1727096236.811545-17032-259916752195531 `" ) && sleep 0' 15500 1727096236.82105: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096236.82129: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096236.82160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096236.82196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096236.82254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096236.82580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096236.82693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096236.82805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096236.84791: stdout chunk (state=3): >>>ansible-tmp-1727096236.811545-17032-259916752195531=/root/.ansible/tmp/ansible-tmp-1727096236.811545-17032-259916752195531 <<< 15500 1727096236.84934: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096236.84946: stdout chunk (state=3): >>><<< 15500 1727096236.84957: stderr chunk (state=3): >>><<< 15500 1727096236.84984: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096236.811545-17032-259916752195531=/root/.ansible/tmp/ansible-tmp-1727096236.811545-17032-259916752195531 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096236.85029: variable 'ansible_module_compression' from source: unknown 15500 1727096236.85104: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 15500 1727096236.85171: variable 'ansible_facts' from source: unknown 15500 1727096236.85388: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096236.811545-17032-259916752195531/AnsiballZ_systemd.py 15500 1727096236.85679: Sending initial data 15500 1727096236.85682: Sent initial data (155 bytes) 15500 1727096236.86318: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096236.86336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096236.86438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096236.86657: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096236.86739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096236.88373: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15500 1727096236.88384: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096236.88439: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15500 1727096236.88506: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpg40gse15 /root/.ansible/tmp/ansible-tmp-1727096236.811545-17032-259916752195531/AnsiballZ_systemd.py <<< 15500 1727096236.88509: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096236.811545-17032-259916752195531/AnsiballZ_systemd.py" <<< 15500 1727096236.88573: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpg40gse15" to remote "/root/.ansible/tmp/ansible-tmp-1727096236.811545-17032-259916752195531/AnsiballZ_systemd.py" <<< 15500 1727096236.88577: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096236.811545-17032-259916752195531/AnsiballZ_systemd.py" <<< 15500 1727096236.90081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096236.90088: stderr chunk (state=3): >>><<< 15500 1727096236.90091: stdout chunk (state=3): >>><<< 15500 1727096236.90093: done transferring module to remote 15500 1727096236.90095: _low_level_execute_command(): starting 15500 1727096236.90098: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096236.811545-17032-259916752195531/ /root/.ansible/tmp/ansible-tmp-1727096236.811545-17032-259916752195531/AnsiballZ_systemd.py && sleep 0' 15500 1727096236.90996: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 15500 1727096236.91120: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096236.91287: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096236.91374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096236.93192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096236.93227: stderr chunk (state=3): >>><<< 15500 1727096236.93230: stdout chunk (state=3): >>><<< 15500 1727096236.93242: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096236.93250: _low_level_execute_command(): starting 15500 1727096236.93262: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096236.811545-17032-259916752195531/AnsiballZ_systemd.py && sleep 0' 15500 1727096236.93727: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096236.93731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096236.93733: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096236.93735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096236.93794: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096236.93801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096236.93803: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096236.93874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096237.23531: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": 
"0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainStartTimestampMonotonic": "14125756", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainHandoffTimestampMonotonic": "14143412", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10604544", "MemoryPeak": "14716928", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3306795008", "EffectiveMemoryMax": "3702857728", "EffectiveMemoryHigh": "3702857728", "CPUUsageNSec": "777949000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 15500 1727096237.23539: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": 
"infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": 
"NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target multi-user.target cloud-init.service NetworkManager-wait-online.service network.target", "After": "system.<<< 15500 1727096237.23562: stdout chunk (state=3): >>>slice dbus-broker.service systemd-journald.socket sysinit.target network-pre.target dbus.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:06 EDT", "StateChangeTimestampMonotonic": "260104767", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:00 EDT", "InactiveExitTimestampMonotonic": "14126240", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:01 EDT", "ActiveEnterTimestampMonotonic": "14391210", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ConditionTimestampMonotonic": "14124859", "AssertTimestamp": "Mon 2024-09-23 08:51:00 EDT", "AssertTimestampMonotonic": "14124861", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "96e31adf3b0143aea7f2b03db689d56d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15500 1727096237.25479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 15500 1727096237.25507: stderr chunk (state=3): >>><<< 15500 1727096237.25510: stdout chunk (state=3): >>><<< 15500 1727096237.25528: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainStartTimestampMonotonic": "14125756", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ExecMainHandoffTimestampMonotonic": "14143412", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10604544", "MemoryPeak": "14716928", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3306795008", "EffectiveMemoryMax": "3702857728", "EffectiveMemoryHigh": "3702857728", "CPUUsageNSec": "777949000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target multi-user.target cloud-init.service NetworkManager-wait-online.service network.target", "After": "system.slice dbus-broker.service systemd-journald.socket sysinit.target network-pre.target dbus.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:55:06 EDT", "StateChangeTimestampMonotonic": "260104767", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:00 EDT", "InactiveExitTimestampMonotonic": "14126240", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:01 EDT", "ActiveEnterTimestampMonotonic": "14391210", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:00 EDT", "ConditionTimestampMonotonic": "14124859", "AssertTimestamp": "Mon 2024-09-23 08:51:00 EDT", "AssertTimestampMonotonic": "14124861", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "96e31adf3b0143aea7f2b03db689d56d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 15500 1727096237.25646: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096236.811545-17032-259916752195531/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096237.25665: _low_level_execute_command(): starting 15500 1727096237.25670: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096236.811545-17032-259916752195531/ > /dev/null 2>&1 && sleep 0' 15500 1727096237.26137: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096237.26141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096237.26143: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096237.26145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096237.26197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096237.26201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096237.26207: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096237.26288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096237.28153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096237.28183: stderr chunk (state=3): >>><<< 15500 1727096237.28187: stdout chunk (state=3): >>><<< 15500 1727096237.28200: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096237.28206: handler run complete 15500 1727096237.28244: attempt loop complete, returning result 15500 1727096237.28247: _execute() done 15500 1727096237.28249: dumping result to json 15500 1727096237.28266: done dumping result, returning 15500 1727096237.28280: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-877d-2da0-000000000066] 15500 1727096237.28283: sending task result for task 0afff68d-5257-877d-2da0-000000000066 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15500 1727096237.28515: no more pending results, returning what we have 15500 1727096237.28518: results queue empty 15500 1727096237.28519: checking for any_errors_fatal 15500 1727096237.28528: done checking for any_errors_fatal 15500 1727096237.28529: checking for max_fail_percentage 15500 1727096237.28530: done checking for max_fail_percentage 15500 1727096237.28532: checking to see if all hosts have failed and the running result is not ok 15500 1727096237.28533: done checking to see if all hosts have failed 15500 1727096237.28533: getting the remaining hosts for this loop 15500 1727096237.28535: done getting the remaining hosts for this loop 15500 1727096237.28538: getting the next task for host managed_node1 15500 1727096237.28544: done getting next task for host managed_node1 15500 1727096237.28547: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15500 1727096237.28549: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096237.28558: getting variables 15500 1727096237.28560: in VariableManager get_vars() 15500 1727096237.28622: Calling all_inventory to load vars for managed_node1 15500 1727096237.28625: Calling groups_inventory to load vars for managed_node1 15500 1727096237.28628: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096237.28638: Calling all_plugins_play to load vars for managed_node1 15500 1727096237.28641: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096237.28644: Calling groups_plugins_play to load vars for managed_node1 15500 1727096237.29182: done sending task result for task 0afff68d-5257-877d-2da0-000000000066 15500 1727096237.29185: WORKER PROCESS EXITING 15500 1727096237.29475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096237.30344: done with get_vars() 15500 1727096237.30363: done getting variables 15500 1727096237.30412: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:57:17 -0400 (0:00:00.692) 0:00:37.347 ****** 15500 1727096237.30436: entering _queue_task() for managed_node1/service 15500 1727096237.30713: worker is 1 (out of 1 available) 15500 1727096237.30727: exiting _queue_task() for managed_node1/service 15500 1727096237.30739: done queuing things up, now waiting for results queue to drain 15500 1727096237.30741: waiting for pending results... 
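The ok/censored result above for "Enable and start NetworkManager" comes from the invocation recorded earlier in the module output: ansible.legacy.systemd was called with name=NetworkManager, state=started, enabled=true, scope=system, and the result was hidden because the task sets no_log (visible as _ansible_no_log: True in the _execute_module arguments). As a hedged sketch only, not the role's actual source (the real task lives in roles/network/tasks/main.yml inside fedora.linux_system_roles.network and is not reproduced in this log), a standalone task producing an equivalent invocation would look roughly like:

    # Illustrative reconstruction from the logged module_args; not the
    # role's verbatim task definition.
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager     # "name": "NetworkManager" in module_args
        state: started           # "state": "started"
        enabled: true            # "enabled": true
        scope: system            # "scope": "system"
      no_log: true               # why the play output shows only a "censored" placeholder

The same censoring applies to skipped results: the "Enable network service" skip further down is also reported in the censored form because its result carries no_log as well.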
15500 1727096237.30921: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15500 1727096237.31010: in run() - task 0afff68d-5257-877d-2da0-000000000067 15500 1727096237.31017: variable 'ansible_search_path' from source: unknown 15500 1727096237.31021: variable 'ansible_search_path' from source: unknown 15500 1727096237.31051: calling self._execute() 15500 1727096237.31129: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096237.31133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096237.31142: variable 'omit' from source: magic vars 15500 1727096237.31429: variable 'ansible_distribution_major_version' from source: facts 15500 1727096237.31440: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096237.31573: variable 'network_provider' from source: set_fact 15500 1727096237.31577: Evaluated conditional (network_provider == "nm"): True 15500 1727096237.31657: variable '__network_wpa_supplicant_required' from source: role '' defaults 15500 1727096237.31974: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15500 1727096237.31978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096237.33778: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096237.33822: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096237.33852: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096237.33882: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096237.33902: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096237.34004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096237.34008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096237.34024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096237.34054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096237.34065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096237.34100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096237.34117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 15500 1727096237.34134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096237.34161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096237.34179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096237.34218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15500 1727096237.34240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096237.34260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096237.34377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096237.34380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096237.34433: variable 'network_connections' from source: play vars 15500 1727096237.34445: variable 'profile' from source: play vars 15500 1727096237.34515: variable 'profile' from source: play vars 15500 1727096237.34519: variable 'interface' from source: set_fact 15500 1727096237.34561: variable 'interface' from source: set_fact 15500 1727096237.34618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15500 1727096237.34747: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15500 1727096237.34778: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15500 1727096237.34800: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15500 1727096237.34823: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15500 1727096237.34854: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15500 1727096237.34873: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15500 1727096237.34890: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096237.34908: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15500 1727096237.34947: variable '__network_wireless_connections_defined' from source: role '' defaults 15500 1727096237.35104: variable 'network_connections' from source: play vars 15500 1727096237.35107: variable 'profile' from source: play vars 15500 1727096237.35153: variable 'profile' from source: play vars 15500 1727096237.35156: variable 'interface' from source: set_fact 15500 1727096237.35202: variable 'interface' from source: set_fact 15500 1727096237.35225: Evaluated conditional (__network_wpa_supplicant_required): False 15500 1727096237.35228: when evaluation is False, skipping this task 15500 1727096237.35230: _execute() done 15500 1727096237.35249: dumping result to json 15500 1727096237.35252: done dumping result, returning 15500 1727096237.35254: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-877d-2da0-000000000067] 15500 1727096237.35256: sending task result for task 0afff68d-5257-877d-2da0-000000000067 15500 1727096237.35533: done sending task result for task 0afff68d-5257-877d-2da0-000000000067 15500 1727096237.35536: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15500 1727096237.35573: no more pending results, returning what we have 15500 1727096237.35577: results queue empty 15500 1727096237.35578: checking for any_errors_fatal 15500 1727096237.35589: done checking for any_errors_fatal 15500 1727096237.35590: checking for max_fail_percentage 15500 1727096237.35592: done checking for max_fail_percentage 15500 1727096237.35592: checking to see if all hosts have failed and the running result is not ok 15500 1727096237.35593: done checking to see if all hosts have failed 15500 1727096237.35594: getting the remaining hosts for this loop 15500 1727096237.35595: done getting the remaining hosts for this loop 15500 1727096237.35598: getting the next task for host managed_node1 15500 1727096237.35603: done getting next task for host managed_node1 15500 1727096237.35607: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15500 1727096237.35609: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096237.35622: getting variables 15500 1727096237.35623: in VariableManager get_vars() 15500 1727096237.35664: Calling all_inventory to load vars for managed_node1 15500 1727096237.35668: Calling groups_inventory to load vars for managed_node1 15500 1727096237.35672: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096237.35680: Calling all_plugins_play to load vars for managed_node1 15500 1727096237.35683: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096237.35685: Calling groups_plugins_play to load vars for managed_node1 15500 1727096237.36691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096237.37563: done with get_vars() 15500 1727096237.37582: done getting variables 15500 1727096237.37628: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:57:17 -0400 (0:00:00.072) 0:00:37.419 ****** 15500 1727096237.37649: entering _queue_task() for managed_node1/service 15500 1727096237.37909: worker is 1 (out of 1 available) 15500 1727096237.37922: exiting _queue_task() for managed_node1/service 15500 1727096237.37934: done queuing things up, now waiting for results queue to drain 15500 1727096237.37935: waiting for pending results... 15500 1727096237.38113: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 15500 1727096237.38184: in run() - task 0afff68d-5257-877d-2da0-000000000068 15500 1727096237.38196: variable 'ansible_search_path' from source: unknown 15500 1727096237.38199: variable 'ansible_search_path' from source: unknown 15500 1727096237.38231: calling self._execute() 15500 1727096237.38313: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096237.38316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096237.38326: variable 'omit' from source: magic vars 15500 1727096237.38608: variable 'ansible_distribution_major_version' from source: facts 15500 1727096237.38618: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096237.38700: variable 'network_provider' from source: set_fact 15500 1727096237.38705: Evaluated conditional (network_provider == "initscripts"): False 15500 1727096237.38708: when evaluation is False, skipping this task 15500 1727096237.38711: _execute() done 15500 1727096237.38713: dumping result to json 15500 1727096237.38715: done dumping result, returning 15500 1727096237.38725: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-877d-2da0-000000000068] 15500 1727096237.38728: sending task result for task 0afff68d-5257-877d-2da0-000000000068 15500 1727096237.38809: done sending task result for task 0afff68d-5257-877d-2da0-000000000068 15500 1727096237.38812: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15500 
1727096237.38871: no more pending results, returning what we have 15500 1727096237.38875: results queue empty 15500 1727096237.38876: checking for any_errors_fatal 15500 1727096237.38887: done checking for any_errors_fatal 15500 1727096237.38887: checking for max_fail_percentage 15500 1727096237.38889: done checking for max_fail_percentage 15500 1727096237.38890: checking to see if all hosts have failed and the running result is not ok 15500 1727096237.38891: done checking to see if all hosts have failed 15500 1727096237.38891: getting the remaining hosts for this loop 15500 1727096237.38893: done getting the remaining hosts for this loop 15500 1727096237.38896: getting the next task for host managed_node1 15500 1727096237.38903: done getting next task for host managed_node1 15500 1727096237.38906: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15500 1727096237.38909: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096237.38923: getting variables 15500 1727096237.38925: in VariableManager get_vars() 15500 1727096237.38960: Calling all_inventory to load vars for managed_node1 15500 1727096237.38963: Calling groups_inventory to load vars for managed_node1 15500 1727096237.38965: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096237.38981: Calling all_plugins_play to load vars for managed_node1 15500 1727096237.38984: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096237.38986: Calling groups_plugins_play to load vars for managed_node1 15500 1727096237.39776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096237.40685: done with get_vars() 15500 1727096237.40704: done getting variables 15500 1727096237.40750: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 08:57:17 -0400 (0:00:00.031) 0:00:37.450 ****** 15500 1727096237.40778: entering _queue_task() for managed_node1/copy 15500 1727096237.41042: worker is 1 (out of 1 available) 15500 1727096237.41057: exiting _queue_task() for managed_node1/copy 15500 1727096237.41073: done queuing things up, now waiting for results queue to drain 15500 1727096237.41075: waiting for pending results... 
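The skips in this stretch follow one pattern: after network_provider resolves to "nm", the role evaluates a when condition for each service task and skips it when the condition is false ("Enable and start wpa_supplicant" on __network_wpa_supplicant_required, "Enable network service" on network_provider == "initscripts", and the initscripts file-dependency task just below on the same provider check). As a hedged sketch of that gating pattern, with an assumed module and unit name since the actual task bodies at roles/network/tasks/main.yml:133, :142 and :150 are not shown in this log:

    # Provider-gated task pattern behind the skips in this section;
    # the module choice and unit name here are illustrative assumptions.
    - name: Enable network service
      ansible.builtin.service:
        name: network                             # hypothetical legacy initscripts unit
        state: started
        enabled: true
      when: network_provider == "initscripts"     # False on this run, where network_provider is "nm"

Each skipped task is still reported individually with "skip_reason": "Conditional result was False", as seen above for wpa_supplicant and again in the initscripts dependency task that follows.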
15500 1727096237.41241: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15500 1727096237.41309: in run() - task 0afff68d-5257-877d-2da0-000000000069 15500 1727096237.41321: variable 'ansible_search_path' from source: unknown 15500 1727096237.41325: variable 'ansible_search_path' from source: unknown 15500 1727096237.41353: calling self._execute() 15500 1727096237.41429: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096237.41433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096237.41443: variable 'omit' from source: magic vars 15500 1727096237.41713: variable 'ansible_distribution_major_version' from source: facts 15500 1727096237.41722: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096237.41806: variable 'network_provider' from source: set_fact 15500 1727096237.41811: Evaluated conditional (network_provider == "initscripts"): False 15500 1727096237.41814: when evaluation is False, skipping this task 15500 1727096237.41816: _execute() done 15500 1727096237.41819: dumping result to json 15500 1727096237.41821: done dumping result, returning 15500 1727096237.41830: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-877d-2da0-000000000069] 15500 1727096237.41832: sending task result for task 0afff68d-5257-877d-2da0-000000000069 15500 1727096237.41923: done sending task result for task 0afff68d-5257-877d-2da0-000000000069 15500 1727096237.41925: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15500 1727096237.42001: no more pending results, returning what we have 15500 1727096237.42004: results queue empty 15500 1727096237.42005: checking for any_errors_fatal 15500 1727096237.42010: done checking for any_errors_fatal 15500 1727096237.42011: checking for max_fail_percentage 15500 1727096237.42013: done checking for max_fail_percentage 15500 1727096237.42014: checking to see if all hosts have failed and the running result is not ok 15500 1727096237.42014: done checking to see if all hosts have failed 15500 1727096237.42015: getting the remaining hosts for this loop 15500 1727096237.42017: done getting the remaining hosts for this loop 15500 1727096237.42020: getting the next task for host managed_node1 15500 1727096237.42025: done getting next task for host managed_node1 15500 1727096237.42030: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15500 1727096237.42031: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096237.42044: getting variables 15500 1727096237.42045: in VariableManager get_vars() 15500 1727096237.42080: Calling all_inventory to load vars for managed_node1 15500 1727096237.42082: Calling groups_inventory to load vars for managed_node1 15500 1727096237.42084: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096237.42092: Calling all_plugins_play to load vars for managed_node1 15500 1727096237.42094: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096237.42097: Calling groups_plugins_play to load vars for managed_node1 15500 1727096237.46743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096237.47602: done with get_vars() 15500 1727096237.47619: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 08:57:17 -0400 (0:00:00.068) 0:00:37.519 ****** 15500 1727096237.47670: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 15500 1727096237.47939: worker is 1 (out of 1 available) 15500 1727096237.47952: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 15500 1727096237.47969: done queuing things up, now waiting for results queue to drain 15500 1727096237.47970: waiting for pending results... 15500 1727096237.48143: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15500 1727096237.48225: in run() - task 0afff68d-5257-877d-2da0-00000000006a 15500 1727096237.48237: variable 'ansible_search_path' from source: unknown 15500 1727096237.48240: variable 'ansible_search_path' from source: unknown 15500 1727096237.48271: calling self._execute() 15500 1727096237.48348: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096237.48352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096237.48364: variable 'omit' from source: magic vars 15500 1727096237.48653: variable 'ansible_distribution_major_version' from source: facts 15500 1727096237.48664: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096237.48669: variable 'omit' from source: magic vars 15500 1727096237.48698: variable 'omit' from source: magic vars 15500 1727096237.48811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15500 1727096237.50229: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15500 1727096237.50285: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15500 1727096237.50313: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15500 1727096237.50337: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15500 1727096237.50356: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15500 1727096237.50414: variable 'network_provider' from source: set_fact 15500 1727096237.50506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15500 1727096237.50524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15500 1727096237.50542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15500 1727096237.50570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15500 1727096237.50586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15500 1727096237.50634: variable 'omit' from source: magic vars 15500 1727096237.50712: variable 'omit' from source: magic vars 15500 1727096237.50781: variable 'network_connections' from source: play vars 15500 1727096237.50790: variable 'profile' from source: play vars 15500 1727096237.50836: variable 'profile' from source: play vars 15500 1727096237.50839: variable 'interface' from source: set_fact 15500 1727096237.50885: variable 'interface' from source: set_fact 15500 1727096237.50983: variable 'omit' from source: magic vars 15500 1727096237.50990: variable '__lsr_ansible_managed' from source: task vars 15500 1727096237.51035: variable '__lsr_ansible_managed' from source: task vars 15500 1727096237.51156: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15500 1727096237.51303: Loaded config def from plugin (lookup/template) 15500 1727096237.51307: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15500 1727096237.51327: File lookup term: get_ansible_managed.j2 15500 1727096237.51330: variable 'ansible_search_path' from source: unknown 15500 1727096237.51333: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15500 1727096237.51348: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15500 1727096237.51362: variable 'ansible_search_path' from source: unknown 15500 1727096237.54871: variable 'ansible_managed' from source: unknown 15500 1727096237.54988: variable 'omit' from source: magic 
vars 15500 1727096237.55274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096237.55278: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096237.55281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096237.55284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096237.55286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096237.55288: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096237.55291: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096237.55293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096237.55295: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096237.55297: Set connection var ansible_pipelining to False 15500 1727096237.55300: Set connection var ansible_timeout to 10 15500 1727096237.55302: Set connection var ansible_shell_type to sh 15500 1727096237.55304: Set connection var ansible_shell_executable to /bin/sh 15500 1727096237.55307: Set connection var ansible_connection to ssh 15500 1727096237.55308: variable 'ansible_shell_executable' from source: unknown 15500 1727096237.55310: variable 'ansible_connection' from source: unknown 15500 1727096237.55312: variable 'ansible_module_compression' from source: unknown 15500 1727096237.55315: variable 'ansible_shell_type' from source: unknown 15500 1727096237.55317: variable 'ansible_shell_executable' from source: unknown 15500 1727096237.55319: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096237.55320: variable 'ansible_pipelining' from source: unknown 15500 1727096237.55323: variable 'ansible_timeout' from source: unknown 15500 1727096237.55325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096237.55414: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15500 1727096237.55426: variable 'omit' from source: magic vars 15500 1727096237.55505: starting attempt loop 15500 1727096237.55508: running the handler 15500 1727096237.55511: _low_level_execute_command(): starting 15500 1727096237.55513: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096237.56136: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096237.56149: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096237.56160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096237.56182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096237.56201: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096237.56208: stderr chunk (state=3): >>>debug2: match not found <<< 15500 1727096237.56211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 
1727096237.56270: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15500 1727096237.56274: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 15500 1727096237.56276: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15500 1727096237.56279: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096237.56281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096237.56283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096237.56289: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096237.56297: stderr chunk (state=3): >>>debug2: match found <<< 15500 1727096237.56308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096237.56379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096237.56392: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096237.56401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096237.56518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096237.58451: stdout chunk (state=3): >>>/root <<< 15500 1727096237.58676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096237.58680: stdout chunk (state=3): >>><<< 15500 1727096237.58683: stderr chunk (state=3): >>><<< 15500 1727096237.58686: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096237.58688: _low_level_execute_command(): starting 15500 1727096237.58691: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096237.5863845-17065-82190250936785 `" && echo ansible-tmp-1727096237.5863845-17065-82190250936785="` echo /root/.ansible/tmp/ansible-tmp-1727096237.5863845-17065-82190250936785 `" ) && sleep 0' 15500 1727096237.59272: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096237.59276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096237.59279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096237.59333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096237.59340: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096237.59342: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096237.59408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096237.61424: stdout chunk (state=3): >>>ansible-tmp-1727096237.5863845-17065-82190250936785=/root/.ansible/tmp/ansible-tmp-1727096237.5863845-17065-82190250936785 <<< 15500 1727096237.61554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096237.61565: stderr chunk (state=3): >>><<< 15500 1727096237.61570: stdout chunk (state=3): >>><<< 15500 1727096237.61583: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096237.5863845-17065-82190250936785=/root/.ansible/tmp/ansible-tmp-1727096237.5863845-17065-82190250936785 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096237.61621: variable 'ansible_module_compression' from source: unknown 15500 1727096237.61662: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 15500 1727096237.61685: variable 'ansible_facts' from source: unknown 15500 1727096237.61750: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727096237.5863845-17065-82190250936785/AnsiballZ_network_connections.py 15500 1727096237.61855: Sending initial data 15500 1727096237.61858: Sent initial data (167 bytes) 15500 1727096237.62307: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096237.62311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 15500 1727096237.62313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096237.62316: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096237.62318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096237.62373: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096237.62376: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096237.62383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096237.62450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096237.64073: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096237.64135: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15500 1727096237.64204: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpg5a62nxh /root/.ansible/tmp/ansible-tmp-1727096237.5863845-17065-82190250936785/AnsiballZ_network_connections.py <<< 15500 1727096237.64208: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096237.5863845-17065-82190250936785/AnsiballZ_network_connections.py" <<< 15500 1727096237.64275: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpg5a62nxh" to remote "/root/.ansible/tmp/ansible-tmp-1727096237.5863845-17065-82190250936785/AnsiballZ_network_connections.py" <<< 15500 1727096237.64278: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096237.5863845-17065-82190250936785/AnsiballZ_network_connections.py" <<< 15500 1727096237.65053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096237.65099: stderr chunk (state=3): >>><<< 15500 1727096237.65102: stdout chunk (state=3): >>><<< 15500 1727096237.65137: done transferring module to remote 15500 1727096237.65149: _low_level_execute_command(): starting 15500 1727096237.65154: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096237.5863845-17065-82190250936785/ /root/.ansible/tmp/ansible-tmp-1727096237.5863845-17065-82190250936785/AnsiballZ_network_connections.py && sleep 0' 15500 1727096237.65612: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096237.65616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096237.65618: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096237.65620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 15500 1727096237.65622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096237.65673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096237.65681: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096237.65684: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096237.65744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096237.67605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096237.67631: stderr chunk (state=3): >>><<< 15500 1727096237.67636: stdout chunk (state=3): >>><<< 15500 1727096237.67649: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096237.67652: _low_level_execute_command(): starting 15500 1727096237.67657: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096237.5863845-17065-82190250936785/AnsiballZ_network_connections.py && sleep 0' 15500 1727096237.68104: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096237.68108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096237.68124: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096237.68174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096237.68179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096237.68196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096237.68276: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096237.95689: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_0hmjmkm0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_0hmjmkm0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail 
ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on LSR-TST-br31/915a44a0-07cf-46e3-a30a-d66b6a6317e3: error=unknown <<< 15500 1727096237.95868: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15500 1727096237.97799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 15500 1727096237.97828: stderr chunk (state=3): >>><<< 15500 1727096237.97832: stdout chunk (state=3): >>><<< 15500 1727096237.97850: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_0hmjmkm0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_0hmjmkm0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on LSR-TST-br31/915a44a0-07cf-46e3-a30a-d66b6a6317e3: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 15500 1727096237.97883: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096237.5863845-17065-82190250936785/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096237.97891: _low_level_execute_command(): starting 15500 1727096237.97896: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096237.5863845-17065-82190250936785/ > /dev/null 2>&1 && sleep 0' 15500 1727096237.98339: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096237.98343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096237.98346: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096237.98355: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096237.98415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096237.98418: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096237.98496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096238.00388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096238.00415: stderr chunk (state=3): >>><<< 15500 1727096238.00418: stdout chunk (state=3): >>><<< 15500 1727096238.00433: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096238.00438: handler run complete 15500 1727096238.00459: attempt loop complete, returning result 15500 1727096238.00464: _execute() done 15500 1727096238.00468: dumping result to json 15500 1727096238.00473: done dumping result, returning 15500 1727096238.00482: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-877d-2da0-00000000006a] 15500 1727096238.00485: sending task result for task 0afff68d-5257-877d-2da0-00000000006a 15500 1727096238.00595: done sending task result for task 0afff68d-5257-877d-2da0-00000000006a 15500 1727096238.00598: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 15500 1727096238.00690: no more pending results, returning what we have 15500 1727096238.00693: results queue empty 15500 1727096238.00694: checking for any_errors_fatal 15500 1727096238.00703: done checking for any_errors_fatal 15500 1727096238.00703: checking for max_fail_percentage 15500 1727096238.00705: done checking for max_fail_percentage 15500 1727096238.00711: checking to see if all hosts have failed and the running result is not ok 15500 1727096238.00712: done checking to see if all hosts have failed 15500 1727096238.00712: getting the remaining hosts for this loop 15500 1727096238.00714: done getting the remaining hosts for this loop 15500 1727096238.00717: getting the next task for host managed_node1 15500 1727096238.00723: done getting next task for host managed_node1 15500 1727096238.00726: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15500 1727096238.00728: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096238.00736: getting variables 15500 1727096238.00738: in VariableManager get_vars() 15500 1727096238.00776: Calling all_inventory to load vars for managed_node1 15500 1727096238.00784: Calling groups_inventory to load vars for managed_node1 15500 1727096238.00787: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096238.00796: Calling all_plugins_play to load vars for managed_node1 15500 1727096238.00798: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096238.00801: Calling groups_plugins_play to load vars for managed_node1 15500 1727096238.01607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096238.02583: done with get_vars() 15500 1727096238.02599: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 08:57:18 -0400 (0:00:00.549) 0:00:38.069 ****** 15500 1727096238.02662: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 15500 1727096238.02909: worker is 1 (out of 1 available) 15500 1727096238.02922: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 15500 1727096238.02935: done queuing things up, now waiting for results queue to drain 15500 1727096238.02936: waiting for pending results... 15500 1727096238.03119: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 15500 1727096238.03195: in run() - task 0afff68d-5257-877d-2da0-00000000006b 15500 1727096238.03206: variable 'ansible_search_path' from source: unknown 15500 1727096238.03209: variable 'ansible_search_path' from source: unknown 15500 1727096238.03237: calling self._execute() 15500 1727096238.03319: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096238.03325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096238.03334: variable 'omit' from source: magic vars 15500 1727096238.03621: variable 'ansible_distribution_major_version' from source: facts 15500 1727096238.03629: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096238.03715: variable 'network_state' from source: role '' defaults 15500 1727096238.03724: Evaluated conditional (network_state != {}): False 15500 1727096238.03729: when evaluation is False, skipping this task 15500 1727096238.03731: _execute() done 15500 1727096238.03734: dumping result to json 15500 1727096238.03736: done dumping result, returning 15500 1727096238.03742: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-877d-2da0-00000000006b] 15500 1727096238.03746: sending task result for task 0afff68d-5257-877d-2da0-00000000006b 15500 1727096238.03831: done sending task result for task 0afff68d-5257-877d-2da0-00000000006b 15500 1727096238.03835: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15500 1727096238.03886: no more pending results, returning what we have 15500 1727096238.03890: results queue empty 15500 1727096238.03891: checking for any_errors_fatal 15500 1727096238.03900: done checking for any_errors_fatal 15500 1727096238.03901: checking for max_fail_percentage 15500 
1727096238.03903: done checking for max_fail_percentage 15500 1727096238.03903: checking to see if all hosts have failed and the running result is not ok 15500 1727096238.03904: done checking to see if all hosts have failed 15500 1727096238.03905: getting the remaining hosts for this loop 15500 1727096238.03906: done getting the remaining hosts for this loop 15500 1727096238.03909: getting the next task for host managed_node1 15500 1727096238.03917: done getting next task for host managed_node1 15500 1727096238.03920: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15500 1727096238.03922: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096238.03936: getting variables 15500 1727096238.03937: in VariableManager get_vars() 15500 1727096238.03982: Calling all_inventory to load vars for managed_node1 15500 1727096238.03985: Calling groups_inventory to load vars for managed_node1 15500 1727096238.03987: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096238.03996: Calling all_plugins_play to load vars for managed_node1 15500 1727096238.03998: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096238.04000: Calling groups_plugins_play to load vars for managed_node1 15500 1727096238.04786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096238.05659: done with get_vars() 15500 1727096238.05679: done getting variables 15500 1727096238.05723: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 08:57:18 -0400 (0:00:00.030) 0:00:38.100 ****** 15500 1727096238.05745: entering _queue_task() for managed_node1/debug 15500 1727096238.05990: worker is 1 (out of 1 available) 15500 1727096238.06004: exiting _queue_task() for managed_node1/debug 15500 1727096238.06015: done queuing things up, now waiting for results queue to drain 15500 1727096238.06017: waiting for pending results... 
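The result block a few entries above is the raw return of the fedora.linux_system_roles.network_connections module: `changed: true`, module_args that remove the `LSR-TST-br31` profile, and a NetworkManager traceback (`LsrNetworkNmError: Connection volatilize aborted ... error=unknown`) that the run treats as non-fatal because the module still exited with rc=0. The sketch below is an illustration only, not part of ansible-core: the trimmed JSON literal and variable names are mine, while the keys it reads ("changed", "invocation", "module_args", "connections", "stderr") are exactly the ones visible in the output, and are the fields the later debug tasks read back out of `__network_connections_result`.

```python
import json

# Illustrative only: re-parse a module result of the shape printed above for
# fedora.linux_system_roles.network_connections. The JSON literal is trimmed
# from the log; the keys are the ones visible in the output.
raw_result = (
    '{"changed": true, "warnings": [], "stderr": "\\n", '
    '"invocation": {"module_args": {"provider": "nm", '
    '"connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], '
    '"ignore_errors": false, "force_state_change": false}}}'
)

result = json.loads(raw_result)
args = result["invocation"]["module_args"]
print("changed:", result["changed"])                      # True -> profile was removed
for conn in args["connections"]:
    print(conn["name"], "->", conn["persistent_state"])   # LSR-TST-br31 -> absent
print("stderr_lines:", result["stderr"].splitlines())     # [''] , matching the [""] shown later
```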
15500 1727096238.06193: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15500 1727096238.06269: in run() - task 0afff68d-5257-877d-2da0-00000000006c 15500 1727096238.06281: variable 'ansible_search_path' from source: unknown 15500 1727096238.06285: variable 'ansible_search_path' from source: unknown 15500 1727096238.06313: calling self._execute() 15500 1727096238.06389: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096238.06395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096238.06404: variable 'omit' from source: magic vars 15500 1727096238.06685: variable 'ansible_distribution_major_version' from source: facts 15500 1727096238.06695: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096238.06702: variable 'omit' from source: magic vars 15500 1727096238.06732: variable 'omit' from source: magic vars 15500 1727096238.06762: variable 'omit' from source: magic vars 15500 1727096238.06797: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096238.06824: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096238.06841: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096238.06854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096238.06865: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096238.06890: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096238.06895: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096238.06898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096238.06966: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096238.06971: Set connection var ansible_pipelining to False 15500 1727096238.06976: Set connection var ansible_timeout to 10 15500 1727096238.06979: Set connection var ansible_shell_type to sh 15500 1727096238.06984: Set connection var ansible_shell_executable to /bin/sh 15500 1727096238.06989: Set connection var ansible_connection to ssh 15500 1727096238.07009: variable 'ansible_shell_executable' from source: unknown 15500 1727096238.07012: variable 'ansible_connection' from source: unknown 15500 1727096238.07015: variable 'ansible_module_compression' from source: unknown 15500 1727096238.07018: variable 'ansible_shell_type' from source: unknown 15500 1727096238.07020: variable 'ansible_shell_executable' from source: unknown 15500 1727096238.07022: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096238.07025: variable 'ansible_pipelining' from source: unknown 15500 1727096238.07027: variable 'ansible_timeout' from source: unknown 15500 1727096238.07029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096238.07131: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 
1727096238.07141: variable 'omit' from source: magic vars 15500 1727096238.07144: starting attempt loop 15500 1727096238.07147: running the handler 15500 1727096238.07242: variable '__network_connections_result' from source: set_fact 15500 1727096238.07284: handler run complete 15500 1727096238.07297: attempt loop complete, returning result 15500 1727096238.07300: _execute() done 15500 1727096238.07303: dumping result to json 15500 1727096238.07305: done dumping result, returning 15500 1727096238.07313: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-877d-2da0-00000000006c] 15500 1727096238.07315: sending task result for task 0afff68d-5257-877d-2da0-00000000006c 15500 1727096238.07401: done sending task result for task 0afff68d-5257-877d-2da0-00000000006c 15500 1727096238.07404: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 15500 1727096238.07484: no more pending results, returning what we have 15500 1727096238.07487: results queue empty 15500 1727096238.07488: checking for any_errors_fatal 15500 1727096238.07496: done checking for any_errors_fatal 15500 1727096238.07497: checking for max_fail_percentage 15500 1727096238.07498: done checking for max_fail_percentage 15500 1727096238.07499: checking to see if all hosts have failed and the running result is not ok 15500 1727096238.07499: done checking to see if all hosts have failed 15500 1727096238.07500: getting the remaining hosts for this loop 15500 1727096238.07501: done getting the remaining hosts for this loop 15500 1727096238.07505: getting the next task for host managed_node1 15500 1727096238.07510: done getting next task for host managed_node1 15500 1727096238.07516: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15500 1727096238.07517: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096238.07526: getting variables 15500 1727096238.07528: in VariableManager get_vars() 15500 1727096238.07557: Calling all_inventory to load vars for managed_node1 15500 1727096238.07562: Calling groups_inventory to load vars for managed_node1 15500 1727096238.07564: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096238.07573: Calling all_plugins_play to load vars for managed_node1 15500 1727096238.07576: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096238.07578: Calling groups_plugins_play to load vars for managed_node1 15500 1727096238.08497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096238.09362: done with get_vars() 15500 1727096238.09383: done getting variables 15500 1727096238.09423: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:57:18 -0400 (0:00:00.036) 0:00:38.137 ****** 15500 1727096238.09444: entering _queue_task() for managed_node1/debug 15500 1727096238.09692: worker is 1 (out of 1 available) 15500 1727096238.09706: exiting _queue_task() for managed_node1/debug 15500 1727096238.09716: done queuing things up, now waiting for results queue to drain 15500 1727096238.09717: waiting for pending results... 15500 1727096238.09893: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15500 1727096238.09963: in run() - task 0afff68d-5257-877d-2da0-00000000006d 15500 1727096238.09974: variable 'ansible_search_path' from source: unknown 15500 1727096238.09978: variable 'ansible_search_path' from source: unknown 15500 1727096238.10006: calling self._execute() 15500 1727096238.10081: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096238.10085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096238.10095: variable 'omit' from source: magic vars 15500 1727096238.10371: variable 'ansible_distribution_major_version' from source: facts 15500 1727096238.10385: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096238.10388: variable 'omit' from source: magic vars 15500 1727096238.10416: variable 'omit' from source: magic vars 15500 1727096238.10441: variable 'omit' from source: magic vars 15500 1727096238.10475: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096238.10504: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096238.10521: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096238.10534: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096238.10544: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096238.10569: variable 
'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096238.10573: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096238.10575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096238.10648: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096238.10651: Set connection var ansible_pipelining to False 15500 1727096238.10657: Set connection var ansible_timeout to 10 15500 1727096238.10662: Set connection var ansible_shell_type to sh 15500 1727096238.10665: Set connection var ansible_shell_executable to /bin/sh 15500 1727096238.10670: Set connection var ansible_connection to ssh 15500 1727096238.10686: variable 'ansible_shell_executable' from source: unknown 15500 1727096238.10689: variable 'ansible_connection' from source: unknown 15500 1727096238.10692: variable 'ansible_module_compression' from source: unknown 15500 1727096238.10695: variable 'ansible_shell_type' from source: unknown 15500 1727096238.10697: variable 'ansible_shell_executable' from source: unknown 15500 1727096238.10699: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096238.10708: variable 'ansible_pipelining' from source: unknown 15500 1727096238.10711: variable 'ansible_timeout' from source: unknown 15500 1727096238.10715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096238.10812: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096238.10821: variable 'omit' from source: magic vars 15500 1727096238.10829: starting attempt loop 15500 1727096238.10832: running the handler 15500 1727096238.10870: variable '__network_connections_result' from source: set_fact 15500 1727096238.10928: variable '__network_connections_result' from source: set_fact 15500 1727096238.10995: handler run complete 15500 1727096238.11011: attempt loop complete, returning result 15500 1727096238.11014: _execute() done 15500 1727096238.11016: dumping result to json 15500 1727096238.11019: done dumping result, returning 15500 1727096238.11027: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-877d-2da0-00000000006d] 15500 1727096238.11029: sending task result for task 0afff68d-5257-877d-2da0-00000000006d 15500 1727096238.11119: done sending task result for task 0afff68d-5257-877d-2da0-00000000006d 15500 1727096238.11121: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 15500 1727096238.11216: no more pending results, returning what we have 15500 1727096238.11220: results queue empty 15500 1727096238.11220: checking for any_errors_fatal 15500 1727096238.11225: done checking for any_errors_fatal 15500 1727096238.11225: checking for max_fail_percentage 15500 1727096238.11227: done checking for max_fail_percentage 15500 1727096238.11228: checking to 
see if all hosts have failed and the running result is not ok 15500 1727096238.11229: done checking to see if all hosts have failed 15500 1727096238.11229: getting the remaining hosts for this loop 15500 1727096238.11231: done getting the remaining hosts for this loop 15500 1727096238.11234: getting the next task for host managed_node1 15500 1727096238.11240: done getting next task for host managed_node1 15500 1727096238.11243: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15500 1727096238.11245: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096238.11255: getting variables 15500 1727096238.11256: in VariableManager get_vars() 15500 1727096238.11289: Calling all_inventory to load vars for managed_node1 15500 1727096238.11292: Calling groups_inventory to load vars for managed_node1 15500 1727096238.11294: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096238.11301: Calling all_plugins_play to load vars for managed_node1 15500 1727096238.11304: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096238.11306: Calling groups_plugins_play to load vars for managed_node1 15500 1727096238.12082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096238.12959: done with get_vars() 15500 1727096238.12978: done getting variables 15500 1727096238.13019: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:57:18 -0400 (0:00:00.035) 0:00:38.173 ****** 15500 1727096238.13045: entering _queue_task() for managed_node1/debug 15500 1727096238.13282: worker is 1 (out of 1 available) 15500 1727096238.13296: exiting _queue_task() for managed_node1/debug 15500 1727096238.13308: done queuing things up, now waiting for results queue to drain 15500 1727096238.13310: waiting for pending results... 
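Every task in this stretch of the log is gated by the same two conditionals: `ansible_distribution_major_version != '6'`, which evaluates True and lets the task proceed, and `network_state != {}`, which evaluates False because `network_state` comes from the role defaults (an empty dict), so the state-related tasks are skipped while the connection-related ones run. A minimal sketch of those two gates, using a placeholder where the log does not print the actual fact value:

```python
# Minimal sketch of the two gates seen in this run. The gate() helper and the
# literal "9" for the distribution major version are placeholders (the log only
# shows that the value is not '6'); network_state = {} mirrors
# "variable 'network_state' from source: role '' defaults".
ansible_distribution_major_version = "9"   # placeholder: any value other than '6'
network_state = {}                         # role default -> state tasks are skipped

def gate(task_name, condition):
    print(f"{task_name}: {'run' if condition else 'skip (conditional False)'}")

gate("Configure networking state", network_state != {})                    # skip
gate("Show debug messages for the network_state", network_state != {})     # skip
gate("Re-test connectivity", ansible_distribution_major_version != "6")    # run
```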
15500 1727096238.13481: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15500 1727096238.13548: in run() - task 0afff68d-5257-877d-2da0-00000000006e 15500 1727096238.13561: variable 'ansible_search_path' from source: unknown 15500 1727096238.13565: variable 'ansible_search_path' from source: unknown 15500 1727096238.13593: calling self._execute() 15500 1727096238.13670: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096238.13675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096238.13682: variable 'omit' from source: magic vars 15500 1727096238.13955: variable 'ansible_distribution_major_version' from source: facts 15500 1727096238.13965: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096238.14047: variable 'network_state' from source: role '' defaults 15500 1727096238.14056: Evaluated conditional (network_state != {}): False 15500 1727096238.14062: when evaluation is False, skipping this task 15500 1727096238.14065: _execute() done 15500 1727096238.14069: dumping result to json 15500 1727096238.14072: done dumping result, returning 15500 1727096238.14074: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-877d-2da0-00000000006e] 15500 1727096238.14085: sending task result for task 0afff68d-5257-877d-2da0-00000000006e 15500 1727096238.14170: done sending task result for task 0afff68d-5257-877d-2da0-00000000006e 15500 1727096238.14174: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 15500 1727096238.14226: no more pending results, returning what we have 15500 1727096238.14229: results queue empty 15500 1727096238.14230: checking for any_errors_fatal 15500 1727096238.14240: done checking for any_errors_fatal 15500 1727096238.14241: checking for max_fail_percentage 15500 1727096238.14242: done checking for max_fail_percentage 15500 1727096238.14243: checking to see if all hosts have failed and the running result is not ok 15500 1727096238.14244: done checking to see if all hosts have failed 15500 1727096238.14245: getting the remaining hosts for this loop 15500 1727096238.14246: done getting the remaining hosts for this loop 15500 1727096238.14249: getting the next task for host managed_node1 15500 1727096238.14254: done getting next task for host managed_node1 15500 1727096238.14261: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15500 1727096238.14263: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096238.14278: getting variables 15500 1727096238.14279: in VariableManager get_vars() 15500 1727096238.14309: Calling all_inventory to load vars for managed_node1 15500 1727096238.14311: Calling groups_inventory to load vars for managed_node1 15500 1727096238.14313: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096238.14321: Calling all_plugins_play to load vars for managed_node1 15500 1727096238.14323: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096238.14325: Calling groups_plugins_play to load vars for managed_node1 15500 1727096238.15226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096238.16084: done with get_vars() 15500 1727096238.16101: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:57:18 -0400 (0:00:00.031) 0:00:38.204 ****** 15500 1727096238.16168: entering _queue_task() for managed_node1/ping 15500 1727096238.16402: worker is 1 (out of 1 available) 15500 1727096238.16415: exiting _queue_task() for managed_node1/ping 15500 1727096238.16427: done queuing things up, now waiting for results queue to drain 15500 1727096238.16429: waiting for pending results... 15500 1727096238.16599: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 15500 1727096238.16674: in run() - task 0afff68d-5257-877d-2da0-00000000006f 15500 1727096238.16685: variable 'ansible_search_path' from source: unknown 15500 1727096238.16689: variable 'ansible_search_path' from source: unknown 15500 1727096238.16717: calling self._execute() 15500 1727096238.16794: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096238.16799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096238.16810: variable 'omit' from source: magic vars 15500 1727096238.17082: variable 'ansible_distribution_major_version' from source: facts 15500 1727096238.17094: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096238.17098: variable 'omit' from source: magic vars 15500 1727096238.17126: variable 'omit' from source: magic vars 15500 1727096238.17150: variable 'omit' from source: magic vars 15500 1727096238.17185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096238.17214: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096238.17232: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096238.17245: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096238.17254: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096238.17280: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096238.17283: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096238.17285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096238.17357: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096238.17363: Set 
connection var ansible_pipelining to False 15500 1727096238.17366: Set connection var ansible_timeout to 10 15500 1727096238.17370: Set connection var ansible_shell_type to sh 15500 1727096238.17375: Set connection var ansible_shell_executable to /bin/sh 15500 1727096238.17380: Set connection var ansible_connection to ssh 15500 1727096238.17397: variable 'ansible_shell_executable' from source: unknown 15500 1727096238.17400: variable 'ansible_connection' from source: unknown 15500 1727096238.17403: variable 'ansible_module_compression' from source: unknown 15500 1727096238.17405: variable 'ansible_shell_type' from source: unknown 15500 1727096238.17408: variable 'ansible_shell_executable' from source: unknown 15500 1727096238.17411: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096238.17414: variable 'ansible_pipelining' from source: unknown 15500 1727096238.17416: variable 'ansible_timeout' from source: unknown 15500 1727096238.17425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096238.17565: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15500 1727096238.17576: variable 'omit' from source: magic vars 15500 1727096238.17581: starting attempt loop 15500 1727096238.17584: running the handler 15500 1727096238.17594: _low_level_execute_command(): starting 15500 1727096238.17601: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096238.18113: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096238.18118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096238.18122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096238.18171: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096238.18174: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096238.18194: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096238.18262: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096238.19984: stdout chunk (state=3): >>>/root <<< 15500 1727096238.20083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096238.20112: stderr chunk (state=3): >>><<< 15500 1727096238.20116: stdout chunk (state=3): >>><<< 15500 1727096238.20141: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096238.20153: _low_level_execute_command(): starting 15500 1727096238.20161: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096238.2014086-17092-281032547757976 `" && echo ansible-tmp-1727096238.2014086-17092-281032547757976="` echo /root/.ansible/tmp/ansible-tmp-1727096238.2014086-17092-281032547757976 `" ) && sleep 0' 15500 1727096238.20605: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096238.20609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096238.20612: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096238.20621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096238.20624: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096238.20671: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096238.20675: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096238.20682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096238.20746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096238.22712: stdout chunk (state=3): >>>ansible-tmp-1727096238.2014086-17092-281032547757976=/root/.ansible/tmp/ansible-tmp-1727096238.2014086-17092-281032547757976 <<< 15500 1727096238.22814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096238.22844: stderr chunk (state=3): >>><<< 15500 
1727096238.22847: stdout chunk (state=3): >>><<< 15500 1727096238.22865: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096238.2014086-17092-281032547757976=/root/.ansible/tmp/ansible-tmp-1727096238.2014086-17092-281032547757976 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096238.22903: variable 'ansible_module_compression' from source: unknown 15500 1727096238.22938: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 15500 1727096238.22967: variable 'ansible_facts' from source: unknown 15500 1727096238.23019: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096238.2014086-17092-281032547757976/AnsiballZ_ping.py 15500 1727096238.23122: Sending initial data 15500 1727096238.23125: Sent initial data (153 bytes) 15500 1727096238.23541: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096238.23550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096238.23574: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096238.23577: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096238.23590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found <<< 15500 1727096238.23592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096238.23641: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096238.23644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096238.23650: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 
1727096238.23714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096238.25302: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 15500 1727096238.25306: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096238.25365: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15500 1727096238.25437: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp35wwhy02 /root/.ansible/tmp/ansible-tmp-1727096238.2014086-17092-281032547757976/AnsiballZ_ping.py <<< 15500 1727096238.25440: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096238.2014086-17092-281032547757976/AnsiballZ_ping.py" <<< 15500 1727096238.25502: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp35wwhy02" to remote "/root/.ansible/tmp/ansible-tmp-1727096238.2014086-17092-281032547757976/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096238.2014086-17092-281032547757976/AnsiballZ_ping.py" <<< 15500 1727096238.26106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096238.26143: stderr chunk (state=3): >>><<< 15500 1727096238.26146: stdout chunk (state=3): >>><<< 15500 1727096238.26190: done transferring module to remote 15500 1727096238.26203: _low_level_execute_command(): starting 15500 1727096238.26206: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096238.2014086-17092-281032547757976/ /root/.ansible/tmp/ansible-tmp-1727096238.2014086-17092-281032547757976/AnsiballZ_ping.py && sleep 0' 15500 1727096238.26640: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096238.26644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096238.26646: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 15500 1727096238.26648: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096238.26703: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096238.26706: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096238.26783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096238.28635: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096238.28658: stderr chunk (state=3): >>><<< 15500 1727096238.28661: stdout chunk (state=3): >>><<< 15500 1727096238.28677: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096238.28682: _low_level_execute_command(): starting 15500 1727096238.28685: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096238.2014086-17092-281032547757976/AnsiballZ_ping.py && sleep 0' 15500 1727096238.29111: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096238.29114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 15500 1727096238.29116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096238.29118: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096238.29123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096238.29174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096238.29178: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096238.29257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096238.44475: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15500 1727096238.46074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. <<< 15500 1727096238.46078: stdout chunk (state=3): >>><<< 15500 1727096238.46081: stderr chunk (state=3): >>><<< 15500 1727096238.46085: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
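The {"ping": "pong"} payload returned above is the normal success result of the ansible.builtin.ping module, which just echoes its data argument (default "pong") back to the controller once a usable Python is found on the target. A minimal sketch of a task that produces this round trip is shown below; it is illustrative only and is not copied from the fedora.linux_system_roles.network role, whose actual "Re-test connectivity" task may set additional options.

# Illustrative connectivity check (sketch; task name taken from the log above,
# module options assumed to be the defaults).
- name: Re-test connectivity
  ansible.builtin.ping:
  # With no arguments the module returns {"ping": "pong"}, matching the
  # stdout chunk captured above.
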
15500 1727096238.46087: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096238.2014086-17092-281032547757976/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096238.46090: _low_level_execute_command(): starting 15500 1727096238.46095: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096238.2014086-17092-281032547757976/ > /dev/null 2>&1 && sleep 0' 15500 1727096238.46700: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096238.46708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096238.46719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096238.46762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096238.46766: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096238.46771: stderr chunk (state=3): >>>debug2: match not found <<< 15500 1727096238.46774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096238.46776: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15500 1727096238.46778: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 15500 1727096238.46784: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15500 1727096238.46879: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096238.46882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096238.46886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096238.46892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096238.46994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096238.48934: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096238.48938: stdout chunk (state=3): >>><<< 15500 1727096238.48940: stderr chunk (state=3): >>><<< 15500 1727096238.48956: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096238.48971: handler run complete 15500 1727096238.49073: attempt loop complete, returning result 15500 1727096238.49076: _execute() done 15500 1727096238.49078: dumping result to json 15500 1727096238.49082: done dumping result, returning 15500 1727096238.49085: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-877d-2da0-00000000006f] 15500 1727096238.49087: sending task result for task 0afff68d-5257-877d-2da0-00000000006f 15500 1727096238.49145: done sending task result for task 0afff68d-5257-877d-2da0-00000000006f 15500 1727096238.49148: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 15500 1727096238.49227: no more pending results, returning what we have 15500 1727096238.49230: results queue empty 15500 1727096238.49231: checking for any_errors_fatal 15500 1727096238.49236: done checking for any_errors_fatal 15500 1727096238.49237: checking for max_fail_percentage 15500 1727096238.49238: done checking for max_fail_percentage 15500 1727096238.49239: checking to see if all hosts have failed and the running result is not ok 15500 1727096238.49240: done checking to see if all hosts have failed 15500 1727096238.49241: getting the remaining hosts for this loop 15500 1727096238.49242: done getting the remaining hosts for this loop 15500 1727096238.49246: getting the next task for host managed_node1 15500 1727096238.49253: done getting next task for host managed_node1 15500 1727096238.49255: ^ task is: TASK: meta (role_complete) 15500 1727096238.49256: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096238.49264: getting variables 15500 1727096238.49266: in VariableManager get_vars() 15500 1727096238.49305: Calling all_inventory to load vars for managed_node1 15500 1727096238.49308: Calling groups_inventory to load vars for managed_node1 15500 1727096238.49310: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096238.49318: Calling all_plugins_play to load vars for managed_node1 15500 1727096238.49320: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096238.49322: Calling groups_plugins_play to load vars for managed_node1 15500 1727096238.50197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096238.51064: done with get_vars() 15500 1727096238.51082: done getting variables 15500 1727096238.51153: done queuing things up, now waiting for results queue to drain 15500 1727096238.51154: results queue empty 15500 1727096238.51155: checking for any_errors_fatal 15500 1727096238.51156: done checking for any_errors_fatal 15500 1727096238.51157: checking for max_fail_percentage 15500 1727096238.51158: done checking for max_fail_percentage 15500 1727096238.51158: checking to see if all hosts have failed and the running result is not ok 15500 1727096238.51159: done checking to see if all hosts have failed 15500 1727096238.51160: getting the remaining hosts for this loop 15500 1727096238.51160: done getting the remaining hosts for this loop 15500 1727096238.51162: getting the next task for host managed_node1 15500 1727096238.51165: done getting next task for host managed_node1 15500 1727096238.51166: ^ task is: TASK: meta (flush_handlers) 15500 1727096238.51169: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096238.51171: getting variables 15500 1727096238.51172: in VariableManager get_vars() 15500 1727096238.51180: Calling all_inventory to load vars for managed_node1 15500 1727096238.51182: Calling groups_inventory to load vars for managed_node1 15500 1727096238.51183: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096238.51186: Calling all_plugins_play to load vars for managed_node1 15500 1727096238.51188: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096238.51192: Calling groups_plugins_play to load vars for managed_node1 15500 1727096238.52326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096238.53183: done with get_vars() 15500 1727096238.53197: done getting variables 15500 1727096238.53230: in VariableManager get_vars() 15500 1727096238.53238: Calling all_inventory to load vars for managed_node1 15500 1727096238.53240: Calling groups_inventory to load vars for managed_node1 15500 1727096238.53241: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096238.53244: Calling all_plugins_play to load vars for managed_node1 15500 1727096238.53245: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096238.53247: Calling groups_plugins_play to load vars for managed_node1 15500 1727096238.53880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096238.56319: done with get_vars() 15500 1727096238.56359: done queuing things up, now waiting for results queue to drain 15500 1727096238.56361: results queue empty 15500 1727096238.56362: checking for any_errors_fatal 15500 1727096238.56364: done checking for any_errors_fatal 15500 1727096238.56365: checking for max_fail_percentage 15500 1727096238.56366: done checking for max_fail_percentage 15500 1727096238.56366: checking to see if all hosts have failed and the running result is not ok 15500 1727096238.56391: done checking to see if all hosts have failed 15500 1727096238.56393: getting the remaining hosts for this loop 15500 1727096238.56394: done getting the remaining hosts for this loop 15500 1727096238.56397: getting the next task for host managed_node1 15500 1727096238.56402: done getting next task for host managed_node1 15500 1727096238.56403: ^ task is: TASK: meta (flush_handlers) 15500 1727096238.56405: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096238.56413: getting variables 15500 1727096238.56414: in VariableManager get_vars() 15500 1727096238.56428: Calling all_inventory to load vars for managed_node1 15500 1727096238.56431: Calling groups_inventory to load vars for managed_node1 15500 1727096238.56433: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096238.56438: Calling all_plugins_play to load vars for managed_node1 15500 1727096238.56441: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096238.56443: Calling groups_plugins_play to load vars for managed_node1 15500 1727096238.58548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096238.61960: done with get_vars() 15500 1727096238.61988: done getting variables 15500 1727096238.62041: in VariableManager get_vars() 15500 1727096238.62054: Calling all_inventory to load vars for managed_node1 15500 1727096238.62056: Calling groups_inventory to load vars for managed_node1 15500 1727096238.62058: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096238.62063: Calling all_plugins_play to load vars for managed_node1 15500 1727096238.62065: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096238.62271: Calling groups_plugins_play to load vars for managed_node1 15500 1727096238.63564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096238.65106: done with get_vars() 15500 1727096238.65131: done queuing things up, now waiting for results queue to drain 15500 1727096238.65134: results queue empty 15500 1727096238.65134: checking for any_errors_fatal 15500 1727096238.65136: done checking for any_errors_fatal 15500 1727096238.65137: checking for max_fail_percentage 15500 1727096238.65138: done checking for max_fail_percentage 15500 1727096238.65138: checking to see if all hosts have failed and the running result is not ok 15500 1727096238.65139: done checking to see if all hosts have failed 15500 1727096238.65140: getting the remaining hosts for this loop 15500 1727096238.65141: done getting the remaining hosts for this loop 15500 1727096238.65144: getting the next task for host managed_node1 15500 1727096238.65147: done getting next task for host managed_node1 15500 1727096238.65148: ^ task is: None 15500 1727096238.65149: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096238.65150: done queuing things up, now waiting for results queue to drain 15500 1727096238.65151: results queue empty 15500 1727096238.65152: checking for any_errors_fatal 15500 1727096238.65153: done checking for any_errors_fatal 15500 1727096238.65153: checking for max_fail_percentage 15500 1727096238.65154: done checking for max_fail_percentage 15500 1727096238.65155: checking to see if all hosts have failed and the running result is not ok 15500 1727096238.65156: done checking to see if all hosts have failed 15500 1727096238.65156: getting the next task for host managed_node1 15500 1727096238.65158: done getting next task for host managed_node1 15500 1727096238.65159: ^ task is: None 15500 1727096238.65160: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096238.65199: in VariableManager get_vars() 15500 1727096238.65214: done with get_vars() 15500 1727096238.65220: in VariableManager get_vars() 15500 1727096238.65228: done with get_vars() 15500 1727096238.65233: variable 'omit' from source: magic vars 15500 1727096238.65350: variable 'task' from source: play vars 15500 1727096238.65391: in VariableManager get_vars() 15500 1727096238.65402: done with get_vars() 15500 1727096238.65422: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_profile_absent.yml] ************************ 15500 1727096238.65653: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15500 1727096238.65676: getting the remaining hosts for this loop 15500 1727096238.65678: done getting the remaining hosts for this loop 15500 1727096238.65680: getting the next task for host managed_node1 15500 1727096238.65683: done getting next task for host managed_node1 15500 1727096238.65685: ^ task is: TASK: Gathering Facts 15500 1727096238.65686: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096238.65688: getting variables 15500 1727096238.65689: in VariableManager get_vars() 15500 1727096238.65698: Calling all_inventory to load vars for managed_node1 15500 1727096238.65700: Calling groups_inventory to load vars for managed_node1 15500 1727096238.65702: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096238.65707: Calling all_plugins_play to load vars for managed_node1 15500 1727096238.65709: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096238.65712: Calling groups_plugins_play to load vars for managed_node1 15500 1727096238.67098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096238.68804: done with get_vars() 15500 1727096238.68825: done getting variables 15500 1727096238.69092: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Monday 23 September 2024 08:57:18 -0400 (0:00:00.529) 0:00:38.734 ****** 15500 1727096238.69117: entering _queue_task() for managed_node1/gather_facts 15500 1727096238.69455: worker is 1 (out of 1 available) 15500 1727096238.69466: exiting _queue_task() for managed_node1/gather_facts 15500 1727096238.69678: done queuing things up, now waiting for results queue to drain 15500 1727096238.69680: waiting for pending results... 
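The play banner "Run the tasklist tasks/assert_profile_absent.yml" and the task path playbooks/run_tasks.yml:3 above come from a wrapper play that receives the tasklist file name through a 'task' play variable (seen earlier in this run as "variable 'task' from source: play vars"). A rough sketch of what such a wrapper play looks like follows; it is an assumed shape for illustration, not the collection's actual run_tasks.yml.

# Assumed shape of a run_tasks.yml-style wrapper play (hosts and option
# values are illustrative).
- name: Run the tasklist {{ task }}
  hosts: all
  gather_facts: true            # produces the "Gathering Facts" task seen above
  tasks:
    - name: Run the tasklist
      ansible.builtin.include_tasks: "{{ task }}"   # e.g. tasks/assert_profile_absent.yml
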
15500 1727096238.69748: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15500 1727096238.69873: in run() - task 0afff68d-5257-877d-2da0-00000000046e 15500 1727096238.69877: variable 'ansible_search_path' from source: unknown 15500 1727096238.70012: calling self._execute() 15500 1727096238.70015: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096238.70021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096238.70034: variable 'omit' from source: magic vars 15500 1727096238.70398: variable 'ansible_distribution_major_version' from source: facts 15500 1727096238.70413: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096238.70423: variable 'omit' from source: magic vars 15500 1727096238.70459: variable 'omit' from source: magic vars 15500 1727096238.70504: variable 'omit' from source: magic vars 15500 1727096238.70550: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096238.70621: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096238.70670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096238.70677: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096238.70732: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096238.70805: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096238.70901: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096238.70909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096238.71175: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096238.71179: Set connection var ansible_pipelining to False 15500 1727096238.71181: Set connection var ansible_timeout to 10 15500 1727096238.71184: Set connection var ansible_shell_type to sh 15500 1727096238.71185: Set connection var ansible_shell_executable to /bin/sh 15500 1727096238.71187: Set connection var ansible_connection to ssh 15500 1727096238.71189: variable 'ansible_shell_executable' from source: unknown 15500 1727096238.71191: variable 'ansible_connection' from source: unknown 15500 1727096238.71193: variable 'ansible_module_compression' from source: unknown 15500 1727096238.71196: variable 'ansible_shell_type' from source: unknown 15500 1727096238.71198: variable 'ansible_shell_executable' from source: unknown 15500 1727096238.71200: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096238.71202: variable 'ansible_pipelining' from source: unknown 15500 1727096238.71203: variable 'ansible_timeout' from source: unknown 15500 1727096238.71205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096238.71299: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096238.71321: variable 'omit' from source: magic vars 15500 1727096238.71332: starting attempt loop 15500 1727096238.71340: running the 
handler 15500 1727096238.71361: variable 'ansible_facts' from source: unknown 15500 1727096238.71388: _low_level_execute_command(): starting 15500 1727096238.71400: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096238.72098: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096238.72115: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096238.72188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096238.72237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096238.72255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096238.72285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096238.72429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096238.74175: stdout chunk (state=3): >>>/root <<< 15500 1727096238.74336: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096238.74339: stdout chunk (state=3): >>><<< 15500 1727096238.74341: stderr chunk (state=3): >>><<< 15500 1727096238.74459: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096238.74573: _low_level_execute_command(): starting 15500 1727096238.74577: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727096238.7437782-17112-222276539527134 `" && echo ansible-tmp-1727096238.7437782-17112-222276539527134="` echo /root/.ansible/tmp/ansible-tmp-1727096238.7437782-17112-222276539527134 `" ) && sleep 0' 15500 1727096238.75483: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096238.75547: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096238.75559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096238.75740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096238.75783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096238.75966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096238.77900: stdout chunk (state=3): >>>ansible-tmp-1727096238.7437782-17112-222276539527134=/root/.ansible/tmp/ansible-tmp-1727096238.7437782-17112-222276539527134 <<< 15500 1727096238.78060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096238.78064: stdout chunk (state=3): >>><<< 15500 1727096238.78066: stderr chunk (state=3): >>><<< 15500 1727096238.78273: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096238.7437782-17112-222276539527134=/root/.ansible/tmp/ansible-tmp-1727096238.7437782-17112-222276539527134 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096238.78277: variable 'ansible_module_compression' from source: unknown 15500 
1727096238.78279: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15500 1727096238.78281: variable 'ansible_facts' from source: unknown 15500 1727096238.78466: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096238.7437782-17112-222276539527134/AnsiballZ_setup.py 15500 1727096238.78638: Sending initial data 15500 1727096238.78647: Sent initial data (154 bytes) 15500 1727096238.79290: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096238.79394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096238.79420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096238.79437: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096238.79461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096238.79594: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096238.81192: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15500 1727096238.81205: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 15500 1727096238.81214: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 15500 1727096238.81229: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096238.81311: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15500 1727096238.81403: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpm293y7ax /root/.ansible/tmp/ansible-tmp-1727096238.7437782-17112-222276539527134/AnsiballZ_setup.py <<< 15500 1727096238.81414: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096238.7437782-17112-222276539527134/AnsiballZ_setup.py" <<< 15500 1727096238.81471: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpm293y7ax" to remote "/root/.ansible/tmp/ansible-tmp-1727096238.7437782-17112-222276539527134/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096238.7437782-17112-222276539527134/AnsiballZ_setup.py" <<< 15500 1727096238.83480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096238.83484: stdout chunk (state=3): >>><<< 15500 1727096238.83487: stderr chunk (state=3): >>><<< 15500 1727096238.83489: done transferring module to remote 15500 1727096238.83491: _low_level_execute_command(): starting 15500 1727096238.83494: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096238.7437782-17112-222276539527134/ /root/.ansible/tmp/ansible-tmp-1727096238.7437782-17112-222276539527134/AnsiballZ_setup.py && sleep 0' 15500 1727096238.84075: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096238.84089: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096238.84130: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096238.84183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096238.84187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096238.84320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096238.86170: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096238.86174: stdout chunk (state=3): >>><<< 15500 1727096238.86182: stderr chunk (state=3): >>><<< 15500 1727096238.86233: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096238.86346: _low_level_execute_command(): starting 15500 1727096238.86434: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096238.7437782-17112-222276539527134/AnsiballZ_setup.py && sleep 0' 15500 1727096238.87722: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096238.87726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 15500 1727096238.87728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15500 1727096238.87730: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096238.87732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096238.87899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096238.87902: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096238.88017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096239.50765: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", 
"ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "57", "second": "19", "epoch": "1727096239", "epoch_int": "1727096239", "date": "2024-09-23", "time": "08:57:19", "iso8601_micro": "2024-09-23T12:57:19.154734Z", "iso8601": "2024-09-23T12:57:19Z", "iso8601_basic": "20240923T085719154734", "iso8601_basic_short": "20240923T085719", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.46728515625, "5m": 0.33544921875, "15m": 0.1591796875}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "un<<< 15500 1727096239.50801: stdout chunk (state=3): >>>ix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, 
"ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2952, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 579, "free": 2952}, "nocache": {"free": 3289, "used": 242}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 392, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795561472, "block_size": 4096, "block_total": 65519099, "block_available": 63914932, "block_used": 1604167, "inode_total": 131070960, "inode_available": 131029100, "inode_used": 41860, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, 
"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopbac<<< 15500 1727096239.50838: stdout chunk (state=3): >>>k", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", 
"tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15500 1727096239.52785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 15500 1727096239.52809: stderr chunk (state=3): >>><<< 15500 1727096239.52818: stdout chunk (state=3): >>><<< 15500 1727096239.52864: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "57", "second": "19", "epoch": "1727096239", "epoch_int": "1727096239", "date": "2024-09-23", "time": "08:57:19", "iso8601_micro": "2024-09-23T12:57:19.154734Z", "iso8601": "2024-09-23T12:57:19Z", "iso8601_basic": "20240923T085719154734", "iso8601_basic_short": "20240923T085719", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.46728515625, "5m": 0.33544921875, "15m": 0.1591796875}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": 
"/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2952, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 579, "free": 2952}, "nocache": {"free": 3289, "used": 242}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 392, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795561472, "block_size": 4096, "block_total": 65519099, "block_available": 63914932, "block_used": 1604167, "inode_total": 131070960, "inode_available": 131029100, "inode_used": 41860, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_cmdline": 
{"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": 
true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 15500 1727096239.53210: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096238.7437782-17112-222276539527134/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096239.53239: _low_level_execute_command(): starting 15500 1727096239.53249: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096238.7437782-17112-222276539527134/ > /dev/null 2>&1 && sleep 0' 15500 1727096239.53849: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096239.53863: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096239.53879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096239.53900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096239.53915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096239.53925: stderr chunk (state=3): >>>debug2: match not found <<< 15500 1727096239.53937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096239.53954: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15500 1727096239.53966: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 15500 1727096239.54002: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096239.54062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096239.54081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096239.54117: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096239.54298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096239.56183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096239.56197: stderr chunk (state=3): >>><<< 15500 1727096239.56206: stdout chunk (state=3): >>><<< 15500 1727096239.56228: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096239.56241: handler run complete 15500 1727096239.56366: variable 'ansible_facts' from source: unknown 15500 1727096239.56598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096239.56906: variable 'ansible_facts' from source: unknown 15500 1727096239.56995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096239.57272: attempt loop complete, returning result 15500 1727096239.57276: _execute() done 15500 1727096239.57278: dumping result to json 15500 1727096239.57280: done dumping result, returning 15500 1727096239.57282: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-877d-2da0-00000000046e] 15500 1727096239.57284: sending task result for task 0afff68d-5257-877d-2da0-00000000046e 15500 1727096239.57730: done sending task result for task 0afff68d-5257-877d-2da0-00000000046e 15500 1727096239.57734: WORKER PROCESS EXITING ok: [managed_node1] 15500 1727096239.58209: no more pending results, returning what we have 15500 1727096239.58212: results queue empty 15500 1727096239.58213: checking for any_errors_fatal 15500 1727096239.58214: done checking for any_errors_fatal 15500 1727096239.58215: checking for max_fail_percentage 15500 1727096239.58216: done checking for 
max_fail_percentage 15500 1727096239.58217: checking to see if all hosts have failed and the running result is not ok 15500 1727096239.58218: done checking to see if all hosts have failed 15500 1727096239.58218: getting the remaining hosts for this loop 15500 1727096239.58219: done getting the remaining hosts for this loop 15500 1727096239.58223: getting the next task for host managed_node1 15500 1727096239.58227: done getting next task for host managed_node1 15500 1727096239.58229: ^ task is: TASK: meta (flush_handlers) 15500 1727096239.58230: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096239.58234: getting variables 15500 1727096239.58236: in VariableManager get_vars() 15500 1727096239.58255: Calling all_inventory to load vars for managed_node1 15500 1727096239.58257: Calling groups_inventory to load vars for managed_node1 15500 1727096239.58263: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096239.58393: Calling all_plugins_play to load vars for managed_node1 15500 1727096239.58397: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096239.58401: Calling groups_plugins_play to load vars for managed_node1 15500 1727096239.61272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096239.65712: done with get_vars() 15500 1727096239.65746: done getting variables 15500 1727096239.65821: in VariableManager get_vars() 15500 1727096239.65831: Calling all_inventory to load vars for managed_node1 15500 1727096239.65833: Calling groups_inventory to load vars for managed_node1 15500 1727096239.65837: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096239.65841: Calling all_plugins_play to load vars for managed_node1 15500 1727096239.65844: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096239.65846: Calling groups_plugins_play to load vars for managed_node1 15500 1727096239.68949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096239.71995: done with get_vars() 15500 1727096239.72035: done queuing things up, now waiting for results queue to drain 15500 1727096239.72038: results queue empty 15500 1727096239.72038: checking for any_errors_fatal 15500 1727096239.72042: done checking for any_errors_fatal 15500 1727096239.72043: checking for max_fail_percentage 15500 1727096239.72044: done checking for max_fail_percentage 15500 1727096239.72045: checking to see if all hosts have failed and the running result is not ok 15500 1727096239.72045: done checking to see if all hosts have failed 15500 1727096239.72046: getting the remaining hosts for this loop 15500 1727096239.72047: done getting the remaining hosts for this loop 15500 1727096239.72049: getting the next task for host managed_node1 15500 1727096239.72053: done getting next task for host managed_node1 15500 1727096239.72056: ^ task is: TASK: Include the task '{{ task }}' 15500 1727096239.72058: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15500 1727096239.72060: getting variables 15500 1727096239.72061: in VariableManager get_vars() 15500 1727096239.72274: Calling all_inventory to load vars for managed_node1 15500 1727096239.72277: Calling groups_inventory to load vars for managed_node1 15500 1727096239.72280: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096239.72286: Calling all_plugins_play to load vars for managed_node1 15500 1727096239.72289: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096239.72292: Calling groups_plugins_play to load vars for managed_node1 15500 1727096239.74460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096239.78051: done with get_vars() 15500 1727096239.78084: done getting variables 15500 1727096239.78472: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_profile_absent.yml'] ********************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Monday 23 September 2024 08:57:19 -0400 (0:00:01.093) 0:00:39.828 ****** 15500 1727096239.78508: entering _queue_task() for managed_node1/include_tasks 15500 1727096239.79331: worker is 1 (out of 1 available) 15500 1727096239.79344: exiting _queue_task() for managed_node1/include_tasks 15500 1727096239.79469: done queuing things up, now waiting for results queue to drain 15500 1727096239.79471: waiting for pending results... 15500 1727096239.80374: running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_profile_absent.yml' 15500 1727096239.80380: in run() - task 0afff68d-5257-877d-2da0-000000000073 15500 1727096239.80385: variable 'ansible_search_path' from source: unknown 15500 1727096239.80388: calling self._execute() 15500 1727096239.80390: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096239.80393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096239.80564: variable 'omit' from source: magic vars 15500 1727096239.81566: variable 'ansible_distribution_major_version' from source: facts 15500 1727096239.81679: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096239.81691: variable 'task' from source: play vars 15500 1727096239.81770: variable 'task' from source: play vars 15500 1727096239.81785: _execute() done 15500 1727096239.81793: dumping result to json 15500 1727096239.81802: done dumping result, returning 15500 1727096239.81813: done running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_profile_absent.yml' [0afff68d-5257-877d-2da0-000000000073] 15500 1727096239.81823: sending task result for task 0afff68d-5257-877d-2da0-000000000073 15500 1727096239.81964: no more pending results, returning what we have 15500 1727096239.81972: in VariableManager get_vars() 15500 1727096239.82009: Calling all_inventory to load vars for managed_node1 15500 1727096239.82013: Calling groups_inventory to load vars for managed_node1 15500 1727096239.82017: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096239.82031: Calling all_plugins_play to load vars for managed_node1 15500 1727096239.82035: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096239.82039: Calling groups_plugins_play to load vars for managed_node1 15500 1727096239.83074: done sending task result for task 0afff68d-5257-877d-2da0-000000000073 15500 
1727096239.83077: WORKER PROCESS EXITING 15500 1727096239.85116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096239.88057: done with get_vars() 15500 1727096239.88229: variable 'ansible_search_path' from source: unknown 15500 1727096239.88246: we have included files to process 15500 1727096239.88247: generating all_blocks data 15500 1727096239.88249: done generating all_blocks data 15500 1727096239.88250: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 15500 1727096239.88251: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 15500 1727096239.88254: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 15500 1727096239.88419: in VariableManager get_vars() 15500 1727096239.88437: done with get_vars() 15500 1727096239.88549: done processing included file 15500 1727096239.88551: iterating over new_blocks loaded from include file 15500 1727096239.88553: in VariableManager get_vars() 15500 1727096239.88564: done with get_vars() 15500 1727096239.88566: filtering new block on tags 15500 1727096239.88588: done filtering new block on tags 15500 1727096239.88590: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node1 15500 1727096239.88595: extending task lists for all hosts with included blocks 15500 1727096239.88624: done extending task lists 15500 1727096239.88625: done processing included files 15500 1727096239.88626: results queue empty 15500 1727096239.88627: checking for any_errors_fatal 15500 1727096239.88629: done checking for any_errors_fatal 15500 1727096239.88629: checking for max_fail_percentage 15500 1727096239.88630: done checking for max_fail_percentage 15500 1727096239.88631: checking to see if all hosts have failed and the running result is not ok 15500 1727096239.88632: done checking to see if all hosts have failed 15500 1727096239.88633: getting the remaining hosts for this loop 15500 1727096239.88635: done getting the remaining hosts for this loop 15500 1727096239.88637: getting the next task for host managed_node1 15500 1727096239.88641: done getting next task for host managed_node1 15500 1727096239.88643: ^ task is: TASK: Include the task 'get_profile_stat.yml' 15500 1727096239.88645: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096239.88648: getting variables 15500 1727096239.88648: in VariableManager get_vars() 15500 1727096239.88656: Calling all_inventory to load vars for managed_node1 15500 1727096239.88658: Calling groups_inventory to load vars for managed_node1 15500 1727096239.88660: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096239.88666: Calling all_plugins_play to load vars for managed_node1 15500 1727096239.88670: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096239.88673: Calling groups_plugins_play to load vars for managed_node1 15500 1727096239.89888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096239.92441: done with get_vars() 15500 1727096239.92464: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Monday 23 September 2024 08:57:19 -0400 (0:00:00.140) 0:00:39.968 ****** 15500 1727096239.92544: entering _queue_task() for managed_node1/include_tasks 15500 1727096239.93296: worker is 1 (out of 1 available) 15500 1727096239.93309: exiting _queue_task() for managed_node1/include_tasks 15500 1727096239.93323: done queuing things up, now waiting for results queue to drain 15500 1727096239.93324: waiting for pending results... 15500 1727096239.93744: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 15500 1727096239.93953: in run() - task 0afff68d-5257-877d-2da0-00000000047f 15500 1727096239.93972: variable 'ansible_search_path' from source: unknown 15500 1727096239.93976: variable 'ansible_search_path' from source: unknown 15500 1727096239.94088: calling self._execute() 15500 1727096239.94289: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096239.94295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096239.94306: variable 'omit' from source: magic vars 15500 1727096239.95100: variable 'ansible_distribution_major_version' from source: facts 15500 1727096239.95111: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096239.95117: _execute() done 15500 1727096239.95121: dumping result to json 15500 1727096239.95123: done dumping result, returning 15500 1727096239.95130: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0afff68d-5257-877d-2da0-00000000047f] 15500 1727096239.95135: sending task result for task 0afff68d-5257-877d-2da0-00000000047f 15500 1727096239.95229: done sending task result for task 0afff68d-5257-877d-2da0-00000000047f 15500 1727096239.95233: WORKER PROCESS EXITING 15500 1727096239.95262: no more pending results, returning what we have 15500 1727096239.95271: in VariableManager get_vars() 15500 1727096239.95308: Calling all_inventory to load vars for managed_node1 15500 1727096239.95311: Calling groups_inventory to load vars for managed_node1 15500 1727096239.95315: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096239.95329: Calling all_plugins_play to load vars for managed_node1 15500 1727096239.95332: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096239.95334: Calling groups_plugins_play to load vars for managed_node1 15500 1727096239.97118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 15500 1727096239.99335: done with get_vars() 15500 1727096239.99355: variable 'ansible_search_path' from source: unknown 15500 1727096239.99357: variable 'ansible_search_path' from source: unknown 15500 1727096239.99365: variable 'task' from source: play vars 15500 1727096239.99870: variable 'task' from source: play vars 15500 1727096239.99903: we have included files to process 15500 1727096239.99905: generating all_blocks data 15500 1727096239.99907: done generating all_blocks data 15500 1727096239.99908: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15500 1727096239.99909: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15500 1727096239.99912: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15500 1727096240.01590: done processing included file 15500 1727096240.01593: iterating over new_blocks loaded from include file 15500 1727096240.01595: in VariableManager get_vars() 15500 1727096240.01609: done with get_vars() 15500 1727096240.01611: filtering new block on tags 15500 1727096240.01633: done filtering new block on tags 15500 1727096240.01636: in VariableManager get_vars() 15500 1727096240.01647: done with get_vars() 15500 1727096240.01648: filtering new block on tags 15500 1727096240.01670: done filtering new block on tags 15500 1727096240.01672: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 15500 1727096240.01677: extending task lists for all hosts with included blocks 15500 1727096240.01975: done extending task lists 15500 1727096240.01976: done processing included files 15500 1727096240.01977: results queue empty 15500 1727096240.01978: checking for any_errors_fatal 15500 1727096240.01981: done checking for any_errors_fatal 15500 1727096240.01982: checking for max_fail_percentage 15500 1727096240.01983: done checking for max_fail_percentage 15500 1727096240.01984: checking to see if all hosts have failed and the running result is not ok 15500 1727096240.01985: done checking to see if all hosts have failed 15500 1727096240.01985: getting the remaining hosts for this loop 15500 1727096240.01986: done getting the remaining hosts for this loop 15500 1727096240.01989: getting the next task for host managed_node1 15500 1727096240.01993: done getting next task for host managed_node1 15500 1727096240.01995: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 15500 1727096240.01998: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096240.02000: getting variables 15500 1727096240.02001: in VariableManager get_vars() 15500 1727096240.02010: Calling all_inventory to load vars for managed_node1 15500 1727096240.02012: Calling groups_inventory to load vars for managed_node1 15500 1727096240.02014: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096240.02019: Calling all_plugins_play to load vars for managed_node1 15500 1727096240.02022: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096240.02024: Calling groups_plugins_play to load vars for managed_node1 15500 1727096240.04456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096240.11057: done with get_vars() 15500 1727096240.11087: done getting variables 15500 1727096240.11131: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Monday 23 September 2024 08:57:20 -0400 (0:00:00.186) 0:00:40.154 ****** 15500 1727096240.11157: entering _queue_task() for managed_node1/set_fact 15500 1727096240.11907: worker is 1 (out of 1 available) 15500 1727096240.11919: exiting _queue_task() for managed_node1/set_fact 15500 1727096240.11932: done queuing things up, now waiting for results queue to drain 15500 1727096240.11933: waiting for pending results... 
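The TASK banner above belongs to a set_fact action from get_profile_stat.yml. Judging from the ansible_facts echoed in the result further below, the task presumably looks roughly like the following sketch (reconstructed from the logged return values, not copied from the collection's source file):

    # Sketch only: inferred from the "ok: [managed_node1]" result printed below.
    - name: Initialize NM profile exist and ansible_managed comment flag
      ansible.builtin.set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false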
15500 1727096240.12545: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 15500 1727096240.12838: in run() - task 0afff68d-5257-877d-2da0-00000000048a 15500 1727096240.12842: variable 'ansible_search_path' from source: unknown 15500 1727096240.12845: variable 'ansible_search_path' from source: unknown 15500 1727096240.12866: calling self._execute() 15500 1727096240.13085: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096240.13099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096240.13114: variable 'omit' from source: magic vars 15500 1727096240.14673: variable 'ansible_distribution_major_version' from source: facts 15500 1727096240.14677: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096240.14680: variable 'omit' from source: magic vars 15500 1727096240.14683: variable 'omit' from source: magic vars 15500 1727096240.14685: variable 'omit' from source: magic vars 15500 1727096240.14687: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096240.14798: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096240.14825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096240.14889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096240.14902: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096240.14931: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096240.15173: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096240.15176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096240.15181: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096240.15191: Set connection var ansible_pipelining to False 15500 1727096240.15201: Set connection var ansible_timeout to 10 15500 1727096240.15207: Set connection var ansible_shell_type to sh 15500 1727096240.15215: Set connection var ansible_shell_executable to /bin/sh 15500 1727096240.15281: Set connection var ansible_connection to ssh 15500 1727096240.15304: variable 'ansible_shell_executable' from source: unknown 15500 1727096240.15310: variable 'ansible_connection' from source: unknown 15500 1727096240.15317: variable 'ansible_module_compression' from source: unknown 15500 1727096240.15322: variable 'ansible_shell_type' from source: unknown 15500 1727096240.15328: variable 'ansible_shell_executable' from source: unknown 15500 1727096240.15333: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096240.15339: variable 'ansible_pipelining' from source: unknown 15500 1727096240.15344: variable 'ansible_timeout' from source: unknown 15500 1727096240.15351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096240.15606: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096240.15972: variable 
'omit' from source: magic vars 15500 1727096240.15976: starting attempt loop 15500 1727096240.15979: running the handler 15500 1727096240.15981: handler run complete 15500 1727096240.15983: attempt loop complete, returning result 15500 1727096240.15985: _execute() done 15500 1727096240.15987: dumping result to json 15500 1727096240.15989: done dumping result, returning 15500 1727096240.15991: done running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0afff68d-5257-877d-2da0-00000000048a] 15500 1727096240.15993: sending task result for task 0afff68d-5257-877d-2da0-00000000048a 15500 1727096240.16069: done sending task result for task 0afff68d-5257-877d-2da0-00000000048a 15500 1727096240.16073: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 15500 1727096240.16128: no more pending results, returning what we have 15500 1727096240.16132: results queue empty 15500 1727096240.16132: checking for any_errors_fatal 15500 1727096240.16134: done checking for any_errors_fatal 15500 1727096240.16135: checking for max_fail_percentage 15500 1727096240.16137: done checking for max_fail_percentage 15500 1727096240.16138: checking to see if all hosts have failed and the running result is not ok 15500 1727096240.16139: done checking to see if all hosts have failed 15500 1727096240.16139: getting the remaining hosts for this loop 15500 1727096240.16141: done getting the remaining hosts for this loop 15500 1727096240.16144: getting the next task for host managed_node1 15500 1727096240.16152: done getting next task for host managed_node1 15500 1727096240.16155: ^ task is: TASK: Stat profile file 15500 1727096240.16159: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096240.16163: getting variables 15500 1727096240.16165: in VariableManager get_vars() 15500 1727096240.16196: Calling all_inventory to load vars for managed_node1 15500 1727096240.16199: Calling groups_inventory to load vars for managed_node1 15500 1727096240.16203: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096240.16216: Calling all_plugins_play to load vars for managed_node1 15500 1727096240.16219: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096240.16222: Calling groups_plugins_play to load vars for managed_node1 15500 1727096240.18811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096240.21572: done with get_vars() 15500 1727096240.21607: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Monday 23 September 2024 08:57:20 -0400 (0:00:00.105) 0:00:40.260 ****** 15500 1727096240.21711: entering _queue_task() for managed_node1/stat 15500 1727096240.22080: worker is 1 (out of 1 available) 15500 1727096240.22093: exiting _queue_task() for managed_node1/stat 15500 1727096240.22109: done queuing things up, now waiting for results queue to drain 15500 1727096240.22110: waiting for pending results... 15500 1727096240.22399: running TaskExecutor() for managed_node1/TASK: Stat profile file 15500 1727096240.22539: in run() - task 0afff68d-5257-877d-2da0-00000000048b 15500 1727096240.22556: variable 'ansible_search_path' from source: unknown 15500 1727096240.22563: variable 'ansible_search_path' from source: unknown 15500 1727096240.22611: calling self._execute() 15500 1727096240.22714: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096240.22724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096240.22738: variable 'omit' from source: magic vars 15500 1727096240.23132: variable 'ansible_distribution_major_version' from source: facts 15500 1727096240.23149: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096240.23160: variable 'omit' from source: magic vars 15500 1727096240.23212: variable 'omit' from source: magic vars 15500 1727096240.23317: variable 'profile' from source: play vars 15500 1727096240.23327: variable 'interface' from source: set_fact 15500 1727096240.23400: variable 'interface' from source: set_fact 15500 1727096240.23424: variable 'omit' from source: magic vars 15500 1727096240.23477: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096240.23518: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096240.23544: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096240.23577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096240.23593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096240.23675: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096240.23678: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096240.23682: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096240.23753: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096240.23764: Set connection var ansible_pipelining to False 15500 1727096240.23784: Set connection var ansible_timeout to 10 15500 1727096240.23793: Set connection var ansible_shell_type to sh 15500 1727096240.23804: Set connection var ansible_shell_executable to /bin/sh 15500 1727096240.23814: Set connection var ansible_connection to ssh 15500 1727096240.23891: variable 'ansible_shell_executable' from source: unknown 15500 1727096240.23895: variable 'ansible_connection' from source: unknown 15500 1727096240.23897: variable 'ansible_module_compression' from source: unknown 15500 1727096240.23900: variable 'ansible_shell_type' from source: unknown 15500 1727096240.23902: variable 'ansible_shell_executable' from source: unknown 15500 1727096240.23905: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096240.23907: variable 'ansible_pipelining' from source: unknown 15500 1727096240.23909: variable 'ansible_timeout' from source: unknown 15500 1727096240.23911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096240.24095: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15500 1727096240.24117: variable 'omit' from source: magic vars 15500 1727096240.24129: starting attempt loop 15500 1727096240.24136: running the handler 15500 1727096240.24217: _low_level_execute_command(): starting 15500 1727096240.24221: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096240.24989: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096240.25390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096240.25776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096240.27501: stdout chunk (state=3): >>>/root <<< 15500 1727096240.27757: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096240.27760: stdout chunk (state=3): >>><<< 15500 1727096240.27763: stderr chunk (state=3): >>><<< 15500 1727096240.27766: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096240.27772: _low_level_execute_command(): starting 15500 1727096240.27777: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096240.2766504-17155-256926345431738 `" && echo ansible-tmp-1727096240.2766504-17155-256926345431738="` echo /root/.ansible/tmp/ansible-tmp-1727096240.2766504-17155-256926345431738 `" ) && sleep 0' 15500 1727096240.28969: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096240.29200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096240.29277: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096240.31298: stdout chunk (state=3): >>>ansible-tmp-1727096240.2766504-17155-256926345431738=/root/.ansible/tmp/ansible-tmp-1727096240.2766504-17155-256926345431738 <<< 15500 1727096240.31422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096240.31452: stderr chunk (state=3): >>><<< 15500 1727096240.31455: stdout chunk (state=3): >>><<< 15500 1727096240.31489: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096240.2766504-17155-256926345431738=/root/.ansible/tmp/ansible-tmp-1727096240.2766504-17155-256926345431738 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096240.31544: variable 'ansible_module_compression' from source: unknown 15500 1727096240.31611: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15500 1727096240.31772: variable 'ansible_facts' from source: unknown 15500 1727096240.31975: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096240.2766504-17155-256926345431738/AnsiballZ_stat.py 15500 1727096240.32197: Sending initial data 15500 1727096240.32201: Sent initial data (153 bytes) 15500 1727096240.33302: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096240.33317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15500 1727096240.33329: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096240.33673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096240.33723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096240.35469: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" 
revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096240.35484: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15500 1727096240.35751: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpgolz691g /root/.ansible/tmp/ansible-tmp-1727096240.2766504-17155-256926345431738/AnsiballZ_stat.py <<< 15500 1727096240.35754: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096240.2766504-17155-256926345431738/AnsiballZ_stat.py" <<< 15500 1727096240.35795: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpgolz691g" to remote "/root/.ansible/tmp/ansible-tmp-1727096240.2766504-17155-256926345431738/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096240.2766504-17155-256926345431738/AnsiballZ_stat.py" <<< 15500 1727096240.36945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096240.36999: stderr chunk (state=3): >>><<< 15500 1727096240.37008: stdout chunk (state=3): >>><<< 15500 1727096240.37057: done transferring module to remote 15500 1727096240.37235: _low_level_execute_command(): starting 15500 1727096240.37239: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096240.2766504-17155-256926345431738/ /root/.ansible/tmp/ansible-tmp-1727096240.2766504-17155-256926345431738/AnsiballZ_stat.py && sleep 0' 15500 1727096240.38440: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096240.38453: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096240.38465: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096240.38557: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096240.40644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096240.40648: stdout chunk (state=3): >>><<< 15500 1727096240.40772: stderr chunk (state=3): >>><<< 15500 1727096240.40776: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096240.40783: _low_level_execute_command(): starting 15500 1727096240.40786: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096240.2766504-17155-256926345431738/AnsiballZ_stat.py && sleep 0' 15500 1727096240.41799: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096240.41980: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096240.42026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096240.42126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096240.42223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096240.57861: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15500 1727096240.59308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 15500 1727096240.59320: stdout chunk (state=3): >>><<< 15500 1727096240.59334: stderr chunk (state=3): >>><<< 15500 1727096240.59357: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 15500 1727096240.59685: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096240.2766504-17155-256926345431738/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096240.59689: _low_level_execute_command(): starting 15500 1727096240.59692: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096240.2766504-17155-256926345431738/ > /dev/null 2>&1 && sleep 0' 15500 1727096240.60879: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096240.60993: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096240.61014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096240.61132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096240.61240: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096240.61257: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096240.61371: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096240.63355: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096240.63370: stdout chunk (state=3): >>><<< 15500 1727096240.63384: stderr chunk (state=3): >>><<< 15500 1727096240.63406: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096240.63775: handler run complete 15500 1727096240.63779: attempt loop complete, returning result 15500 1727096240.63782: _execute() done 15500 1727096240.63785: dumping result to json 15500 1727096240.63787: done dumping result, returning 15500 1727096240.63794: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0afff68d-5257-877d-2da0-00000000048b] 15500 1727096240.63796: sending task result for task 0afff68d-5257-877d-2da0-00000000048b 15500 1727096240.63871: done sending task result for task 0afff68d-5257-877d-2da0-00000000048b 15500 1727096240.63875: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 15500 1727096240.63950: no more pending results, returning what we have 15500 1727096240.63955: results queue empty 15500 1727096240.63956: checking for any_errors_fatal 15500 1727096240.63966: done checking for any_errors_fatal 15500 1727096240.63969: checking for max_fail_percentage 15500 1727096240.63971: done checking for max_fail_percentage 15500 1727096240.63972: checking to see if all hosts have failed and the running result is not ok 15500 1727096240.63973: done checking to see if all hosts have failed 15500 1727096240.63974: getting the remaining hosts for this loop 15500 1727096240.63976: done getting the remaining hosts for this loop 15500 
1727096240.63980: getting the next task for host managed_node1 15500 1727096240.63990: done getting next task for host managed_node1 15500 1727096240.63993: ^ task is: TASK: Set NM profile exist flag based on the profile files 15500 1727096240.63997: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096240.64001: getting variables 15500 1727096240.64003: in VariableManager get_vars() 15500 1727096240.64035: Calling all_inventory to load vars for managed_node1 15500 1727096240.64038: Calling groups_inventory to load vars for managed_node1 15500 1727096240.64042: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096240.64176: Calling all_plugins_play to load vars for managed_node1 15500 1727096240.64180: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096240.64184: Calling groups_plugins_play to load vars for managed_node1 15500 1727096240.67464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096240.71408: done with get_vars() 15500 1727096240.71441: done getting variables 15500 1727096240.71707: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Monday 23 September 2024 08:57:20 -0400 (0:00:00.500) 0:00:40.760 ****** 15500 1727096240.71740: entering _queue_task() for managed_node1/set_fact 15500 1727096240.72509: worker is 1 (out of 1 available) 15500 1727096240.72522: exiting _queue_task() for managed_node1/set_fact 15500 1727096240.72534: done queuing things up, now waiting for results queue to drain 15500 1727096240.72536: waiting for pending results... 
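
The "Stat profile file" result above (ok, with stat.exists == false) comes from a stat module call whose arguments are echoed in the AnsiballZ output: path /etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31 with get_attributes, get_checksum and get_mime all disabled. A minimal sketch of what the task at get_profile_stat.yml:9 likely looks like follows; the register name profile_stat is inferred from the later "variable 'profile_stat' from source: set_fact" lookup, and the {{ profile }} templating is an assumption based on the 'profile'/'interface' variables resolved just before the call, not a verbatim copy of the role file.

    # Hedged reconstruction of get_profile_stat.yml:9 ("Stat profile file").
    # The path and disabled stat options match the module_args echoed in the log;
    # the register name and the {{ profile }} templating are inferred, not quoted.
    - name: Stat profile file
      ansible.builtin.stat:
        path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
        get_attributes: false   # existence check only, so skip extra metadata lookups
        get_checksum: false
        get_mime: false
      register: profile_stat    # consumed by the conditionals of the following tasks

Because profile_stat.stat.exists is false for LSR-TST-br31, the "Set NM profile exist flag based on the profile files" task queued above is skipped a few entries further down with "Conditional result was False".
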
15500 1727096240.72754: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 15500 1727096240.73374: in run() - task 0afff68d-5257-877d-2da0-00000000048c 15500 1727096240.73378: variable 'ansible_search_path' from source: unknown 15500 1727096240.73380: variable 'ansible_search_path' from source: unknown 15500 1727096240.73383: calling self._execute() 15500 1727096240.73385: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096240.73388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096240.73390: variable 'omit' from source: magic vars 15500 1727096240.74108: variable 'ansible_distribution_major_version' from source: facts 15500 1727096240.74128: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096240.74502: variable 'profile_stat' from source: set_fact 15500 1727096240.74674: Evaluated conditional (profile_stat.stat.exists): False 15500 1727096240.74678: when evaluation is False, skipping this task 15500 1727096240.74680: _execute() done 15500 1727096240.74683: dumping result to json 15500 1727096240.74685: done dumping result, returning 15500 1727096240.74688: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0afff68d-5257-877d-2da0-00000000048c] 15500 1727096240.74690: sending task result for task 0afff68d-5257-877d-2da0-00000000048c 15500 1727096240.74761: done sending task result for task 0afff68d-5257-877d-2da0-00000000048c 15500 1727096240.74765: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15500 1727096240.74813: no more pending results, returning what we have 15500 1727096240.74818: results queue empty 15500 1727096240.74818: checking for any_errors_fatal 15500 1727096240.74830: done checking for any_errors_fatal 15500 1727096240.74831: checking for max_fail_percentage 15500 1727096240.74832: done checking for max_fail_percentage 15500 1727096240.74834: checking to see if all hosts have failed and the running result is not ok 15500 1727096240.74834: done checking to see if all hosts have failed 15500 1727096240.74835: getting the remaining hosts for this loop 15500 1727096240.74837: done getting the remaining hosts for this loop 15500 1727096240.74840: getting the next task for host managed_node1 15500 1727096240.74849: done getting next task for host managed_node1 15500 1727096240.74852: ^ task is: TASK: Get NM profile info 15500 1727096240.74856: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096240.74863: getting variables 15500 1727096240.74865: in VariableManager get_vars() 15500 1727096240.74900: Calling all_inventory to load vars for managed_node1 15500 1727096240.74903: Calling groups_inventory to load vars for managed_node1 15500 1727096240.74907: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096240.74919: Calling all_plugins_play to load vars for managed_node1 15500 1727096240.74922: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096240.74925: Calling groups_plugins_play to load vars for managed_node1 15500 1727096240.78369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096240.81860: done with get_vars() 15500 1727096240.82254: done getting variables 15500 1727096240.82326: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Monday 23 September 2024 08:57:20 -0400 (0:00:00.106) 0:00:40.866 ****** 15500 1727096240.82364: entering _queue_task() for managed_node1/shell 15500 1727096240.83134: worker is 1 (out of 1 available) 15500 1727096240.83148: exiting _queue_task() for managed_node1/shell 15500 1727096240.83164: done queuing things up, now waiting for results queue to drain 15500 1727096240.83165: waiting for pending results... 15500 1727096240.83990: running TaskExecutor() for managed_node1/TASK: Get NM profile info 15500 1727096240.84145: in run() - task 0afff68d-5257-877d-2da0-00000000048d 15500 1727096240.84322: variable 'ansible_search_path' from source: unknown 15500 1727096240.84520: variable 'ansible_search_path' from source: unknown 15500 1727096240.84530: calling self._execute() 15500 1727096240.84860: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096240.84885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096240.84901: variable 'omit' from source: magic vars 15500 1727096240.85877: variable 'ansible_distribution_major_version' from source: facts 15500 1727096240.85956: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096240.85971: variable 'omit' from source: magic vars 15500 1727096240.86074: variable 'omit' from source: magic vars 15500 1727096240.86474: variable 'profile' from source: play vars 15500 1727096240.86480: variable 'interface' from source: set_fact 15500 1727096240.86483: variable 'interface' from source: set_fact 15500 1727096240.86485: variable 'omit' from source: magic vars 15500 1727096240.86620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096240.86819: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096240.86823: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096240.86825: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096240.86826: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096240.86937: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096240.86974: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096240.86977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096240.87276: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096240.87280: Set connection var ansible_pipelining to False 15500 1727096240.87283: Set connection var ansible_timeout to 10 15500 1727096240.87285: Set connection var ansible_shell_type to sh 15500 1727096240.87287: Set connection var ansible_shell_executable to /bin/sh 15500 1727096240.87290: Set connection var ansible_connection to ssh 15500 1727096240.87293: variable 'ansible_shell_executable' from source: unknown 15500 1727096240.87296: variable 'ansible_connection' from source: unknown 15500 1727096240.87299: variable 'ansible_module_compression' from source: unknown 15500 1727096240.87302: variable 'ansible_shell_type' from source: unknown 15500 1727096240.87312: variable 'ansible_shell_executable' from source: unknown 15500 1727096240.87319: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096240.87327: variable 'ansible_pipelining' from source: unknown 15500 1727096240.87382: variable 'ansible_timeout' from source: unknown 15500 1727096240.87397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096240.87680: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096240.87696: variable 'omit' from source: magic vars 15500 1727096240.87724: starting attempt loop 15500 1727096240.87731: running the handler 15500 1727096240.87827: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096240.87831: _low_level_execute_command(): starting 15500 1727096240.87833: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096240.89725: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096240.89948: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096240.90258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096240.91848: stdout chunk (state=3): >>>/root <<< 15500 1727096240.92060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096240.92072: stdout chunk (state=3): >>><<< 15500 1727096240.92082: stderr chunk (state=3): >>><<< 15500 1727096240.92111: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096240.92122: _low_level_execute_command(): starting 15500 1727096240.92129: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096240.9210742-17176-259090247965211 `" && echo ansible-tmp-1727096240.9210742-17176-259090247965211="` echo /root/.ansible/tmp/ansible-tmp-1727096240.9210742-17176-259090247965211 `" ) && sleep 0' 15500 1727096240.93389: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096240.93423: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096240.93436: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096240.93504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 
1727096240.93640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096240.95555: stdout chunk (state=3): >>>ansible-tmp-1727096240.9210742-17176-259090247965211=/root/.ansible/tmp/ansible-tmp-1727096240.9210742-17176-259090247965211 <<< 15500 1727096240.95755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096240.95759: stdout chunk (state=3): >>><<< 15500 1727096240.95810: stderr chunk (state=3): >>><<< 15500 1727096240.95815: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096240.9210742-17176-259090247965211=/root/.ansible/tmp/ansible-tmp-1727096240.9210742-17176-259090247965211 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096240.95825: variable 'ansible_module_compression' from source: unknown 15500 1727096240.96103: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15500 1727096240.96141: variable 'ansible_facts' from source: unknown 15500 1727096240.96248: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096240.9210742-17176-259090247965211/AnsiballZ_command.py 15500 1727096240.96680: Sending initial data 15500 1727096240.96684: Sent initial data (156 bytes) 15500 1727096240.97823: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096240.97833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096240.97849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096240.97982: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 15500 1727096240.98022: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096240.98153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096240.98200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096240.98349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096241.00076: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15500 1727096241.00106: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096241.00231: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15500 1727096241.00335: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpl17n3vbl /root/.ansible/tmp/ansible-tmp-1727096240.9210742-17176-259090247965211/AnsiballZ_command.py <<< 15500 1727096241.00338: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096240.9210742-17176-259090247965211/AnsiballZ_command.py" <<< 15500 1727096241.00417: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpl17n3vbl" to remote "/root/.ansible/tmp/ansible-tmp-1727096240.9210742-17176-259090247965211/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096240.9210742-17176-259090247965211/AnsiballZ_command.py" <<< 15500 1727096241.01229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096241.01473: stderr chunk (state=3): >>><<< 15500 1727096241.01477: stdout chunk (state=3): >>><<< 15500 1727096241.01480: done transferring module to remote 15500 1727096241.01482: _low_level_execute_command(): starting 15500 1727096241.01485: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096240.9210742-17176-259090247965211/ /root/.ansible/tmp/ansible-tmp-1727096240.9210742-17176-259090247965211/AnsiballZ_command.py && sleep 0' 15500 1727096241.02020: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096241.02040: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096241.02087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096241.02166: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096241.02321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096241.04290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096241.04294: stdout chunk (state=3): >>><<< 15500 1727096241.04299: stderr chunk (state=3): >>><<< 15500 1727096241.04320: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096241.04323: _low_level_execute_command(): starting 15500 1727096241.04329: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096240.9210742-17176-259090247965211/AnsiballZ_command.py && sleep 0' 15500 1727096241.04933: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096241.04972: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096241.04982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096241.05012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096241.05025: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096241.05050: stderr chunk (state=3): >>>debug2: match not found <<< 15500 1727096241.05063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096241.05079: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15500 1727096241.05088: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 15500 1727096241.05095: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15500 1727096241.05103: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096241.05121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096241.05246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096241.05515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096241.05699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096241.23251: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-23 08:57:21.214888", "end": "2024-09-23 08:57:21.231016", "delta": "0:00:00.016128", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15500 1727096241.25019: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.125 closed. <<< 15500 1727096241.25024: stdout chunk (state=3): >>><<< 15500 1727096241.25026: stderr chunk (state=3): >>><<< 15500 1727096241.25029: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-23 08:57:21.214888", "end": "2024-09-23 08:57:21.231016", "delta": "0:00:00.016128", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 1 Shared connection to 10.31.11.125 closed. 15500 1727096241.25032: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096240.9210742-17176-259090247965211/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096241.25035: _low_level_execute_command(): starting 15500 1727096241.25037: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096240.9210742-17176-259090247965211/ > /dev/null 2>&1 && sleep 0' 15500 1727096241.25898: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096241.25902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096241.25910: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096241.25912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096241.25971: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096241.25975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096241.25997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096241.26099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096241.28165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096241.28182: stdout chunk (state=3): >>><<< 15500 1727096241.28194: stderr chunk (state=3): >>><<< 15500 1727096241.28575: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096241.28579: handler run complete 15500 1727096241.28581: Evaluated conditional (False): False 15500 1727096241.28583: attempt loop complete, returning result 15500 1727096241.28585: _execute() done 15500 1727096241.28587: dumping result to json 15500 1727096241.28588: done dumping result, returning 15500 1727096241.28590: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [0afff68d-5257-877d-2da0-00000000048d] 15500 1727096241.28592: sending task result for task 0afff68d-5257-877d-2da0-00000000048d fatal: [managed_node1]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "delta": "0:00:00.016128", "end": "2024-09-23 08:57:21.231016", "rc": 1, "start": "2024-09-23 08:57:21.214888" } MSG: non-zero return code ...ignoring 15500 1727096241.28744: no more pending results, returning what we have 15500 1727096241.28748: results queue empty 15500 1727096241.28748: checking for any_errors_fatal 15500 1727096241.28755: done checking for any_errors_fatal 15500 1727096241.28756: checking for max_fail_percentage 15500 1727096241.28764: done checking for max_fail_percentage 15500 1727096241.28766: checking to see if all hosts have failed and the running result is not ok 15500 1727096241.28769: done checking to see if all hosts have failed 15500 1727096241.28969: getting the remaining hosts for this loop 15500 1727096241.28973: done getting the remaining hosts for this loop 15500 1727096241.28977: getting the next task for host managed_node1 15500 1727096241.28984: done getting next task for host managed_node1 15500 1727096241.28986: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15500 1727096241.28990: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096241.28994: getting variables 15500 1727096241.28996: in VariableManager get_vars() 15500 1727096241.29027: Calling all_inventory to load vars for managed_node1 15500 1727096241.29030: Calling groups_inventory to load vars for managed_node1 15500 1727096241.29033: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096241.29046: Calling all_plugins_play to load vars for managed_node1 15500 1727096241.29048: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096241.29051: Calling groups_plugins_play to load vars for managed_node1 15500 1727096241.29779: done sending task result for task 0afff68d-5257-877d-2da0-00000000048d 15500 1727096241.29784: WORKER PROCESS EXITING 15500 1727096241.31420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096241.33415: done with get_vars() 15500 1727096241.33453: done getting variables 15500 1727096241.33627: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Monday 23 September 2024 08:57:21 -0400 (0:00:00.512) 0:00:41.379 ****** 15500 1727096241.33664: entering _queue_task() for managed_node1/set_fact 15500 1727096241.34541: worker is 1 (out of 1 available) 15500 1727096241.34555: exiting _queue_task() for managed_node1/set_fact 15500 1727096241.34573: done queuing things up, now waiting for results queue to drain 15500 1727096241.34577: waiting for pending results... 
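The FAILED/ignoring result above comes from the "Get NM profile info" task: a shell pipeline (nmcli -f NAME,FILENAME connection show | grep LSR-TST-br31 | grep /etc) whose non-zero exit code is deliberately tolerated so a later conditional can inspect it. Below is a minimal sketch of such a task, reconstructed from the log rather than quoted from tasks/get_profile_stat.yml; the register name nm_profile_exists is taken from the conditional evaluated in the next task, and ignore_errors is inferred from the "...ignoring" marker.

```yaml
# Hypothetical reconstruction of the "Get NM profile info" task implied by the log above.
- name: Get NM profile info
  ansible.builtin.shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
  register: nm_profile_exists   # rc is 1 in this run because the profile (LSR-TST-br31) is absent
  ignore_errors: true           # matches the "...ignoring" marker in the result above
```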
15500 1727096241.34803: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15500 1727096241.34887: in run() - task 0afff68d-5257-877d-2da0-00000000048e 15500 1727096241.34899: variable 'ansible_search_path' from source: unknown 15500 1727096241.34902: variable 'ansible_search_path' from source: unknown 15500 1727096241.34935: calling self._execute() 15500 1727096241.35011: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096241.35015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096241.35026: variable 'omit' from source: magic vars 15500 1727096241.35356: variable 'ansible_distribution_major_version' from source: facts 15500 1727096241.35363: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096241.35415: variable 'nm_profile_exists' from source: set_fact 15500 1727096241.35429: Evaluated conditional (nm_profile_exists.rc == 0): False 15500 1727096241.35433: when evaluation is False, skipping this task 15500 1727096241.35436: _execute() done 15500 1727096241.35439: dumping result to json 15500 1727096241.35441: done dumping result, returning 15500 1727096241.35448: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0afff68d-5257-877d-2da0-00000000048e] 15500 1727096241.35453: sending task result for task 0afff68d-5257-877d-2da0-00000000048e 15500 1727096241.35541: done sending task result for task 0afff68d-5257-877d-2da0-00000000048e 15500 1727096241.35544: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 15500 1727096241.35611: no more pending results, returning what we have 15500 1727096241.35615: results queue empty 15500 1727096241.35616: checking for any_errors_fatal 15500 1727096241.35626: done checking for any_errors_fatal 15500 1727096241.35626: checking for max_fail_percentage 15500 1727096241.35628: done checking for max_fail_percentage 15500 1727096241.35629: checking to see if all hosts have failed and the running result is not ok 15500 1727096241.35630: done checking to see if all hosts have failed 15500 1727096241.35631: getting the remaining hosts for this loop 15500 1727096241.35632: done getting the remaining hosts for this loop 15500 1727096241.35636: getting the next task for host managed_node1 15500 1727096241.35646: done getting next task for host managed_node1 15500 1727096241.35648: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 15500 1727096241.35652: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15500 1727096241.35660: getting variables 15500 1727096241.35662: in VariableManager get_vars() 15500 1727096241.35691: Calling all_inventory to load vars for managed_node1 15500 1727096241.35694: Calling groups_inventory to load vars for managed_node1 15500 1727096241.35696: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096241.35706: Calling all_plugins_play to load vars for managed_node1 15500 1727096241.35709: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096241.35711: Calling groups_plugins_play to load vars for managed_node1 15500 1727096241.36913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096241.39889: done with get_vars() 15500 1727096241.39934: done getting variables 15500 1727096241.39998: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15500 1727096241.40102: variable 'profile' from source: play vars 15500 1727096241.40105: variable 'interface' from source: set_fact 15500 1727096241.40170: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-LSR-TST-br31] ******************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Monday 23 September 2024 08:57:21 -0400 (0:00:00.065) 0:00:41.445 ****** 15500 1727096241.40195: entering _queue_task() for managed_node1/command 15500 1727096241.40610: worker is 1 (out of 1 available) 15500 1727096241.40626: exiting _queue_task() for managed_node1/command 15500 1727096241.40639: done queuing things up, now waiting for results queue to drain 15500 1727096241.40640: waiting for pending results... 
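The "Set NM profile exist flag and ansible_managed flag true based on the nmcli output" task above was skipped because its when guard, nm_profile_exists.rc == 0, is False for an absent profile. A hedged sketch of what such a guarded set_fact plausibly looks like: the flag name lsr_net_profile_exists is grounded in the assertion evaluated later in this run, while lsr_net_profile_ansible_managed is purely illustrative.

```yaml
# Sketch only; the real task lives at tasks/get_profile_stat.yml:35 and may differ.
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true            # consumed later by the absence assertion
    lsr_net_profile_ansible_managed: true   # illustrative name, not confirmed by the log
  when: nm_profile_exists.rc == 0           # False in this run, so the task is skipped
```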
15500 1727096241.40875: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 15500 1727096241.41037: in run() - task 0afff68d-5257-877d-2da0-000000000490 15500 1727096241.41041: variable 'ansible_search_path' from source: unknown 15500 1727096241.41044: variable 'ansible_search_path' from source: unknown 15500 1727096241.41058: calling self._execute() 15500 1727096241.41159: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096241.41191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096241.41195: variable 'omit' from source: magic vars 15500 1727096241.41533: variable 'ansible_distribution_major_version' from source: facts 15500 1727096241.41780: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096241.41784: variable 'profile_stat' from source: set_fact 15500 1727096241.41786: Evaluated conditional (profile_stat.stat.exists): False 15500 1727096241.41789: when evaluation is False, skipping this task 15500 1727096241.41791: _execute() done 15500 1727096241.41794: dumping result to json 15500 1727096241.41796: done dumping result, returning 15500 1727096241.41798: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 [0afff68d-5257-877d-2da0-000000000490] 15500 1727096241.41800: sending task result for task 0afff68d-5257-877d-2da0-000000000490 15500 1727096241.41872: done sending task result for task 0afff68d-5257-877d-2da0-000000000490 15500 1727096241.41875: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15500 1727096241.41936: no more pending results, returning what we have 15500 1727096241.41941: results queue empty 15500 1727096241.41942: checking for any_errors_fatal 15500 1727096241.41950: done checking for any_errors_fatal 15500 1727096241.41951: checking for max_fail_percentage 15500 1727096241.41952: done checking for max_fail_percentage 15500 1727096241.41953: checking to see if all hosts have failed and the running result is not ok 15500 1727096241.41954: done checking to see if all hosts have failed 15500 1727096241.41955: getting the remaining hosts for this loop 15500 1727096241.41957: done getting the remaining hosts for this loop 15500 1727096241.41961: getting the next task for host managed_node1 15500 1727096241.41972: done getting next task for host managed_node1 15500 1727096241.41975: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 15500 1727096241.41979: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096241.41985: getting variables 15500 1727096241.41991: in VariableManager get_vars() 15500 1727096241.42023: Calling all_inventory to load vars for managed_node1 15500 1727096241.42026: Calling groups_inventory to load vars for managed_node1 15500 1727096241.42030: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096241.42042: Calling all_plugins_play to load vars for managed_node1 15500 1727096241.42046: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096241.42050: Calling groups_plugins_play to load vars for managed_node1 15500 1727096241.43376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096241.44275: done with get_vars() 15500 1727096241.44295: done getting variables 15500 1727096241.44340: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15500 1727096241.44429: variable 'profile' from source: play vars 15500 1727096241.44432: variable 'interface' from source: set_fact 15500 1727096241.44477: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-LSR-TST-br31] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Monday 23 September 2024 08:57:21 -0400 (0:00:00.043) 0:00:41.488 ****** 15500 1727096241.44501: entering _queue_task() for managed_node1/set_fact 15500 1727096241.44769: worker is 1 (out of 1 available) 15500 1727096241.44784: exiting _queue_task() for managed_node1/set_fact 15500 1727096241.44795: done queuing things up, now waiting for results queue to drain 15500 1727096241.44797: waiting for pending results... 
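This skip, and the three that follow it, all hinge on profile_stat.stat.exists being False: an earlier stat of the profile's ifcfg file, registered as profile_stat, found nothing. That registration happens before the portion of the log shown here; the sketch below shows the usual shape of such a check, with the /etc/sysconfig/network-scripts path being an assumption rather than something visible in the output.

```yaml
# Assumed earlier step in get_profile_stat.yml that produces profile_stat; the path is a guess.
- name: Stat the ifcfg file for '{{ profile }}'
  ansible.builtin.stat:
    path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: profile_stat   # profile_stat.stat.exists gates the comment checks that follow
```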
15500 1727096241.45126: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 15500 1727096241.45131: in run() - task 0afff68d-5257-877d-2da0-000000000491 15500 1727096241.45160: variable 'ansible_search_path' from source: unknown 15500 1727096241.45165: variable 'ansible_search_path' from source: unknown 15500 1727096241.45340: calling self._execute() 15500 1727096241.45524: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096241.45561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096241.45580: variable 'omit' from source: magic vars 15500 1727096241.46166: variable 'ansible_distribution_major_version' from source: facts 15500 1727096241.46172: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096241.46282: variable 'profile_stat' from source: set_fact 15500 1727096241.46293: Evaluated conditional (profile_stat.stat.exists): False 15500 1727096241.46296: when evaluation is False, skipping this task 15500 1727096241.46300: _execute() done 15500 1727096241.46302: dumping result to json 15500 1727096241.46305: done dumping result, returning 15500 1727096241.46314: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 [0afff68d-5257-877d-2da0-000000000491] 15500 1727096241.46316: sending task result for task 0afff68d-5257-877d-2da0-000000000491 15500 1727096241.46562: done sending task result for task 0afff68d-5257-877d-2da0-000000000491 15500 1727096241.46565: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15500 1727096241.46632: no more pending results, returning what we have 15500 1727096241.46638: results queue empty 15500 1727096241.46639: checking for any_errors_fatal 15500 1727096241.46648: done checking for any_errors_fatal 15500 1727096241.46650: checking for max_fail_percentage 15500 1727096241.46652: done checking for max_fail_percentage 15500 1727096241.46653: checking to see if all hosts have failed and the running result is not ok 15500 1727096241.46654: done checking to see if all hosts have failed 15500 1727096241.46654: getting the remaining hosts for this loop 15500 1727096241.46656: done getting the remaining hosts for this loop 15500 1727096241.46664: getting the next task for host managed_node1 15500 1727096241.46675: done getting next task for host managed_node1 15500 1727096241.46677: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 15500 1727096241.46681: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096241.46688: getting variables 15500 1727096241.46690: in VariableManager get_vars() 15500 1727096241.46723: Calling all_inventory to load vars for managed_node1 15500 1727096241.46726: Calling groups_inventory to load vars for managed_node1 15500 1727096241.46730: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096241.46740: Calling all_plugins_play to load vars for managed_node1 15500 1727096241.46742: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096241.46744: Calling groups_plugins_play to load vars for managed_node1 15500 1727096241.48571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096241.49943: done with get_vars() 15500 1727096241.49968: done getting variables 15500 1727096241.50014: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15500 1727096241.50108: variable 'profile' from source: play vars 15500 1727096241.50111: variable 'interface' from source: set_fact 15500 1727096241.50168: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-LSR-TST-br31] *********************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Monday 23 September 2024 08:57:21 -0400 (0:00:00.056) 0:00:41.545 ****** 15500 1727096241.50193: entering _queue_task() for managed_node1/command 15500 1727096241.50526: worker is 1 (out of 1 available) 15500 1727096241.50541: exiting _queue_task() for managed_node1/command 15500 1727096241.50555: done queuing things up, now waiting for results queue to drain 15500 1727096241.50556: waiting for pending results... 
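The four comment checks (get and verify, for both the ansible_managed and the fingerprint comment) alternate between a command lookup and a set_fact verification, and each carries the same when: profile_stat.stat.exists guard, which is why all of them are skipped while the ifcfg file is absent. A representative sketch of one guarded lookup follows; the grep pattern and register name are placeholders, not quotes from the role.

```yaml
# Illustrative pattern only; the real tasks sit at get_profile_stat.yml:49-69 and may differ.
- name: Get the fingerprint comment in ifcfg-{{ profile }}
  ansible.builtin.command: grep -i fingerprint /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: fingerprint_comment   # placeholder register name
  when: profile_stat.stat.exists  # False here, so this task is skipped as well
```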
15500 1727096241.50825: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 15500 1727096241.50987: in run() - task 0afff68d-5257-877d-2da0-000000000492 15500 1727096241.51022: variable 'ansible_search_path' from source: unknown 15500 1727096241.51052: variable 'ansible_search_path' from source: unknown 15500 1727096241.51056: calling self._execute() 15500 1727096241.51126: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096241.51131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096241.51137: variable 'omit' from source: magic vars 15500 1727096241.51423: variable 'ansible_distribution_major_version' from source: facts 15500 1727096241.51453: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096241.51527: variable 'profile_stat' from source: set_fact 15500 1727096241.51539: Evaluated conditional (profile_stat.stat.exists): False 15500 1727096241.51543: when evaluation is False, skipping this task 15500 1727096241.51546: _execute() done 15500 1727096241.51552: dumping result to json 15500 1727096241.51555: done dumping result, returning 15500 1727096241.51605: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 [0afff68d-5257-877d-2da0-000000000492] 15500 1727096241.51608: sending task result for task 0afff68d-5257-877d-2da0-000000000492 15500 1727096241.51692: done sending task result for task 0afff68d-5257-877d-2da0-000000000492 15500 1727096241.51695: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15500 1727096241.51771: no more pending results, returning what we have 15500 1727096241.51776: results queue empty 15500 1727096241.51777: checking for any_errors_fatal 15500 1727096241.51789: done checking for any_errors_fatal 15500 1727096241.51789: checking for max_fail_percentage 15500 1727096241.51791: done checking for max_fail_percentage 15500 1727096241.51792: checking to see if all hosts have failed and the running result is not ok 15500 1727096241.51793: done checking to see if all hosts have failed 15500 1727096241.51793: getting the remaining hosts for this loop 15500 1727096241.51797: done getting the remaining hosts for this loop 15500 1727096241.51805: getting the next task for host managed_node1 15500 1727096241.51814: done getting next task for host managed_node1 15500 1727096241.51816: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 15500 1727096241.51821: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096241.51828: getting variables 15500 1727096241.51829: in VariableManager get_vars() 15500 1727096241.51855: Calling all_inventory to load vars for managed_node1 15500 1727096241.51858: Calling groups_inventory to load vars for managed_node1 15500 1727096241.51861: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096241.51874: Calling all_plugins_play to load vars for managed_node1 15500 1727096241.51876: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096241.51880: Calling groups_plugins_play to load vars for managed_node1 15500 1727096241.52989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096241.54572: done with get_vars() 15500 1727096241.54613: done getting variables 15500 1727096241.54701: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15500 1727096241.54847: variable 'profile' from source: play vars 15500 1727096241.54850: variable 'interface' from source: set_fact 15500 1727096241.54937: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-LSR-TST-br31] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Monday 23 September 2024 08:57:21 -0400 (0:00:00.047) 0:00:41.592 ****** 15500 1727096241.54984: entering _queue_task() for managed_node1/set_fact 15500 1727096241.55393: worker is 1 (out of 1 available) 15500 1727096241.55410: exiting _queue_task() for managed_node1/set_fact 15500 1727096241.55425: done queuing things up, now waiting for results queue to drain 15500 1727096241.55426: waiting for pending results... 
15500 1727096241.55826: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 15500 1727096241.55946: in run() - task 0afff68d-5257-877d-2da0-000000000493 15500 1727096241.56000: variable 'ansible_search_path' from source: unknown 15500 1727096241.56005: variable 'ansible_search_path' from source: unknown 15500 1727096241.56050: calling self._execute() 15500 1727096241.56136: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096241.56151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096241.56175: variable 'omit' from source: magic vars 15500 1727096241.56483: variable 'ansible_distribution_major_version' from source: facts 15500 1727096241.56492: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096241.56595: variable 'profile_stat' from source: set_fact 15500 1727096241.56599: Evaluated conditional (profile_stat.stat.exists): False 15500 1727096241.56601: when evaluation is False, skipping this task 15500 1727096241.56604: _execute() done 15500 1727096241.56606: dumping result to json 15500 1727096241.56609: done dumping result, returning 15500 1727096241.56616: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 [0afff68d-5257-877d-2da0-000000000493] 15500 1727096241.56621: sending task result for task 0afff68d-5257-877d-2da0-000000000493 15500 1727096241.56720: done sending task result for task 0afff68d-5257-877d-2da0-000000000493 15500 1727096241.56723: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15500 1727096241.56771: no more pending results, returning what we have 15500 1727096241.56774: results queue empty 15500 1727096241.56775: checking for any_errors_fatal 15500 1727096241.56790: done checking for any_errors_fatal 15500 1727096241.56791: checking for max_fail_percentage 15500 1727096241.56793: done checking for max_fail_percentage 15500 1727096241.56794: checking to see if all hosts have failed and the running result is not ok 15500 1727096241.56794: done checking to see if all hosts have failed 15500 1727096241.56795: getting the remaining hosts for this loop 15500 1727096241.56797: done getting the remaining hosts for this loop 15500 1727096241.56800: getting the next task for host managed_node1 15500 1727096241.56809: done getting next task for host managed_node1 15500 1727096241.56811: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 15500 1727096241.56814: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096241.56818: getting variables 15500 1727096241.56820: in VariableManager get_vars() 15500 1727096241.56850: Calling all_inventory to load vars for managed_node1 15500 1727096241.56852: Calling groups_inventory to load vars for managed_node1 15500 1727096241.56856: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096241.56870: Calling all_plugins_play to load vars for managed_node1 15500 1727096241.56872: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096241.56875: Calling groups_plugins_play to load vars for managed_node1 15500 1727096241.58338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096241.59532: done with get_vars() 15500 1727096241.59568: done getting variables 15500 1727096241.59645: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15500 1727096241.59780: variable 'profile' from source: play vars 15500 1727096241.59784: variable 'interface' from source: set_fact 15500 1727096241.59830: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'LSR-TST-br31'] ********************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Monday 23 September 2024 08:57:21 -0400 (0:00:00.048) 0:00:41.641 ****** 15500 1727096241.59853: entering _queue_task() for managed_node1/assert 15500 1727096241.60147: worker is 1 (out of 1 available) 15500 1727096241.60167: exiting _queue_task() for managed_node1/assert 15500 1727096241.60185: done queuing things up, now waiting for results queue to drain 15500 1727096241.60186: waiting for pending results... 
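The profile-absence check queued above (assert_profile_absent.yml:5) reduces to a single assert on the flag derived from the nmcli lookup; its conditional, not lsr_net_profile_exists, is evaluated just below and passes. A minimal sketch consistent with that conditional; the failure message is illustrative and not taken from the role.

```yaml
# Sketch of the assertion; the condition comes from the log, the message is assumed.
- name: Assert that the profile is absent - '{{ profile }}'
  ansible.builtin.assert:
    that:
      - not lsr_net_profile_exists
    fail_msg: "Profile {{ profile }} is unexpectedly present"
```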
15500 1727096241.60525: running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'LSR-TST-br31' 15500 1727096241.60662: in run() - task 0afff68d-5257-877d-2da0-000000000480 15500 1727096241.60666: variable 'ansible_search_path' from source: unknown 15500 1727096241.60672: variable 'ansible_search_path' from source: unknown 15500 1727096241.60718: calling self._execute() 15500 1727096241.60797: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096241.60827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096241.60833: variable 'omit' from source: magic vars 15500 1727096241.61275: variable 'ansible_distribution_major_version' from source: facts 15500 1727096241.61279: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096241.61281: variable 'omit' from source: magic vars 15500 1727096241.61319: variable 'omit' from source: magic vars 15500 1727096241.61431: variable 'profile' from source: play vars 15500 1727096241.61435: variable 'interface' from source: set_fact 15500 1727096241.61489: variable 'interface' from source: set_fact 15500 1727096241.61505: variable 'omit' from source: magic vars 15500 1727096241.61552: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096241.61651: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096241.61654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096241.61750: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096241.61753: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096241.61756: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096241.61759: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096241.61761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096241.61820: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096241.61824: Set connection var ansible_pipelining to False 15500 1727096241.61829: Set connection var ansible_timeout to 10 15500 1727096241.61832: Set connection var ansible_shell_type to sh 15500 1727096241.61841: Set connection var ansible_shell_executable to /bin/sh 15500 1727096241.61844: Set connection var ansible_connection to ssh 15500 1727096241.61860: variable 'ansible_shell_executable' from source: unknown 15500 1727096241.61865: variable 'ansible_connection' from source: unknown 15500 1727096241.61870: variable 'ansible_module_compression' from source: unknown 15500 1727096241.61872: variable 'ansible_shell_type' from source: unknown 15500 1727096241.61875: variable 'ansible_shell_executable' from source: unknown 15500 1727096241.61877: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096241.61881: variable 'ansible_pipelining' from source: unknown 15500 1727096241.61884: variable 'ansible_timeout' from source: unknown 15500 1727096241.61889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096241.62003: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096241.62060: variable 'omit' from source: magic vars 15500 1727096241.62064: starting attempt loop 15500 1727096241.62067: running the handler 15500 1727096241.62183: variable 'lsr_net_profile_exists' from source: set_fact 15500 1727096241.62186: Evaluated conditional (not lsr_net_profile_exists): True 15500 1727096241.62188: handler run complete 15500 1727096241.62190: attempt loop complete, returning result 15500 1727096241.62195: _execute() done 15500 1727096241.62197: dumping result to json 15500 1727096241.62200: done dumping result, returning 15500 1727096241.62202: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'LSR-TST-br31' [0afff68d-5257-877d-2da0-000000000480] 15500 1727096241.62222: sending task result for task 0afff68d-5257-877d-2da0-000000000480 15500 1727096241.62319: done sending task result for task 0afff68d-5257-877d-2da0-000000000480 15500 1727096241.62325: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15500 1727096241.62425: no more pending results, returning what we have 15500 1727096241.62428: results queue empty 15500 1727096241.62429: checking for any_errors_fatal 15500 1727096241.62443: done checking for any_errors_fatal 15500 1727096241.62444: checking for max_fail_percentage 15500 1727096241.62446: done checking for max_fail_percentage 15500 1727096241.62447: checking to see if all hosts have failed and the running result is not ok 15500 1727096241.62448: done checking to see if all hosts have failed 15500 1727096241.62448: getting the remaining hosts for this loop 15500 1727096241.62450: done getting the remaining hosts for this loop 15500 1727096241.62454: getting the next task for host managed_node1 15500 1727096241.62465: done getting next task for host managed_node1 15500 1727096241.62468: ^ task is: TASK: meta (flush_handlers) 15500 1727096241.62471: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096241.62508: getting variables 15500 1727096241.62511: in VariableManager get_vars() 15500 1727096241.62576: Calling all_inventory to load vars for managed_node1 15500 1727096241.62579: Calling groups_inventory to load vars for managed_node1 15500 1727096241.62582: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096241.62597: Calling all_plugins_play to load vars for managed_node1 15500 1727096241.62599: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096241.62602: Calling groups_plugins_play to load vars for managed_node1 15500 1727096241.64044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096241.65492: done with get_vars() 15500 1727096241.65517: done getting variables 15500 1727096241.65577: in VariableManager get_vars() 15500 1727096241.65587: Calling all_inventory to load vars for managed_node1 15500 1727096241.65590: Calling groups_inventory to load vars for managed_node1 15500 1727096241.65592: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096241.65597: Calling all_plugins_play to load vars for managed_node1 15500 1727096241.65599: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096241.65600: Calling groups_plugins_play to load vars for managed_node1 15500 1727096241.66801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096241.67948: done with get_vars() 15500 1727096241.67983: done queuing things up, now waiting for results queue to drain 15500 1727096241.67989: results queue empty 15500 1727096241.67990: checking for any_errors_fatal 15500 1727096241.67993: done checking for any_errors_fatal 15500 1727096241.67993: checking for max_fail_percentage 15500 1727096241.67995: done checking for max_fail_percentage 15500 1727096241.67995: checking to see if all hosts have failed and the running result is not ok 15500 1727096241.68002: done checking to see if all hosts have failed 15500 1727096241.68003: getting the remaining hosts for this loop 15500 1727096241.68004: done getting the remaining hosts for this loop 15500 1727096241.68006: getting the next task for host managed_node1 15500 1727096241.68010: done getting next task for host managed_node1 15500 1727096241.68011: ^ task is: TASK: meta (flush_handlers) 15500 1727096241.68012: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096241.68014: getting variables 15500 1727096241.68014: in VariableManager get_vars() 15500 1727096241.68023: Calling all_inventory to load vars for managed_node1 15500 1727096241.68025: Calling groups_inventory to load vars for managed_node1 15500 1727096241.68026: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096241.68030: Calling all_plugins_play to load vars for managed_node1 15500 1727096241.68032: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096241.68033: Calling groups_plugins_play to load vars for managed_node1 15500 1727096241.68920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096241.70151: done with get_vars() 15500 1727096241.70180: done getting variables 15500 1727096241.70242: in VariableManager get_vars() 15500 1727096241.70255: Calling all_inventory to load vars for managed_node1 15500 1727096241.70258: Calling groups_inventory to load vars for managed_node1 15500 1727096241.70261: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096241.70271: Calling all_plugins_play to load vars for managed_node1 15500 1727096241.70275: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096241.70279: Calling groups_plugins_play to load vars for managed_node1 15500 1727096241.71297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096241.72746: done with get_vars() 15500 1727096241.72788: done queuing things up, now waiting for results queue to drain 15500 1727096241.72790: results queue empty 15500 1727096241.72791: checking for any_errors_fatal 15500 1727096241.72796: done checking for any_errors_fatal 15500 1727096241.72797: checking for max_fail_percentage 15500 1727096241.72798: done checking for max_fail_percentage 15500 1727096241.72799: checking to see if all hosts have failed and the running result is not ok 15500 1727096241.72800: done checking to see if all hosts have failed 15500 1727096241.72800: getting the remaining hosts for this loop 15500 1727096241.72801: done getting the remaining hosts for this loop 15500 1727096241.72804: getting the next task for host managed_node1 15500 1727096241.72807: done getting next task for host managed_node1 15500 1727096241.72808: ^ task is: None 15500 1727096241.72810: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096241.72811: done queuing things up, now waiting for results queue to drain 15500 1727096241.72812: results queue empty 15500 1727096241.72812: checking for any_errors_fatal 15500 1727096241.72813: done checking for any_errors_fatal 15500 1727096241.72814: checking for max_fail_percentage 15500 1727096241.72814: done checking for max_fail_percentage 15500 1727096241.72815: checking to see if all hosts have failed and the running result is not ok 15500 1727096241.72816: done checking to see if all hosts have failed 15500 1727096241.72817: getting the next task for host managed_node1 15500 1727096241.72819: done getting next task for host managed_node1 15500 1727096241.72819: ^ task is: None 15500 1727096241.72820: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096241.72862: in VariableManager get_vars() 15500 1727096241.72888: done with get_vars() 15500 1727096241.72895: in VariableManager get_vars() 15500 1727096241.72909: done with get_vars() 15500 1727096241.72914: variable 'omit' from source: magic vars 15500 1727096241.73057: variable 'task' from source: play vars 15500 1727096241.73099: in VariableManager get_vars() 15500 1727096241.73110: done with get_vars() 15500 1727096241.73131: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_device_absent.yml] ************************* 15500 1727096241.73378: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15500 1727096241.73402: getting the remaining hosts for this loop 15500 1727096241.73403: done getting the remaining hosts for this loop 15500 1727096241.73406: getting the next task for host managed_node1 15500 1727096241.73408: done getting next task for host managed_node1 15500 1727096241.73411: ^ task is: TASK: Gathering Facts 15500 1727096241.73412: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096241.73414: getting variables 15500 1727096241.73415: in VariableManager get_vars() 15500 1727096241.73423: Calling all_inventory to load vars for managed_node1 15500 1727096241.73425: Calling groups_inventory to load vars for managed_node1 15500 1727096241.73427: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096241.73432: Calling all_plugins_play to load vars for managed_node1 15500 1727096241.73435: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096241.73438: Calling groups_plugins_play to load vars for managed_node1 15500 1727096241.75213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096241.76284: done with get_vars() 15500 1727096241.76303: done getting variables 15500 1727096241.76337: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Monday 23 September 2024 08:57:21 -0400 (0:00:00.165) 0:00:41.806 ****** 15500 1727096241.76361: entering _queue_task() for managed_node1/gather_facts 15500 1727096241.76623: worker is 1 (out of 1 available) 15500 1727096241.76636: exiting _queue_task() for managed_node1/gather_facts 15500 1727096241.76648: done queuing things up, now waiting for results queue to drain 15500 1727096241.76650: waiting for pending results... 
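Every remote operation in this run, including the fact gathering that starts below, reuses a persistent OpenSSH control socket; that is what the repeated auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' lines refer to. Each module execution then follows the same low-level sequence visible in the next lines: create a remote temp directory, sftp the AnsiballZ payload (here AnsiballZ_setup.py for fact gathering), chmod it, run it with the remote Python, and remove the temp directory afterwards. Multiplexing of this kind is driven by OpenSSH's ControlMaster/ControlPersist options, which Ansible's ssh connection plugin enables by default; a hedged example of setting them explicitly through a standard connection variable is shown below, with illustrative values that are not taken from this run's configuration.

```yaml
# Example host_vars entry (illustrative values only); the ssh connection plugin
# normally supplies equivalent ControlMaster/ControlPersist options on its own.
ansible_host: 10.31.11.125
ansible_ssh_common_args: >-
  -o ControlMaster=auto
  -o ControlPersist=60s
  -o ControlPath=~/.ansible/cp/%C
```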
15500 1727096241.76823: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15500 1727096241.76904: in run() - task 0afff68d-5257-877d-2da0-0000000004c5 15500 1727096241.76915: variable 'ansible_search_path' from source: unknown 15500 1727096241.76944: calling self._execute() 15500 1727096241.77076: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096241.77079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096241.77090: variable 'omit' from source: magic vars 15500 1727096241.77483: variable 'ansible_distribution_major_version' from source: facts 15500 1727096241.77500: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096241.77518: variable 'omit' from source: magic vars 15500 1727096241.77552: variable 'omit' from source: magic vars 15500 1727096241.77598: variable 'omit' from source: magic vars 15500 1727096241.77653: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096241.77731: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096241.77734: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096241.77744: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096241.77758: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096241.77796: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096241.77806: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096241.77814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096241.77949: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096241.77952: Set connection var ansible_pipelining to False 15500 1727096241.77954: Set connection var ansible_timeout to 10 15500 1727096241.77956: Set connection var ansible_shell_type to sh 15500 1727096241.77958: Set connection var ansible_shell_executable to /bin/sh 15500 1727096241.77960: Set connection var ansible_connection to ssh 15500 1727096241.77980: variable 'ansible_shell_executable' from source: unknown 15500 1727096241.77988: variable 'ansible_connection' from source: unknown 15500 1727096241.77995: variable 'ansible_module_compression' from source: unknown 15500 1727096241.78058: variable 'ansible_shell_type' from source: unknown 15500 1727096241.78061: variable 'ansible_shell_executable' from source: unknown 15500 1727096241.78063: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096241.78065: variable 'ansible_pipelining' from source: unknown 15500 1727096241.78069: variable 'ansible_timeout' from source: unknown 15500 1727096241.78071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096241.78217: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096241.78233: variable 'omit' from source: magic vars 15500 1727096241.78242: starting attempt loop 15500 1727096241.78247: running the 
handler 15500 1727096241.78265: variable 'ansible_facts' from source: unknown 15500 1727096241.78294: _low_level_execute_command(): starting 15500 1727096241.78304: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096241.79210: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096241.79227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096241.79275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096241.79327: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096241.79396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096241.81139: stdout chunk (state=3): >>>/root <<< 15500 1727096241.81275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096241.81279: stdout chunk (state=3): >>><<< 15500 1727096241.81282: stderr chunk (state=3): >>><<< 15500 1727096241.81308: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096241.81412: _low_level_execute_command(): starting 15500 1727096241.81416: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096241.8131454-17212-228432176126984 `" && echo ansible-tmp-1727096241.8131454-17212-228432176126984="` echo 
/root/.ansible/tmp/ansible-tmp-1727096241.8131454-17212-228432176126984 `" ) && sleep 0' 15500 1727096241.82045: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096241.82062: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096241.82079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096241.82096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096241.82108: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096241.82175: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096241.82221: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096241.82241: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096241.82546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096241.82765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096241.84790: stdout chunk (state=3): >>>ansible-tmp-1727096241.8131454-17212-228432176126984=/root/.ansible/tmp/ansible-tmp-1727096241.8131454-17212-228432176126984 <<< 15500 1727096241.84936: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096241.84950: stdout chunk (state=3): >>><<< 15500 1727096241.84979: stderr chunk (state=3): >>><<< 15500 1727096241.85004: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096241.8131454-17212-228432176126984=/root/.ansible/tmp/ansible-tmp-1727096241.8131454-17212-228432176126984 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 
1727096241.85043: variable 'ansible_module_compression' from source: unknown 15500 1727096241.85112: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15500 1727096241.85186: variable 'ansible_facts' from source: unknown 15500 1727096241.85400: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096241.8131454-17212-228432176126984/AnsiballZ_setup.py 15500 1727096241.85621: Sending initial data 15500 1727096241.85624: Sent initial data (154 bytes) 15500 1727096241.86266: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096241.86291: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096241.86377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096241.86411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096241.86426: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096241.86445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096241.86602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096241.88262: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096241.88322: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15500 1727096241.88437: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp36mdcpy7 /root/.ansible/tmp/ansible-tmp-1727096241.8131454-17212-228432176126984/AnsiballZ_setup.py <<< 15500 1727096241.88440: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096241.8131454-17212-228432176126984/AnsiballZ_setup.py" <<< 15500 1727096241.88757: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 15500 1727096241.88763: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp36mdcpy7" to remote "/root/.ansible/tmp/ansible-tmp-1727096241.8131454-17212-228432176126984/AnsiballZ_setup.py" <<< 15500 1727096241.88928: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096241.8131454-17212-228432176126984/AnsiballZ_setup.py" <<< 15500 1727096241.91463: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096241.91470: stdout chunk (state=3): >>><<< 15500 1727096241.91472: stderr chunk (state=3): >>><<< 15500 1727096241.91482: done transferring module to remote 15500 1727096241.91656: _low_level_execute_command(): starting 15500 1727096241.91663: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096241.8131454-17212-228432176126984/ /root/.ansible/tmp/ansible-tmp-1727096241.8131454-17212-228432176126984/AnsiballZ_setup.py && sleep 0' 15500 1727096241.92797: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096241.92888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096241.93016: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096241.93237: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096241.93471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096241.95593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096241.95597: stdout chunk (state=3): >>><<< 15500 1727096241.95599: stderr chunk (state=3): >>><<< 15500 1727096241.95601: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096241.95604: _low_level_execute_command(): starting 15500 1727096241.95607: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096241.8131454-17212-228432176126984/AnsiballZ_setup.py && sleep 0' 15500 1727096241.97005: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096241.97012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096241.97015: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 15500 1727096241.97018: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096241.97020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096241.97035: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096242.60606: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_loadavg": {"1m": 0.4296875, "5m": 0.32958984375, "15m": 0.158203125}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segment<<< 15500 1727096242.60615: stdout chunk (state=3): >>>ation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": 
"off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [<<< 15500 1727096242.60657: stdout chunk (state=3): >>>fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2964, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 567, "free": 2964}, "nocache": {"free": 3301, "used": 230}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": 
"512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 395, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795561472, "block_size": 4096, "block_total": 65519099, "block_available": 63914932, "block_used": 1604167, "inode_total": 131070960, "inode_available": 131029100, "inode_used": 41860, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "57", "second": "22", "epoch": "1727096242", "epoch_int": "1727096242", "date": "2024-09-23", "time": "08:57:22", "iso8601_micro": "2024-09-23T12:57:22.602663Z", "iso8601": "2024-09-23T12:57:22Z", "iso8601_basic": "20240923T085722602663", "iso8601_basic_short": "20240923T085722", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15500 1727096242.62598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 15500 1727096242.62637: stderr chunk (state=3): >>><<< 15500 1727096242.62641: stdout chunk (state=3): >>><<< 15500 1727096242.62680: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_loadavg": {"1m": 0.4296875, "5m": 0.32958984375, "15m": 0.158203125}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_interfaces": ["lo", 
"eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2964, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 567, "free": 2964}, "nocache": {"free": 3301, "used": 230}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": 
"4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 395, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795561472, "block_size": 4096, "block_total": 65519099, "block_available": 63914932, "block_used": 1604167, "inode_total": 131070960, "inode_available": 131029100, "inode_used": 41860, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "57", "second": "22", "epoch": "1727096242", "epoch_int": "1727096242", "date": "2024-09-23", "time": "08:57:22", "iso8601_micro": "2024-09-23T12:57:22.602663Z", "iso8601": "2024-09-23T12:57:22Z", "iso8601_basic": "20240923T085722602663", "iso8601_basic_short": "20240923T085722", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], 
"module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 15500 1727096242.63043: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096241.8131454-17212-228432176126984/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096242.63047: _low_level_execute_command(): starting 15500 1727096242.63052: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096241.8131454-17212-228432176126984/ > /dev/null 2>&1 && sleep 0' 15500 1727096242.63705: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096242.63714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096242.63716: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 15500 1727096242.63719: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096242.63721: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096242.63819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096242.63890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096242.65764: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096242.65799: stderr chunk (state=3): >>><<< 15500 1727096242.65802: stdout chunk (state=3): >>><<< 15500 1727096242.65821: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096242.65828: handler run complete 15500 1727096242.65921: variable 'ansible_facts' from source: unknown 15500 1727096242.65998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096242.66241: variable 'ansible_facts' from source: unknown 15500 1727096242.66300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096242.66380: attempt loop complete, returning result 15500 1727096242.66383: _execute() done 15500 1727096242.66385: dumping result to json 15500 1727096242.66404: done dumping result, returning 15500 1727096242.66411: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-877d-2da0-0000000004c5] 15500 1727096242.66415: sending task result for task 0afff68d-5257-877d-2da0-0000000004c5 15500 1727096242.66682: done sending task result for task 0afff68d-5257-877d-2da0-0000000004c5 15500 1727096242.66685: WORKER PROCESS EXITING ok: [managed_node1] 15500 1727096242.66910: no more pending results, returning what we have 15500 1727096242.66912: results queue empty 15500 1727096242.66912: checking for any_errors_fatal 15500 1727096242.66913: done checking for any_errors_fatal 15500 1727096242.66914: checking for max_fail_percentage 15500 1727096242.66915: done checking for max_fail_percentage 15500 1727096242.66916: checking to see if all hosts have failed and the running result is not ok 15500 1727096242.66916: done checking to see if all hosts have failed 15500 1727096242.66917: getting the remaining hosts for this loop 15500 1727096242.66918: done getting the remaining hosts for this loop 15500 1727096242.66920: 
getting the next task for host managed_node1 15500 1727096242.66924: done getting next task for host managed_node1 15500 1727096242.66925: ^ task is: TASK: meta (flush_handlers) 15500 1727096242.66927: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096242.66929: getting variables 15500 1727096242.66930: in VariableManager get_vars() 15500 1727096242.66946: Calling all_inventory to load vars for managed_node1 15500 1727096242.66948: Calling groups_inventory to load vars for managed_node1 15500 1727096242.66950: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096242.66958: Calling all_plugins_play to load vars for managed_node1 15500 1727096242.66960: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096242.66962: Calling groups_plugins_play to load vars for managed_node1 15500 1727096242.68153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096242.69954: done with get_vars() 15500 1727096242.69982: done getting variables 15500 1727096242.70069: in VariableManager get_vars() 15500 1727096242.70082: Calling all_inventory to load vars for managed_node1 15500 1727096242.70085: Calling groups_inventory to load vars for managed_node1 15500 1727096242.70089: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096242.70095: Calling all_plugins_play to load vars for managed_node1 15500 1727096242.70097: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096242.70100: Calling groups_plugins_play to load vars for managed_node1 15500 1727096242.71031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096242.72155: done with get_vars() 15500 1727096242.72182: done queuing things up, now waiting for results queue to drain 15500 1727096242.72183: results queue empty 15500 1727096242.72184: checking for any_errors_fatal 15500 1727096242.72186: done checking for any_errors_fatal 15500 1727096242.72187: checking for max_fail_percentage 15500 1727096242.72188: done checking for max_fail_percentage 15500 1727096242.72192: checking to see if all hosts have failed and the running result is not ok 15500 1727096242.72192: done checking to see if all hosts have failed 15500 1727096242.72193: getting the remaining hosts for this loop 15500 1727096242.72193: done getting the remaining hosts for this loop 15500 1727096242.72195: getting the next task for host managed_node1 15500 1727096242.72198: done getting next task for host managed_node1 15500 1727096242.72200: ^ task is: TASK: Include the task '{{ task }}' 15500 1727096242.72201: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096242.72202: getting variables 15500 1727096242.72203: in VariableManager get_vars() 15500 1727096242.72210: Calling all_inventory to load vars for managed_node1 15500 1727096242.72212: Calling groups_inventory to load vars for managed_node1 15500 1727096242.72213: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096242.72218: Calling all_plugins_play to load vars for managed_node1 15500 1727096242.72219: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096242.72221: Calling groups_plugins_play to load vars for managed_node1 15500 1727096242.72991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096242.74050: done with get_vars() 15500 1727096242.74071: done getting variables 15500 1727096242.74257: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_device_absent.yml'] *********************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Monday 23 September 2024 08:57:22 -0400 (0:00:00.979) 0:00:42.786 ****** 15500 1727096242.74292: entering _queue_task() for managed_node1/include_tasks 15500 1727096242.74606: worker is 1 (out of 1 available) 15500 1727096242.74617: exiting _queue_task() for managed_node1/include_tasks 15500 1727096242.74632: done queuing things up, now waiting for results queue to drain 15500 1727096242.74636: waiting for pending results... 15500 1727096242.74970: running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_device_absent.yml' 15500 1727096242.74976: in run() - task 0afff68d-5257-877d-2da0-000000000077 15500 1727096242.75029: variable 'ansible_search_path' from source: unknown 15500 1727096242.75034: calling self._execute() 15500 1727096242.75106: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096242.75114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096242.75129: variable 'omit' from source: magic vars 15500 1727096242.75582: variable 'ansible_distribution_major_version' from source: facts 15500 1727096242.75645: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096242.75648: variable 'task' from source: play vars 15500 1727096242.75676: variable 'task' from source: play vars 15500 1727096242.75690: _execute() done 15500 1727096242.75697: dumping result to json 15500 1727096242.75700: done dumping result, returning 15500 1727096242.75703: done running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_device_absent.yml' [0afff68d-5257-877d-2da0-000000000077] 15500 1727096242.75705: sending task result for task 0afff68d-5257-877d-2da0-000000000077 15500 1727096242.75797: done sending task result for task 0afff68d-5257-877d-2da0-000000000077 15500 1727096242.75802: WORKER PROCESS EXITING 15500 1727096242.75828: no more pending results, returning what we have 15500 1727096242.75834: in VariableManager get_vars() 15500 1727096242.75870: Calling all_inventory to load vars for managed_node1 15500 1727096242.75873: Calling groups_inventory to load vars for managed_node1 15500 1727096242.75876: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096242.75889: Calling all_plugins_play to load vars for managed_node1 15500 1727096242.75892: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096242.75895: Calling groups_plugins_play to load vars for 
managed_node1 15500 1727096242.77277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096242.78931: done with get_vars() 15500 1727096242.78974: variable 'ansible_search_path' from source: unknown 15500 1727096242.78997: we have included files to process 15500 1727096242.79086: generating all_blocks data 15500 1727096242.79089: done generating all_blocks data 15500 1727096242.79090: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15500 1727096242.79091: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15500 1727096242.79095: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15500 1727096242.79286: in VariableManager get_vars() 15500 1727096242.79298: done with get_vars() 15500 1727096242.79410: done processing included file 15500 1727096242.79411: iterating over new_blocks loaded from include file 15500 1727096242.79412: in VariableManager get_vars() 15500 1727096242.79420: done with get_vars() 15500 1727096242.79421: filtering new block on tags 15500 1727096242.79433: done filtering new block on tags 15500 1727096242.79437: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node1 15500 1727096242.79442: extending task lists for all hosts with included blocks 15500 1727096242.79462: done extending task lists 15500 1727096242.79463: done processing included files 15500 1727096242.79463: results queue empty 15500 1727096242.79464: checking for any_errors_fatal 15500 1727096242.79465: done checking for any_errors_fatal 15500 1727096242.79465: checking for max_fail_percentage 15500 1727096242.79466: done checking for max_fail_percentage 15500 1727096242.79466: checking to see if all hosts have failed and the running result is not ok 15500 1727096242.79468: done checking to see if all hosts have failed 15500 1727096242.79469: getting the remaining hosts for this loop 15500 1727096242.79470: done getting the remaining hosts for this loop 15500 1727096242.79472: getting the next task for host managed_node1 15500 1727096242.79475: done getting next task for host managed_node1 15500 1727096242.79476: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15500 1727096242.79478: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096242.79480: getting variables 15500 1727096242.79481: in VariableManager get_vars() 15500 1727096242.79487: Calling all_inventory to load vars for managed_node1 15500 1727096242.79488: Calling groups_inventory to load vars for managed_node1 15500 1727096242.79490: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096242.79493: Calling all_plugins_play to load vars for managed_node1 15500 1727096242.79495: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096242.79497: Calling groups_plugins_play to load vars for managed_node1 15500 1727096242.85253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096242.86808: done with get_vars() 15500 1727096242.86836: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Monday 23 September 2024 08:57:22 -0400 (0:00:00.126) 0:00:42.912 ****** 15500 1727096242.86909: entering _queue_task() for managed_node1/include_tasks 15500 1727096242.87341: worker is 1 (out of 1 available) 15500 1727096242.87361: exiting _queue_task() for managed_node1/include_tasks 15500 1727096242.87383: done queuing things up, now waiting for results queue to drain 15500 1727096242.87386: waiting for pending results... 15500 1727096242.87696: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 15500 1727096242.87920: in run() - task 0afff68d-5257-877d-2da0-0000000004d6 15500 1727096242.87946: variable 'ansible_search_path' from source: unknown 15500 1727096242.87961: variable 'ansible_search_path' from source: unknown 15500 1727096242.88026: calling self._execute() 15500 1727096242.88265: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096242.88270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096242.88274: variable 'omit' from source: magic vars 15500 1727096242.88749: variable 'ansible_distribution_major_version' from source: facts 15500 1727096242.88773: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096242.88786: _execute() done 15500 1727096242.88797: dumping result to json 15500 1727096242.88973: done dumping result, returning 15500 1727096242.88977: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-877d-2da0-0000000004d6] 15500 1727096242.88979: sending task result for task 0afff68d-5257-877d-2da0-0000000004d6 15500 1727096242.89067: done sending task result for task 0afff68d-5257-877d-2da0-0000000004d6 15500 1727096242.89072: WORKER PROCESS EXITING 15500 1727096242.89105: no more pending results, returning what we have 15500 1727096242.89112: in VariableManager get_vars() 15500 1727096242.89151: Calling all_inventory to load vars for managed_node1 15500 1727096242.89155: Calling groups_inventory to load vars for managed_node1 15500 1727096242.89162: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096242.89179: Calling all_plugins_play to load vars for managed_node1 15500 1727096242.89183: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096242.89186: Calling groups_plugins_play to load vars for managed_node1 15500 1727096242.91393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 15500 1727096242.93415: done with get_vars() 15500 1727096242.93443: variable 'ansible_search_path' from source: unknown 15500 1727096242.93445: variable 'ansible_search_path' from source: unknown 15500 1727096242.93455: variable 'task' from source: play vars 15500 1727096242.93568: variable 'task' from source: play vars 15500 1727096242.93603: we have included files to process 15500 1727096242.93604: generating all_blocks data 15500 1727096242.93606: done generating all_blocks data 15500 1727096242.93608: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15500 1727096242.93609: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15500 1727096242.93611: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15500 1727096242.93791: done processing included file 15500 1727096242.93793: iterating over new_blocks loaded from include file 15500 1727096242.93795: in VariableManager get_vars() 15500 1727096242.93808: done with get_vars() 15500 1727096242.93809: filtering new block on tags 15500 1727096242.93825: done filtering new block on tags 15500 1727096242.93827: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 15500 1727096242.93832: extending task lists for all hosts with included blocks 15500 1727096242.94000: done extending task lists 15500 1727096242.94002: done processing included files 15500 1727096242.94003: results queue empty 15500 1727096242.94003: checking for any_errors_fatal 15500 1727096242.94007: done checking for any_errors_fatal 15500 1727096242.94008: checking for max_fail_percentage 15500 1727096242.94009: done checking for max_fail_percentage 15500 1727096242.94010: checking to see if all hosts have failed and the running result is not ok 15500 1727096242.94011: done checking to see if all hosts have failed 15500 1727096242.94011: getting the remaining hosts for this loop 15500 1727096242.94013: done getting the remaining hosts for this loop 15500 1727096242.94015: getting the next task for host managed_node1 15500 1727096242.94019: done getting next task for host managed_node1 15500 1727096242.94022: ^ task is: TASK: Get stat for interface {{ interface }} 15500 1727096242.94024: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096242.94026: getting variables 15500 1727096242.94027: in VariableManager get_vars() 15500 1727096242.94036: Calling all_inventory to load vars for managed_node1 15500 1727096242.94038: Calling groups_inventory to load vars for managed_node1 15500 1727096242.94040: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096242.94045: Calling all_plugins_play to load vars for managed_node1 15500 1727096242.94047: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096242.94050: Calling groups_plugins_play to load vars for managed_node1 15500 1727096242.95275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096242.96714: done with get_vars() 15500 1727096242.96741: done getting variables 15500 1727096242.96873: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 08:57:22 -0400 (0:00:00.099) 0:00:43.012 ****** 15500 1727096242.96902: entering _queue_task() for managed_node1/stat 15500 1727096242.97254: worker is 1 (out of 1 available) 15500 1727096242.97270: exiting _queue_task() for managed_node1/stat 15500 1727096242.97282: done queuing things up, now waiting for results queue to drain 15500 1727096242.97283: waiting for pending results... 15500 1727096242.97688: running TaskExecutor() for managed_node1/TASK: Get stat for interface LSR-TST-br31 15500 1727096242.97700: in run() - task 0afff68d-5257-877d-2da0-0000000004e1 15500 1727096242.97720: variable 'ansible_search_path' from source: unknown 15500 1727096242.97728: variable 'ansible_search_path' from source: unknown 15500 1727096242.97773: calling self._execute() 15500 1727096242.97875: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096242.97892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096242.97913: variable 'omit' from source: magic vars 15500 1727096242.98292: variable 'ansible_distribution_major_version' from source: facts 15500 1727096242.98309: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096242.98318: variable 'omit' from source: magic vars 15500 1727096242.98374: variable 'omit' from source: magic vars 15500 1727096242.98481: variable 'interface' from source: set_fact 15500 1727096242.98504: variable 'omit' from source: magic vars 15500 1727096242.98553: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096242.98599: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096242.98623: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096242.98644: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096242.98668: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096242.98701: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096242.98709: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096242.98717: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 15500 1727096242.98831: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096242.98841: Set connection var ansible_pipelining to False 15500 1727096242.98852: Set connection var ansible_timeout to 10 15500 1727096242.98861: Set connection var ansible_shell_type to sh 15500 1727096242.98878: Set connection var ansible_shell_executable to /bin/sh 15500 1727096242.98888: Set connection var ansible_connection to ssh 15500 1727096242.98917: variable 'ansible_shell_executable' from source: unknown 15500 1727096242.98981: variable 'ansible_connection' from source: unknown 15500 1727096242.98984: variable 'ansible_module_compression' from source: unknown 15500 1727096242.98987: variable 'ansible_shell_type' from source: unknown 15500 1727096242.98989: variable 'ansible_shell_executable' from source: unknown 15500 1727096242.98991: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096242.98993: variable 'ansible_pipelining' from source: unknown 15500 1727096242.98995: variable 'ansible_timeout' from source: unknown 15500 1727096242.98997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096242.99172: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15500 1727096242.99191: variable 'omit' from source: magic vars 15500 1727096242.99207: starting attempt loop 15500 1727096242.99214: running the handler 15500 1727096242.99232: _low_level_execute_command(): starting 15500 1727096242.99243: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096243.00074: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096243.00102: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096243.00117: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096243.00131: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096243.00310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096243.02044: stdout chunk (state=3): >>>/root <<< 15500 1727096243.02183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096243.02488: stderr chunk (state=3): >>><<< 15500 1727096243.02492: stdout chunk (state=3): >>><<< 15500 1727096243.02495: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096243.02497: _low_level_execute_command(): starting 15500 1727096243.02501: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096243.0242136-17253-53616218717876 `" && echo ansible-tmp-1727096243.0242136-17253-53616218717876="` echo /root/.ansible/tmp/ansible-tmp-1727096243.0242136-17253-53616218717876 `" ) && sleep 0' 15500 1727096243.03966: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096243.03972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096243.03975: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address <<< 15500 1727096243.03986: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096243.03989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096243.04222: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096243.04489: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096243.04514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096243.04620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096243.06635: stdout chunk (state=3): >>>ansible-tmp-1727096243.0242136-17253-53616218717876=/root/.ansible/tmp/ansible-tmp-1727096243.0242136-17253-53616218717876 <<< 15500 1727096243.06798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096243.06802: stdout chunk 
(state=3): >>><<< 15500 1727096243.06804: stderr chunk (state=3): >>><<< 15500 1727096243.06824: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096243.0242136-17253-53616218717876=/root/.ansible/tmp/ansible-tmp-1727096243.0242136-17253-53616218717876 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096243.06884: variable 'ansible_module_compression' from source: unknown 15500 1727096243.06952: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15500 1727096243.07114: variable 'ansible_facts' from source: unknown 15500 1727096243.07205: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096243.0242136-17253-53616218717876/AnsiballZ_stat.py 15500 1727096243.07591: Sending initial data 15500 1727096243.07599: Sent initial data (152 bytes) 15500 1727096243.09111: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096243.09217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096243.10935: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096243.11104: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15500 1727096243.11182: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmph5uz2yro /root/.ansible/tmp/ansible-tmp-1727096243.0242136-17253-53616218717876/AnsiballZ_stat.py <<< 15500 1727096243.11185: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096243.0242136-17253-53616218717876/AnsiballZ_stat.py" <<< 15500 1727096243.11240: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmph5uz2yro" to remote "/root/.ansible/tmp/ansible-tmp-1727096243.0242136-17253-53616218717876/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096243.0242136-17253-53616218717876/AnsiballZ_stat.py" <<< 15500 1727096243.12716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096243.12719: stdout chunk (state=3): >>><<< 15500 1727096243.12721: stderr chunk (state=3): >>><<< 15500 1727096243.12723: done transferring module to remote 15500 1727096243.12725: _low_level_execute_command(): starting 15500 1727096243.12727: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096243.0242136-17253-53616218717876/ /root/.ansible/tmp/ansible-tmp-1727096243.0242136-17253-53616218717876/AnsiballZ_stat.py && sleep 0' 15500 1727096243.13685: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096243.13974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096243.13993: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096243.14195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096243.16172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096243.16185: stdout chunk (state=3): >>><<< 15500 1727096243.16198: stderr chunk (state=3): >>><<< 15500 
1727096243.16292: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096243.16302: _low_level_execute_command(): starting 15500 1727096243.16312: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096243.0242136-17253-53616218717876/AnsiballZ_stat.py && sleep 0' 15500 1727096243.17712: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096243.17810: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096243.17937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096243.17952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096243.18086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096243.33655: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15500 1727096243.35086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 15500 1727096243.35098: stdout chunk (state=3): >>><<< 15500 1727096243.35111: stderr chunk (state=3): >>><<< 15500 1727096243.35132: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 15500 1727096243.35171: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096243.0242136-17253-53616218717876/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096243.35188: _low_level_execute_command(): starting 15500 1727096243.35198: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096243.0242136-17253-53616218717876/ > /dev/null 2>&1 && sleep 0' 15500 1727096243.35984: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096243.36008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096243.36377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096243.36431: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096243.36607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096243.38549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096243.38563: stdout chunk (state=3): >>><<< 15500 1727096243.38579: stderr chunk (state=3): >>><<< 15500 1727096243.38608: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096243.38624: handler run complete 15500 1727096243.38651: attempt loop complete, returning result 15500 1727096243.38662: _execute() done 15500 1727096243.38671: dumping result to json 15500 1727096243.38682: done dumping result, returning 15500 1727096243.38696: done running TaskExecutor() for managed_node1/TASK: Get stat for interface LSR-TST-br31 [0afff68d-5257-877d-2da0-0000000004e1] 15500 1727096243.38704: sending task result for task 0afff68d-5257-877d-2da0-0000000004e1 ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 15500 1727096243.38970: no more pending results, returning what we have 15500 1727096243.38975: results queue empty 15500 1727096243.38976: checking for any_errors_fatal 15500 1727096243.38978: done checking for any_errors_fatal 15500 1727096243.38979: checking for max_fail_percentage 15500 1727096243.38981: done checking for max_fail_percentage 15500 1727096243.38981: checking to see if all hosts have failed and the running result is not ok 15500 1727096243.38982: done checking to see if all hosts have failed 15500 1727096243.38983: getting the remaining hosts for this loop 15500 1727096243.38985: done getting the remaining hosts for this loop 15500 1727096243.38988: getting the next task for host managed_node1 15500 1727096243.38997: done getting next task for host managed_node1 15500 1727096243.39000: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 15500 1727096243.39003: ^ state is: HOST STATE: block=2, task=2, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096243.39008: getting variables 15500 1727096243.39010: in VariableManager get_vars() 15500 1727096243.39042: Calling all_inventory to load vars for managed_node1 15500 1727096243.39045: Calling groups_inventory to load vars for managed_node1 15500 1727096243.39048: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096243.39063: Calling all_plugins_play to load vars for managed_node1 15500 1727096243.39066: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096243.39275: Calling groups_plugins_play to load vars for managed_node1 15500 1727096243.39981: done sending task result for task 0afff68d-5257-877d-2da0-0000000004e1 15500 1727096243.39984: WORKER PROCESS EXITING 15500 1727096243.41301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096243.43313: done with get_vars() 15500 1727096243.43335: done getting variables 15500 1727096243.43399: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15500 1727096243.43520: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'LSR-TST-br31'] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Monday 23 September 2024 08:57:23 -0400 (0:00:00.466) 0:00:43.478 ****** 15500 1727096243.43551: entering _queue_task() for managed_node1/assert 15500 1727096243.44160: worker is 1 (out of 1 available) 15500 1727096243.44179: exiting _queue_task() for managed_node1/assert 15500 1727096243.44196: done queuing things up, now waiting for results queue to drain 15500 1727096243.44198: waiting for pending results... 
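
The module_args echoed in the stat result above show exactly what the included get_interface_stat.yml task sends to the managed host: it stats /sys/class/net/LSR-TST-br31, a path that exists only while the kernel network device is present, with attribute, checksum and MIME collection disabled. A minimal sketch of an equivalent task, assuming the result is registered as interface_stat (the name the assert conditional below refers to); the actual file under tests/network/playbooks/tasks/get_interface_stat.yml may be worded differently:

    # get_interface_stat.yml (sketch based on the logged module_args)
    - name: Get stat for interface {{ interface }}
      stat:
        path: "/sys/class/net/{{ interface }}"   # present only while the kernel device exists
        get_attributes: false                     # matches the logged module_args
        get_checksum: false
        get_mime: false
      register: interface_stat                    # assumed name, taken from the assert conditional
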
15500 1727096243.44503: running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'LSR-TST-br31' 15500 1727096243.44695: in run() - task 0afff68d-5257-877d-2da0-0000000004d7 15500 1727096243.44718: variable 'ansible_search_path' from source: unknown 15500 1727096243.44732: variable 'ansible_search_path' from source: unknown 15500 1727096243.44785: calling self._execute() 15500 1727096243.44909: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096243.44920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096243.44934: variable 'omit' from source: magic vars 15500 1727096243.45525: variable 'ansible_distribution_major_version' from source: facts 15500 1727096243.45553: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096243.45573: variable 'omit' from source: magic vars 15500 1727096243.45625: variable 'omit' from source: magic vars 15500 1727096243.45822: variable 'interface' from source: set_fact 15500 1727096243.45825: variable 'omit' from source: magic vars 15500 1727096243.45899: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096243.46034: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096243.46038: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096243.46040: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096243.46043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096243.46077: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096243.46088: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096243.46106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096243.46292: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096243.46306: Set connection var ansible_pipelining to False 15500 1727096243.46320: Set connection var ansible_timeout to 10 15500 1727096243.46329: Set connection var ansible_shell_type to sh 15500 1727096243.46340: Set connection var ansible_shell_executable to /bin/sh 15500 1727096243.46355: Set connection var ansible_connection to ssh 15500 1727096243.46409: variable 'ansible_shell_executable' from source: unknown 15500 1727096243.46496: variable 'ansible_connection' from source: unknown 15500 1727096243.46500: variable 'ansible_module_compression' from source: unknown 15500 1727096243.46503: variable 'ansible_shell_type' from source: unknown 15500 1727096243.46505: variable 'ansible_shell_executable' from source: unknown 15500 1727096243.46507: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096243.46509: variable 'ansible_pipelining' from source: unknown 15500 1727096243.46512: variable 'ansible_timeout' from source: unknown 15500 1727096243.46514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096243.46703: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 15500 1727096243.46773: variable 'omit' from source: magic vars 15500 1727096243.46779: starting attempt loop 15500 1727096243.46782: running the handler 15500 1727096243.46991: variable 'interface_stat' from source: set_fact 15500 1727096243.47006: Evaluated conditional (not interface_stat.stat.exists): True 15500 1727096243.47021: handler run complete 15500 1727096243.47047: attempt loop complete, returning result 15500 1727096243.47054: _execute() done 15500 1727096243.47061: dumping result to json 15500 1727096243.47069: done dumping result, returning 15500 1727096243.47081: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'LSR-TST-br31' [0afff68d-5257-877d-2da0-0000000004d7] 15500 1727096243.47091: sending task result for task 0afff68d-5257-877d-2da0-0000000004d7 15500 1727096243.47299: done sending task result for task 0afff68d-5257-877d-2da0-0000000004d7 15500 1727096243.47302: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15500 1727096243.47356: no more pending results, returning what we have 15500 1727096243.47360: results queue empty 15500 1727096243.47361: checking for any_errors_fatal 15500 1727096243.47374: done checking for any_errors_fatal 15500 1727096243.47375: checking for max_fail_percentage 15500 1727096243.47377: done checking for max_fail_percentage 15500 1727096243.47380: checking to see if all hosts have failed and the running result is not ok 15500 1727096243.47381: done checking to see if all hosts have failed 15500 1727096243.47382: getting the remaining hosts for this loop 15500 1727096243.47384: done getting the remaining hosts for this loop 15500 1727096243.47388: getting the next task for host managed_node1 15500 1727096243.47397: done getting next task for host managed_node1 15500 1727096243.47399: ^ task is: TASK: meta (flush_handlers) 15500 1727096243.47401: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096243.47405: getting variables 15500 1727096243.47407: in VariableManager get_vars() 15500 1727096243.47438: Calling all_inventory to load vars for managed_node1 15500 1727096243.47441: Calling groups_inventory to load vars for managed_node1 15500 1727096243.47445: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096243.47456: Calling all_plugins_play to load vars for managed_node1 15500 1727096243.47460: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096243.47465: Calling groups_plugins_play to load vars for managed_node1 15500 1727096243.49460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096243.51115: done with get_vars() 15500 1727096243.51139: done getting variables 15500 1727096243.51218: in VariableManager get_vars() 15500 1727096243.51228: Calling all_inventory to load vars for managed_node1 15500 1727096243.51231: Calling groups_inventory to load vars for managed_node1 15500 1727096243.51233: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096243.51239: Calling all_plugins_play to load vars for managed_node1 15500 1727096243.51241: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096243.51244: Calling groups_plugins_play to load vars for managed_node1 15500 1727096243.52284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096243.53788: done with get_vars() 15500 1727096243.53821: done queuing things up, now waiting for results queue to drain 15500 1727096243.53823: results queue empty 15500 1727096243.53824: checking for any_errors_fatal 15500 1727096243.53827: done checking for any_errors_fatal 15500 1727096243.53829: checking for max_fail_percentage 15500 1727096243.53830: done checking for max_fail_percentage 15500 1727096243.53831: checking to see if all hosts have failed and the running result is not ok 15500 1727096243.53832: done checking to see if all hosts have failed 15500 1727096243.53841: getting the remaining hosts for this loop 15500 1727096243.53842: done getting the remaining hosts for this loop 15500 1727096243.53845: getting the next task for host managed_node1 15500 1727096243.53849: done getting next task for host managed_node1 15500 1727096243.53850: ^ task is: TASK: meta (flush_handlers) 15500 1727096243.53852: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096243.53854: getting variables 15500 1727096243.53855: in VariableManager get_vars() 15500 1727096243.53864: Calling all_inventory to load vars for managed_node1 15500 1727096243.53866: Calling groups_inventory to load vars for managed_node1 15500 1727096243.53870: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096243.53877: Calling all_plugins_play to load vars for managed_node1 15500 1727096243.53879: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096243.53883: Calling groups_plugins_play to load vars for managed_node1 15500 1727096243.55005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096243.56199: done with get_vars() 15500 1727096243.56214: done getting variables 15500 1727096243.56265: in VariableManager get_vars() 15500 1727096243.56274: Calling all_inventory to load vars for managed_node1 15500 1727096243.56275: Calling groups_inventory to load vars for managed_node1 15500 1727096243.56277: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096243.56280: Calling all_plugins_play to load vars for managed_node1 15500 1727096243.56282: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096243.56283: Calling groups_plugins_play to load vars for managed_node1 15500 1727096243.57018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096243.58381: done with get_vars() 15500 1727096243.58409: done queuing things up, now waiting for results queue to drain 15500 1727096243.58412: results queue empty 15500 1727096243.58412: checking for any_errors_fatal 15500 1727096243.58414: done checking for any_errors_fatal 15500 1727096243.58414: checking for max_fail_percentage 15500 1727096243.58415: done checking for max_fail_percentage 15500 1727096243.58416: checking to see if all hosts have failed and the running result is not ok 15500 1727096243.58417: done checking to see if all hosts have failed 15500 1727096243.58418: getting the remaining hosts for this loop 15500 1727096243.58418: done getting the remaining hosts for this loop 15500 1727096243.58421: getting the next task for host managed_node1 15500 1727096243.58424: done getting next task for host managed_node1 15500 1727096243.58424: ^ task is: None 15500 1727096243.58426: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096243.58427: done queuing things up, now waiting for results queue to drain 15500 1727096243.58428: results queue empty 15500 1727096243.58428: checking for any_errors_fatal 15500 1727096243.58429: done checking for any_errors_fatal 15500 1727096243.58430: checking for max_fail_percentage 15500 1727096243.58430: done checking for max_fail_percentage 15500 1727096243.58431: checking to see if all hosts have failed and the running result is not ok 15500 1727096243.58432: done checking to see if all hosts have failed 15500 1727096243.58433: getting the next task for host managed_node1 15500 1727096243.58435: done getting next task for host managed_node1 15500 1727096243.58436: ^ task is: None 15500 1727096243.58437: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096243.58481: in VariableManager get_vars() 15500 1727096243.58496: done with get_vars() 15500 1727096243.58502: in VariableManager get_vars() 15500 1727096243.58510: done with get_vars() 15500 1727096243.58514: variable 'omit' from source: magic vars 15500 1727096243.58545: in VariableManager get_vars() 15500 1727096243.58562: done with get_vars() 15500 1727096243.58586: variable 'omit' from source: magic vars PLAY [Verify that cleanup restored state to default] *************************** 15500 1727096243.58854: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15500 1727096243.58880: getting the remaining hosts for this loop 15500 1727096243.58882: done getting the remaining hosts for this loop 15500 1727096243.58884: getting the next task for host managed_node1 15500 1727096243.58887: done getting next task for host managed_node1 15500 1727096243.58889: ^ task is: TASK: Gathering Facts 15500 1727096243.58890: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096243.58892: getting variables 15500 1727096243.58893: in VariableManager get_vars() 15500 1727096243.58901: Calling all_inventory to load vars for managed_node1 15500 1727096243.58903: Calling groups_inventory to load vars for managed_node1 15500 1727096243.58905: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096243.58916: Calling all_plugins_play to load vars for managed_node1 15500 1727096243.58919: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096243.58922: Calling groups_plugins_play to load vars for managed_node1 15500 1727096243.59882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096243.60918: done with get_vars() 15500 1727096243.60946: done getting variables 15500 1727096243.61012: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:64 Monday 23 September 2024 08:57:23 -0400 (0:00:00.174) 0:00:43.653 ****** 15500 1727096243.61049: entering _queue_task() for managed_node1/gather_facts 15500 1727096243.61497: worker is 1 (out of 1 available) 15500 1727096243.61514: exiting _queue_task() for managed_node1/gather_facts 15500 1727096243.61532: done queuing things up, now waiting for results queue to drain 15500 1727096243.61533: waiting for pending results... 
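
Taken together, the task paths logged for this block (assert_device_absent.yml:3 for the include, assert_device_absent.yml:5 for the assert) and the conditional that was evaluated, (not interface_stat.stat.exists), imply a small two-step task file: include the stat lookup, then assert on its registered result. A plausible sketch, assuming the include path resolves relative to the playbooks directory (the real file in the collection may use a different relative path or pass extra vars):

    # assert_device_absent.yml (sketch; structure inferred from the logged task paths)
    - name: Include the task 'get_interface_stat.yml'
      include_tasks: tasks/get_interface_stat.yml

    - name: Assert that the interface is absent - '{{ interface }}'
      assert:
        that:
          - not interface_stat.stat.exists   # the exact conditional evaluated in the log

With that assertion passed, the run moves on to the "Verify that cleanup restored state to default" play, which begins by re-gathering facts; the AnsiballZ_setup.py transfer that follows is the setup module being shipped to the host for that fact gathering.
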
15500 1727096243.61879: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15500 1727096243.61966: in run() - task 0afff68d-5257-877d-2da0-0000000004fa 15500 1727096243.61977: variable 'ansible_search_path' from source: unknown 15500 1727096243.62006: calling self._execute() 15500 1727096243.62275: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096243.62281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096243.62285: variable 'omit' from source: magic vars 15500 1727096243.62665: variable 'ansible_distribution_major_version' from source: facts 15500 1727096243.62683: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096243.62696: variable 'omit' from source: magic vars 15500 1727096243.62735: variable 'omit' from source: magic vars 15500 1727096243.62789: variable 'omit' from source: magic vars 15500 1727096243.62850: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096243.62909: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096243.62946: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096243.62984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096243.63005: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096243.63072: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096243.63075: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096243.63150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096243.63230: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096243.63248: Set connection var ansible_pipelining to False 15500 1727096243.63274: Set connection var ansible_timeout to 10 15500 1727096243.63282: Set connection var ansible_shell_type to sh 15500 1727096243.63296: Set connection var ansible_shell_executable to /bin/sh 15500 1727096243.63311: Set connection var ansible_connection to ssh 15500 1727096243.63678: variable 'ansible_shell_executable' from source: unknown 15500 1727096243.63682: variable 'ansible_connection' from source: unknown 15500 1727096243.63684: variable 'ansible_module_compression' from source: unknown 15500 1727096243.63690: variable 'ansible_shell_type' from source: unknown 15500 1727096243.63693: variable 'ansible_shell_executable' from source: unknown 15500 1727096243.63695: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096243.63700: variable 'ansible_pipelining' from source: unknown 15500 1727096243.63702: variable 'ansible_timeout' from source: unknown 15500 1727096243.63705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096243.63998: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096243.64001: variable 'omit' from source: magic vars 15500 1727096243.64004: starting attempt loop 15500 1727096243.64007: running the 
handler 15500 1727096243.64020: variable 'ansible_facts' from source: unknown 15500 1727096243.64074: _low_level_execute_command(): starting 15500 1727096243.64095: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096243.65104: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096243.65161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096243.65191: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096243.65197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096243.65292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096243.67121: stdout chunk (state=3): >>>/root <<< 15500 1727096243.67127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096243.67175: stderr chunk (state=3): >>><<< 15500 1727096243.67312: stdout chunk (state=3): >>><<< 15500 1727096243.67317: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096243.67320: _low_level_execute_command(): starting 15500 1727096243.67322: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096243.6721668-17291-30139670027706 `" && echo ansible-tmp-1727096243.6721668-17291-30139670027706="` echo 
/root/.ansible/tmp/ansible-tmp-1727096243.6721668-17291-30139670027706 `" ) && sleep 0' 15500 1727096243.67901: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096243.67956: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096243.67975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096243.68045: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096243.68079: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096243.68189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096243.70209: stdout chunk (state=3): >>>ansible-tmp-1727096243.6721668-17291-30139670027706=/root/.ansible/tmp/ansible-tmp-1727096243.6721668-17291-30139670027706 <<< 15500 1727096243.70354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096243.70385: stdout chunk (state=3): >>><<< 15500 1727096243.70389: stderr chunk (state=3): >>><<< 15500 1727096243.70576: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096243.6721668-17291-30139670027706=/root/.ansible/tmp/ansible-tmp-1727096243.6721668-17291-30139670027706 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096243.70580: variable 'ansible_module_compression' from source: unknown 15500 1727096243.70583: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15500 
1727096243.70585: variable 'ansible_facts' from source: unknown 15500 1727096243.70816: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096243.6721668-17291-30139670027706/AnsiballZ_setup.py 15500 1727096243.71096: Sending initial data 15500 1727096243.71113: Sent initial data (153 bytes) 15500 1727096243.71679: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096243.71694: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096243.71710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096243.71782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096243.71824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096243.71841: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096243.71864: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096243.71966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096243.73588: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15500 1727096243.73608: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096243.73700: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15500 1727096243.73763: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpe2rprhpp /root/.ansible/tmp/ansible-tmp-1727096243.6721668-17291-30139670027706/AnsiballZ_setup.py <<< 15500 1727096243.73786: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096243.6721668-17291-30139670027706/AnsiballZ_setup.py" <<< 15500 1727096243.73847: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpe2rprhpp" to remote "/root/.ansible/tmp/ansible-tmp-1727096243.6721668-17291-30139670027706/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096243.6721668-17291-30139670027706/AnsiballZ_setup.py" <<< 15500 1727096243.75540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096243.75680: stderr chunk (state=3): >>><<< 15500 1727096243.75692: stdout chunk (state=3): >>><<< 15500 1727096243.75784: done transferring module to remote 15500 1727096243.75791: _low_level_execute_command(): starting 15500 1727096243.75794: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096243.6721668-17291-30139670027706/ /root/.ansible/tmp/ansible-tmp-1727096243.6721668-17291-30139670027706/AnsiballZ_setup.py && sleep 0' 15500 1727096243.77104: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096243.77130: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096243.77236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096243.77277: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096243.77318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096243.77385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096243.79356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096243.79360: stdout chunk (state=3): >>><<< 15500 1727096243.79362: stderr chunk (state=3): >>><<< 15500 1727096243.79452: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096243.79584: _low_level_execute_command(): starting 15500 1727096243.79588: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096243.6721668-17291-30139670027706/AnsiballZ_setup.py && sleep 0' 15500 1727096243.80882: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096243.81085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096243.81145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096244.44783: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, 
"ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 
v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2962, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 569, "free": 2962}, "nocache": {"free": 3299, "used": 232}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 397, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795524608, "block_size": 4096, "block_total": 65519099, "block_available": 63914923, "block_used": 1604176, "inode_total": 131070960, "inode_available": 131029100, "inode_used": 41860, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.4296875, "5m": 0.32958984375, "15m": 0.158203125}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "57", "second": "24", "epoch": "1727096244", "epoch_int": "1727096244", "date": "2024-09-23", "time": "08:57:24", "iso8601_micro": "2024-09-23T12:57:24.407086Z", "iso8601": "2024-09-23T12:57:24Z", "iso8601_basic": "20240923T085724407086", "iso8601_basic_short": "20240923T085724", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address<<< 15500 1727096244.44837: stdout chunk 
(state=3): >>>": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off 
[fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15500 1727096244.46781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 15500 1727096244.46836: stderr chunk (state=3): >>><<< 15500 1727096244.46840: stdout chunk (state=3): >>><<< 15500 1727096244.46881: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCSodd+GDkKMH+cjKateYxMnFqGIiE1cnsha6Cte8GTIWdeerR6yaWx/72lnuD/E1u9q+QiEGNjwKJ+EVlEbMpe1E41fS12cQMWtucXlJvMJH1Ykit9m7FF02YjLW9EkA9SSgc18cxhBxGHI+sOyXoiqbbp/q8CeDDI6WWQ+a3ZlRONcuaxwv9JBCJY6ZIqEDUsG55fAkwTft0LrM9VfN5D7iCcrlRTAVcjACJKgdvzHt52sVh5wuhCgzisIGrxldgqYAo0PnKgLwHzMWTYpwBJQIwMf6hC7O+PTnljDRXhsY8vTDdVI9W4Nr7mLWJ7W5JZ0lrKajT+FZ1b8usigp6firlnTtqzVmKJ4HJTlBOQ00HmCvVRSFb8Y15UB8Gz/xjJSXiQklJoKt+/wRRazuZALEowaIIdTzQUGv5Yx/BYhcAv8KVvhqbfkvyuwAgI+jzck3nhlUJ3B1WVYs5RdDY1WWEbqEtKyERi7P+cyWg6OhEMmJIpLs1QBW+CU/89w+s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE9Vvl7mU+dsvtGHlOyMt0Nx2BOUkSxor6Ul/pQgHJkP0t5ie5s9TW9uniTzz5NfayEXaMlnOkK/UBl0o7msG9k=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOsOWQBYI+BXY81/x/jv3URNQxi5p8vLDeQfEZVgnjS4", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 40710 10.31.11.125 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 40710 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 
21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-125", "ansible_nodename": "ip-10-31-11-125.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2082d1236a0674562ec5e13633e7ec", "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2962, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 569, "free": 2962}, "nocache": {"free": 3299, "used": 232}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_uuid": "ec2082d1-236a-0674-562e-c5e13633e7ec", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 397, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", 
"options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795524608, "block_size": 4096, "block_total": 65519099, "block_available": 63914923, "block_used": 1604176, "inode_total": 131070960, "inode_available": 131029100, "inode_used": 41860, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.4296875, "5m": 0.32958984375, "15m": 0.158203125}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "57", "second": "24", "epoch": "1727096244", "epoch_int": "1727096244", "date": "2024-09-23", "time": "08:57:24", "iso8601_micro": "2024-09-23T12:57:24.407086Z", "iso8601": "2024-09-23T12:57:24Z", "iso8601_basic": "20240923T085724407086", "iso8601_basic_short": "20240923T085724", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", 
"hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::10ff:acff:fe3f:90f5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.125", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:ff:ac:3f:90:f5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.125"], "ansible_all_ipv6_addresses": ["fe80::10ff:acff:fe3f:90f5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.125", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::10ff:acff:fe3f:90f5"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": 
true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 15500 1727096244.47293: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096243.6721668-17291-30139670027706/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096244.47309: _low_level_execute_command(): starting 15500 1727096244.47312: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096243.6721668-17291-30139670027706/ > /dev/null 2>&1 && sleep 0' 15500 1727096244.47896: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096244.47900: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096244.47975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096244.47978: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096244.48038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096244.49921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096244.49962: stderr chunk (state=3): >>><<< 15500 1727096244.49965: stdout chunk (state=3): >>><<< 15500 1727096244.50029: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096244.50055: handler run complete 15500 1727096244.50171: variable 'ansible_facts' from source: unknown 15500 1727096244.50314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096244.50672: variable 'ansible_facts' from source: unknown 15500 1727096244.50781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096244.50951: attempt loop complete, returning result 15500 1727096244.50964: _execute() done 15500 1727096244.50974: dumping result to json 15500 1727096244.51010: done dumping result, returning 15500 1727096244.51024: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0afff68d-5257-877d-2da0-0000000004fa] 15500 1727096244.51043: sending task result for task 0afff68d-5257-877d-2da0-0000000004fa ok: [managed_node1] 15500 1727096244.51866: no more pending results, returning what we have 15500 1727096244.51871: results queue empty 15500 1727096244.51872: checking for any_errors_fatal 15500 1727096244.51873: done checking for any_errors_fatal 15500 1727096244.51876: checking for max_fail_percentage 15500 1727096244.51879: done checking for max_fail_percentage 15500 1727096244.51879: checking to see if all hosts have failed and the running result is not ok 15500 1727096244.51880: done checking to see if all hosts have failed 15500 1727096244.51881: getting the remaining hosts for this loop 15500 1727096244.51882: done getting the remaining hosts for this loop 15500 1727096244.51886: getting the next task for host managed_node1 15500 1727096244.51893: done getting next task for host managed_node1 15500 1727096244.51895: ^ task is: TASK: meta (flush_handlers) 15500 1727096244.51897: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096244.51903: getting variables 15500 1727096244.51904: in VariableManager get_vars() 15500 1727096244.51925: Calling all_inventory to load vars for managed_node1 15500 1727096244.51928: Calling groups_inventory to load vars for managed_node1 15500 1727096244.51932: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096244.51945: Calling all_plugins_play to load vars for managed_node1 15500 1727096244.51948: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096244.51954: Calling groups_plugins_play to load vars for managed_node1 15500 1727096244.52514: done sending task result for task 0afff68d-5257-877d-2da0-0000000004fa 15500 1727096244.52806: WORKER PROCESS EXITING 15500 1727096244.52827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096244.53735: done with get_vars() 15500 1727096244.53752: done getting variables 15500 1727096244.53810: in VariableManager get_vars() 15500 1727096244.53818: Calling all_inventory to load vars for managed_node1 15500 1727096244.53819: Calling groups_inventory to load vars for managed_node1 15500 1727096244.53821: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096244.53824: Calling all_plugins_play to load vars for managed_node1 15500 1727096244.53826: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096244.53827: Calling groups_plugins_play to load vars for managed_node1 15500 1727096244.54782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096244.56367: done with get_vars() 15500 1727096244.56395: done queuing things up, now waiting for results queue to drain 15500 1727096244.56397: results queue empty 15500 1727096244.56397: checking for any_errors_fatal 15500 1727096244.56400: done checking for any_errors_fatal 15500 1727096244.56401: checking for max_fail_percentage 15500 1727096244.56401: done checking for max_fail_percentage 15500 1727096244.56402: checking to see if all hosts have failed and the running result is not ok 15500 1727096244.56403: done checking to see if all hosts have failed 15500 1727096244.56407: getting the remaining hosts for this loop 15500 1727096244.56407: done getting the remaining hosts for this loop 15500 1727096244.56410: getting the next task for host managed_node1 15500 1727096244.56412: done getting next task for host managed_node1 15500 1727096244.56414: ^ task is: TASK: Verify network state restored to default 15500 1727096244.56415: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096244.56417: getting variables 15500 1727096244.56418: in VariableManager get_vars() 15500 1727096244.56426: Calling all_inventory to load vars for managed_node1 15500 1727096244.56427: Calling groups_inventory to load vars for managed_node1 15500 1727096244.56429: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096244.56434: Calling all_plugins_play to load vars for managed_node1 15500 1727096244.56435: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096244.56437: Calling groups_plugins_play to load vars for managed_node1 15500 1727096244.57774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096244.58979: done with get_vars() 15500 1727096244.58996: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:67 Monday 23 September 2024 08:57:24 -0400 (0:00:00.980) 0:00:44.633 ****** 15500 1727096244.59052: entering _queue_task() for managed_node1/include_tasks 15500 1727096244.59316: worker is 1 (out of 1 available) 15500 1727096244.59329: exiting _queue_task() for managed_node1/include_tasks 15500 1727096244.59340: done queuing things up, now waiting for results queue to drain 15500 1727096244.59342: waiting for pending results... 15500 1727096244.59510: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 15500 1727096244.59578: in run() - task 0afff68d-5257-877d-2da0-00000000007a 15500 1727096244.59587: variable 'ansible_search_path' from source: unknown 15500 1727096244.59614: calling self._execute() 15500 1727096244.59687: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096244.59692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096244.59703: variable 'omit' from source: magic vars 15500 1727096244.59981: variable 'ansible_distribution_major_version' from source: facts 15500 1727096244.59990: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096244.59997: _execute() done 15500 1727096244.60001: dumping result to json 15500 1727096244.60005: done dumping result, returning 15500 1727096244.60007: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [0afff68d-5257-877d-2da0-00000000007a] 15500 1727096244.60018: sending task result for task 0afff68d-5257-877d-2da0-00000000007a 15500 1727096244.60098: done sending task result for task 0afff68d-5257-877d-2da0-00000000007a 15500 1727096244.60101: WORKER PROCESS EXITING 15500 1727096244.60140: no more pending results, returning what we have 15500 1727096244.60144: in VariableManager get_vars() 15500 1727096244.60186: Calling all_inventory to load vars for managed_node1 15500 1727096244.60189: Calling groups_inventory to load vars for managed_node1 15500 1727096244.60192: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096244.60204: Calling all_plugins_play to load vars for managed_node1 15500 1727096244.60207: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096244.60210: Calling groups_plugins_play to load vars for managed_node1 15500 1727096244.61722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096244.63287: done with get_vars() 15500 1727096244.63309: 
variable 'ansible_search_path' from source: unknown 15500 1727096244.63324: we have included files to process 15500 1727096244.63326: generating all_blocks data 15500 1727096244.63327: done generating all_blocks data 15500 1727096244.63328: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15500 1727096244.63329: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15500 1727096244.63331: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15500 1727096244.63721: done processing included file 15500 1727096244.63723: iterating over new_blocks loaded from include file 15500 1727096244.63724: in VariableManager get_vars() 15500 1727096244.63736: done with get_vars() 15500 1727096244.63737: filtering new block on tags 15500 1727096244.63753: done filtering new block on tags 15500 1727096244.63755: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node1 15500 1727096244.63759: extending task lists for all hosts with included blocks 15500 1727096244.63791: done extending task lists 15500 1727096244.63792: done processing included files 15500 1727096244.63793: results queue empty 15500 1727096244.63794: checking for any_errors_fatal 15500 1727096244.63795: done checking for any_errors_fatal 15500 1727096244.63796: checking for max_fail_percentage 15500 1727096244.63797: done checking for max_fail_percentage 15500 1727096244.63797: checking to see if all hosts have failed and the running result is not ok 15500 1727096244.63798: done checking to see if all hosts have failed 15500 1727096244.63799: getting the remaining hosts for this loop 15500 1727096244.63800: done getting the remaining hosts for this loop 15500 1727096244.63803: getting the next task for host managed_node1 15500 1727096244.63806: done getting next task for host managed_node1 15500 1727096244.63808: ^ task is: TASK: Check routes and DNS 15500 1727096244.63810: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15500 1727096244.63812: getting variables 15500 1727096244.63813: in VariableManager get_vars() 15500 1727096244.63821: Calling all_inventory to load vars for managed_node1 15500 1727096244.63823: Calling groups_inventory to load vars for managed_node1 15500 1727096244.63825: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096244.63830: Calling all_plugins_play to load vars for managed_node1 15500 1727096244.63833: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096244.63836: Calling groups_plugins_play to load vars for managed_node1 15500 1727096244.64930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096244.66525: done with get_vars() 15500 1727096244.66548: done getting variables 15500 1727096244.66596: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Monday 23 September 2024 08:57:24 -0400 (0:00:00.075) 0:00:44.709 ****** 15500 1727096244.66629: entering _queue_task() for managed_node1/shell 15500 1727096244.66987: worker is 1 (out of 1 available) 15500 1727096244.66999: exiting _queue_task() for managed_node1/shell 15500 1727096244.67012: done queuing things up, now waiting for results queue to drain 15500 1727096244.67013: waiting for pending results... 15500 1727096244.67285: running TaskExecutor() for managed_node1/TASK: Check routes and DNS 15500 1727096244.67389: in run() - task 0afff68d-5257-877d-2da0-00000000050b 15500 1727096244.67407: variable 'ansible_search_path' from source: unknown 15500 1727096244.67475: variable 'ansible_search_path' from source: unknown 15500 1727096244.67479: calling self._execute() 15500 1727096244.67549: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096244.67560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096244.67578: variable 'omit' from source: magic vars 15500 1727096244.67947: variable 'ansible_distribution_major_version' from source: facts 15500 1727096244.68028: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096244.68031: variable 'omit' from source: magic vars 15500 1727096244.68033: variable 'omit' from source: magic vars 15500 1727096244.68058: variable 'omit' from source: magic vars 15500 1727096244.68104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096244.68149: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096244.68178: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096244.68198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096244.68215: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096244.68253: variable 'inventory_hostname' from source: host vars for 'managed_node1' 
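The chunks above trace one complete pass of Ansible's copy-and-execute module workflow over the multiplexed SSH connection: create a per-task temporary directory on the managed node, push AnsiballZ_setup.py over SFTP, chmod it, run it with the remote interpreter, read the JSON facts from stdout, and finally remove the temporary directory. The same cycle is about to start again for the "Check routes and DNS" shell task. The sketch below re-creates that remote command sequence outside Ansible, purely as an illustration of what the trace shows; the host, user, local payload path, and the plain ssh/sftp subprocess calls are placeholders and assumptions, not Ansible's internal API (Ansible performs these steps through its connection and action plugins, e.g. the _low_level_execute_command() calls visible in the trace).

#!/usr/bin/env python3
# Illustrative re-creation of the remote command sequence in the trace above
# (temp dir -> sftp put -> chmod -> run module -> cleanup). Assumptions:
# key-based SSH access as root to the managed node, and a locally built
# module payload at LOCAL_MODULE; real Ansible drives this flow through its
# connection and action plugins rather than plain subprocess calls.
import shlex
import subprocess
import time

HOST = "10.31.11.125"                     # managed node address from the trace
USER = "root"                             # assumption: same remote user as in the trace
LOCAL_MODULE = "/tmp/AnsiballZ_setup.py"  # placeholder for a locally built module payload

def ssh(command: str) -> subprocess.CompletedProcess:
    # Run a '/bin/sh -c ...' command on the managed node, the way the
    # _low_level_execute_command() lines in the trace do.
    return subprocess.run(
        ["ssh", f"{USER}@{HOST}", "/bin/sh -c " + shlex.quote(command)],
        capture_output=True, text=True, check=True,
    )

# 1. Create a per-task temporary directory under a restrictive umask.
tmp = f"ansible-tmp-{time.time()}-example"
ssh(
    f'( umask 77 && mkdir -p "$HOME/.ansible/tmp" '
    f'&& mkdir "$HOME/.ansible/tmp/{tmp}" '
    f'&& echo {tmp}="$HOME/.ansible/tmp/{tmp}" ) && sleep 0'
)

# 2. Transfer the module payload over SFTP (the "put ..." / SSH2_FXP_OPEN chunks above).
subprocess.run(
    ["sftp", f"{USER}@{HOST}"],
    input=f"put {LOCAL_MODULE} .ansible/tmp/{tmp}/AnsiballZ_setup.py\n",
    text=True, check=True,
)

# 3. Make the temp directory and the module executable for the remote user.
ssh(f'chmod u+x "$HOME/.ansible/tmp/{tmp}/" "$HOME/.ansible/tmp/{tmp}/AnsiballZ_setup.py" && sleep 0')

# 4. Execute the module with the interpreter the trace shows for this node;
#    stdout carries the JSON result (the ansible_facts blob).
result = ssh(f'/usr/bin/python3.12 "$HOME/.ansible/tmp/{tmp}/AnsiballZ_setup.py" && sleep 0')
print(result.stdout)

# 5. Remove the temporary directory once the result has been collected.
ssh(f'rm -f -r "$HOME/.ansible/tmp/{tmp}/" > /dev/null 2>&1 && sleep 0')

The copy-and-execute path is taken because, as the "Set connection var" lines that follow show, ansible_pipelining is False for this connection; with pipelining enabled, Ansible would instead feed the module to the remote interpreter over stdin and skip the transfer, chmod, and cleanup commands.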
15500 1727096244.68261: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096244.68270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096244.68378: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096244.68389: Set connection var ansible_pipelining to False 15500 1727096244.68461: Set connection var ansible_timeout to 10 15500 1727096244.68464: Set connection var ansible_shell_type to sh 15500 1727096244.68466: Set connection var ansible_shell_executable to /bin/sh 15500 1727096244.68470: Set connection var ansible_connection to ssh 15500 1727096244.68472: variable 'ansible_shell_executable' from source: unknown 15500 1727096244.68474: variable 'ansible_connection' from source: unknown 15500 1727096244.68477: variable 'ansible_module_compression' from source: unknown 15500 1727096244.68479: variable 'ansible_shell_type' from source: unknown 15500 1727096244.68481: variable 'ansible_shell_executable' from source: unknown 15500 1727096244.68483: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096244.68484: variable 'ansible_pipelining' from source: unknown 15500 1727096244.68486: variable 'ansible_timeout' from source: unknown 15500 1727096244.68488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096244.68622: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096244.68639: variable 'omit' from source: magic vars 15500 1727096244.68648: starting attempt loop 15500 1727096244.68655: running the handler 15500 1727096244.68674: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096244.68837: _low_level_execute_command(): starting 15500 1727096244.68840: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096244.69432: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096244.69450: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096244.69465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096244.69490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096244.69511: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096244.69521: stderr chunk (state=3): >>>debug2: match not found <<< 15500 1727096244.69620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096244.69646: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096244.69746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096244.71463: stdout chunk (state=3): >>>/root <<< 15500 1727096244.71602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096244.71614: stdout chunk (state=3): >>><<< 15500 1727096244.71627: stderr chunk (state=3): >>><<< 15500 1727096244.71655: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096244.71679: _low_level_execute_command(): starting 15500 1727096244.71692: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096244.716623-17332-73031952314235 `" && echo ansible-tmp-1727096244.716623-17332-73031952314235="` echo /root/.ansible/tmp/ansible-tmp-1727096244.716623-17332-73031952314235 `" ) && sleep 0' 15500 1727096244.72264: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096244.72286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096244.72310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096244.72329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096244.72347: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096244.72359: stderr chunk (state=3): >>>debug2: match not found <<< 15500 1727096244.72378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096244.72473: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096244.72493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096244.72510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096244.72730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096244.74625: stdout chunk (state=3): >>>ansible-tmp-1727096244.716623-17332-73031952314235=/root/.ansible/tmp/ansible-tmp-1727096244.716623-17332-73031952314235 <<< 15500 1727096244.74873: stdout chunk (state=3): >>><<< 15500 1727096244.74877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096244.74879: stderr chunk (state=3): >>><<< 15500 1727096244.74882: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096244.716623-17332-73031952314235=/root/.ansible/tmp/ansible-tmp-1727096244.716623-17332-73031952314235 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096244.74884: variable 'ansible_module_compression' from source: unknown 15500 1727096244.74887: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15500 1727096244.74894: variable 'ansible_facts' from source: unknown 15500 1727096244.74978: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096244.716623-17332-73031952314235/AnsiballZ_command.py 15500 1727096244.75151: Sending initial data 15500 1727096244.75161: Sent initial data (154 bytes) 15500 1727096244.75761: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096244.75874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096244.75894: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096244.75912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096244.76006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096244.77573: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15500 1727096244.77607: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096244.77655: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15500 1727096244.77750: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpnqb56vhl /root/.ansible/tmp/ansible-tmp-1727096244.716623-17332-73031952314235/AnsiballZ_command.py <<< 15500 1727096244.77760: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096244.716623-17332-73031952314235/AnsiballZ_command.py" <<< 15500 1727096244.77804: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmpnqb56vhl" to remote "/root/.ansible/tmp/ansible-tmp-1727096244.716623-17332-73031952314235/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096244.716623-17332-73031952314235/AnsiballZ_command.py" <<< 15500 1727096244.78639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096244.78701: stderr chunk (state=3): >>><<< 15500 1727096244.78711: stdout chunk (state=3): >>><<< 15500 1727096244.78751: done transferring module to remote 15500 1727096244.78770: _low_level_execute_command(): starting 15500 1727096244.78841: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096244.716623-17332-73031952314235/ /root/.ansible/tmp/ansible-tmp-1727096244.716623-17332-73031952314235/AnsiballZ_command.py && sleep 0' 15500 1727096244.79412: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096244.79428: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096244.79446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096244.79480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096244.79502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15500 1727096244.79589: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096244.79606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096244.79625: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096244.79639: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096244.79741: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096244.81606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096244.81661: stderr chunk (state=3): >>><<< 15500 1727096244.81681: stdout chunk (state=3): >>><<< 15500 1727096244.81712: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096244.81720: _low_level_execute_command(): starting 15500 1727096244.81730: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096244.716623-17332-73031952314235/AnsiballZ_command.py && sleep 0' 15500 1727096244.82628: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096244.82632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found <<< 15500 1727096244.82635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 
15500 1727096244.82637: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096244.82639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096244.82717: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096244.82780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096244.99072: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:ff:ac:3f:90:f5 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.125/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3217sec preferred_lft 3217sec\n inet6 fe80::10ff:acff:fe3f:90f5/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.125 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.125 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-23 08:57:24.980194", "end": "2024-09-23 08:57:24.989037", "delta": "0:00:00.008843", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15500 1727096245.00776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 15500 1727096245.00781: stdout chunk (state=3): >>><<< 15500 1727096245.00783: stderr chunk (state=3): >>><<< 15500 1727096245.00874: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:ff:ac:3f:90:f5 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.125/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3217sec preferred_lft 3217sec\n inet6 fe80::10ff:acff:fe3f:90f5/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.125 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.125 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-23 08:57:24.980194", "end": "2024-09-23 08:57:24.989037", "delta": "0:00:00.008843", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
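Between the result above and the cleanup below, the _low_level_execute_command() calls trace the usual per-task remote lifecycle: create a private temp directory, stream the AnsiballZ wrapper over the multiplexed SSH connection with SFTP, make it executable, run it with the remote interpreter, and finally remove the directory. Condensed into one sketch, with the generated per-task ansible-tmp directory name replaced by a placeholder variable for readability:

    # Shape of the remote command sequence for one task, reconstructed from this trace.
    TMPDIR=/root/.ansible/tmp/ansible-tmp-EXAMPLE        # placeholder for the generated name
    ( umask 77 && mkdir -p /root/.ansible/tmp && mkdir "$TMPDIR" )   # private 0700 workdir
    # module transfer happens over SFTP on the already-open connection:
    #   sftp> put <local AnsiballZ tempfile> "$TMPDIR/AnsiballZ_command.py"
    chmod u+x "$TMPDIR" "$TMPDIR/AnsiballZ_command.py"   # make directory and wrapper executable
    /usr/bin/python3.12 "$TMPDIR/AnsiballZ_command.py"   # run the module; JSON result on stdout
    rm -f -r "$TMPDIR" > /dev/null 2>&1                  # per-task cleanup, as logged just below

Each of those commands is wrapped in /bin/sh -c '... && sleep 0' by the ssh connection plugin, which is why every step appears in the log as its own _low_level_execute_command() round trip.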
15500 1727096245.00883: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096244.716623-17332-73031952314235/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096245.00886: _low_level_execute_command(): starting 15500 1727096245.00888: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096244.716623-17332-73031952314235/ > /dev/null 2>&1 && sleep 0' 15500 1727096245.01510: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096245.01524: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096245.01558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096245.01652: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096245.01678: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096245.01694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096245.01795: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096245.03676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096245.03710: stderr chunk (state=3): >>><<< 15500 1727096245.03725: stdout chunk (state=3): >>><<< 15500 1727096245.03792: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096245.03796: handler run complete 15500 1727096245.03798: Evaluated conditional (False): False 15500 1727096245.03800: attempt loop complete, returning result 15500 1727096245.03802: _execute() done 15500 1727096245.03804: dumping result to json 15500 1727096245.03815: done dumping result, returning 15500 1727096245.03827: done running TaskExecutor() for managed_node1/TASK: Check routes and DNS [0afff68d-5257-877d-2da0-00000000050b] 15500 1727096245.03835: sending task result for task 0afff68d-5257-877d-2da0-00000000050b 15500 1727096245.04177: done sending task result for task 0afff68d-5257-877d-2da0-00000000050b 15500 1727096245.04181: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008843", "end": "2024-09-23 08:57:24.989037", "rc": 0, "start": "2024-09-23 08:57:24.980194" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:ff:ac:3f:90:f5 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.11.125/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 3217sec preferred_lft 3217sec inet6 fe80::10ff:acff:fe3f:90f5/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.125 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.125 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 15500 1727096245.04249: no more pending results, returning what we have 15500 1727096245.04253: results queue empty 15500 1727096245.04254: checking for any_errors_fatal 15500 1727096245.04255: done checking for any_errors_fatal 15500 1727096245.04256: checking for max_fail_percentage 15500 1727096245.04258: done checking for max_fail_percentage 15500 1727096245.04259: checking to see if all hosts have failed and the running result is not ok 15500 1727096245.04260: done checking to see if all hosts have failed 15500 1727096245.04260: getting the remaining hosts for this loop 15500 1727096245.04262: done getting the remaining hosts for this loop 15500 1727096245.04265: getting the next task for host managed_node1 15500 1727096245.04274: done getting next task for host 
managed_node1 15500 1727096245.04277: ^ task is: TASK: Verify DNS and network connectivity 15500 1727096245.04280: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15500 1727096245.04283: getting variables 15500 1727096245.04285: in VariableManager get_vars() 15500 1727096245.04315: Calling all_inventory to load vars for managed_node1 15500 1727096245.04318: Calling groups_inventory to load vars for managed_node1 15500 1727096245.04326: Calling all_plugins_inventory to load vars for managed_node1 15500 1727096245.04336: Calling all_plugins_play to load vars for managed_node1 15500 1727096245.04339: Calling groups_plugins_inventory to load vars for managed_node1 15500 1727096245.04342: Calling groups_plugins_play to load vars for managed_node1 15500 1727096245.05819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15500 1727096245.07431: done with get_vars() 15500 1727096245.07462: done getting variables 15500 1727096245.07536: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Monday 23 September 2024 08:57:25 -0400 (0:00:00.409) 0:00:45.118 ****** 15500 1727096245.07571: entering _queue_task() for managed_node1/shell 15500 1727096245.08169: worker is 1 (out of 1 available) 15500 1727096245.08179: exiting _queue_task() for managed_node1/shell 15500 1727096245.08190: done queuing things up, now waiting for results queue to drain 15500 1727096245.08192: waiting for pending results... 
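The payload for this second task, Verify DNS and network connectivity, is likewise visible in the module invocation recorded below. As a standalone sketch, again assuming a bash shebang (Ansible actually runs it through /bin/sh -c, where pipefail evidently works on this node) and curl being present on the target:

    #!/bin/bash
    # Reconstructed from the _raw_params in the result below; comments added.
    set -euo pipefail
    echo CHECK DNS AND CONNECTIVITY
    for host in mirrors.fedoraproject.org mirrors.centos.org; do
        # resolution check through the system resolver (getent uses NSS, so it
        # exercises /etc/resolv.conf plus any other configured sources)
        if ! getent hosts "$host"; then
            echo FAILED to lookup host "$host"
            exit 1
        fi
        # reachability check: the response body is thrown away, only curl's exit status matters
        if ! curl -o /dev/null https://"$host"; then
            echo FAILED to contact host "$host"
            exit 1
        fi
    done

Because curl is run without -s/--silent, its progress meter goes to stderr, which is exactly what fills the STDERR block of the task result further down.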
15500 1727096245.08258: running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity 15500 1727096245.08369: in run() - task 0afff68d-5257-877d-2da0-00000000050c 15500 1727096245.08390: variable 'ansible_search_path' from source: unknown 15500 1727096245.08398: variable 'ansible_search_path' from source: unknown 15500 1727096245.08447: calling self._execute() 15500 1727096245.08573: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096245.08577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096245.08580: variable 'omit' from source: magic vars 15500 1727096245.08975: variable 'ansible_distribution_major_version' from source: facts 15500 1727096245.08992: Evaluated conditional (ansible_distribution_major_version != '6'): True 15500 1727096245.09184: variable 'ansible_facts' from source: unknown 15500 1727096245.09903: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 15500 1727096245.09914: variable 'omit' from source: magic vars 15500 1727096245.09964: variable 'omit' from source: magic vars 15500 1727096245.10004: variable 'omit' from source: magic vars 15500 1727096245.10056: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15500 1727096245.10097: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15500 1727096245.10161: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15500 1727096245.10164: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096245.10166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15500 1727096245.10193: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15500 1727096245.10201: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096245.10208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096245.10320: Set connection var ansible_module_compression to ZIP_DEFLATED 15500 1727096245.10331: Set connection var ansible_pipelining to False 15500 1727096245.10340: Set connection var ansible_timeout to 10 15500 1727096245.10376: Set connection var ansible_shell_type to sh 15500 1727096245.10379: Set connection var ansible_shell_executable to /bin/sh 15500 1727096245.10381: Set connection var ansible_connection to ssh 15500 1727096245.10394: variable 'ansible_shell_executable' from source: unknown 15500 1727096245.10400: variable 'ansible_connection' from source: unknown 15500 1727096245.10406: variable 'ansible_module_compression' from source: unknown 15500 1727096245.10412: variable 'ansible_shell_type' from source: unknown 15500 1727096245.10418: variable 'ansible_shell_executable' from source: unknown 15500 1727096245.10473: variable 'ansible_host' from source: host vars for 'managed_node1' 15500 1727096245.10480: variable 'ansible_pipelining' from source: unknown 15500 1727096245.10486: variable 'ansible_timeout' from source: unknown 15500 1727096245.10488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15500 1727096245.10594: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096245.10610: variable 'omit' from source: magic vars 15500 1727096245.10619: starting attempt loop 15500 1727096245.10625: running the handler 15500 1727096245.10640: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15500 1727096245.10664: _low_level_execute_command(): starting 15500 1727096245.10681: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15500 1727096245.11395: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096245.11400: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration <<< 15500 1727096245.11403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096245.11405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096245.11408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096245.11482: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096245.11486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096245.11583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096245.13342: stdout chunk (state=3): >>>/root <<< 15500 1727096245.13508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096245.13511: stdout chunk (state=3): >>><<< 15500 1727096245.13514: stderr chunk (state=3): >>><<< 15500 1727096245.13535: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 
10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096245.13560: _low_level_execute_command(): starting 15500 1727096245.13648: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096245.1354551-17349-172848408720341 `" && echo ansible-tmp-1727096245.1354551-17349-172848408720341="` echo /root/.ansible/tmp/ansible-tmp-1727096245.1354551-17349-172848408720341 `" ) && sleep 0' 15500 1727096245.14285: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096245.14353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096245.14380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096245.14394: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096245.14500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096245.16463: stdout chunk (state=3): >>>ansible-tmp-1727096245.1354551-17349-172848408720341=/root/.ansible/tmp/ansible-tmp-1727096245.1354551-17349-172848408720341 <<< 15500 1727096245.16631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096245.16635: stdout chunk (state=3): >>><<< 15500 1727096245.16637: stderr chunk (state=3): >>><<< 15500 1727096245.16751: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096245.1354551-17349-172848408720341=/root/.ansible/tmp/ansible-tmp-1727096245.1354551-17349-172848408720341 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096245.16754: variable 'ansible_module_compression' from source: unknown 15500 1727096245.16757: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15500q8g1lk73/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15500 1727096245.16806: variable 'ansible_facts' from source: unknown 15500 1727096245.17086: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096245.1354551-17349-172848408720341/AnsiballZ_command.py 15500 1727096245.17606: Sending initial data 15500 1727096245.17611: Sent initial data (156 bytes) 15500 1727096245.18546: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096245.18775: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096245.18936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096245.19131: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096245.20808: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15500 1727096245.20894: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15500 1727096245.20939: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp6eoidk8r /root/.ansible/tmp/ansible-tmp-1727096245.1354551-17349-172848408720341/AnsiballZ_command.py <<< 15500 1727096245.20953: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096245.1354551-17349-172848408720341/AnsiballZ_command.py" <<< 15500 1727096245.21090: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15500q8g1lk73/tmp6eoidk8r" to remote "/root/.ansible/tmp/ansible-tmp-1727096245.1354551-17349-172848408720341/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096245.1354551-17349-172848408720341/AnsiballZ_command.py" <<< 15500 1727096245.22396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096245.22435: stderr chunk (state=3): >>><<< 15500 1727096245.22578: stdout chunk (state=3): >>><<< 15500 1727096245.22604: done transferring module to remote 15500 1727096245.22621: _low_level_execute_command(): starting 15500 1727096245.22630: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096245.1354551-17349-172848408720341/ /root/.ansible/tmp/ansible-tmp-1727096245.1354551-17349-172848408720341/AnsiballZ_command.py && sleep 0' 15500 1727096245.23244: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096245.23265: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096245.23290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096245.23312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096245.23388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096245.23437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096245.23471: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096245.23495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096245.23586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096245.25561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096245.25566: stdout chunk (state=3): >>><<< 15500 1727096245.25570: stderr chunk (state=3): >>><<< 15500 1727096245.25674: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15500 1727096245.25677: _low_level_execute_command(): starting 15500 1727096245.25680: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096245.1354551-17349-172848408720341/AnsiballZ_command.py && sleep 0' 15500 1727096245.26251: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096245.26272: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096245.26286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096245.26318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096245.26415: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096245.26442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096245.26554: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096245.86985: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 
wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1358 0 --:--:-- --:--:-- --:--:-- 1361\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1449 0 --:--:-- --:--:-- --:--:-- 1455", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-23 08:57:25.417100", "end": "2024-09-23 08:57:25.868255", "delta": "0:00:00.451155", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15500 1727096245.88762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 
<<< 15500 1727096245.88766: stdout chunk (state=3): >>><<< 15500 1727096245.88773: stderr chunk (state=3): >>><<< 15500 1727096245.88797: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1358 0 --:--:-- --:--:-- --:--:-- 1361\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1449 0 --:--:-- --:--:-- --:--:-- 1455", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-23 08:57:25.417100", "end": "2024-09-23 08:57:25.868255", "delta": "0:00:00.451155", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.125 closed. 15500 1727096245.88870: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096245.1354551-17349-172848408720341/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15500 1727096245.88953: _low_level_execute_command(): starting 15500 1727096245.88957: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096245.1354551-17349-172848408720341/ > /dev/null 2>&1 && sleep 0' 15500 1727096245.89497: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15500 1727096245.89518: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15500 1727096245.89534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15500 1727096245.89553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15500 1727096245.89572: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 <<< 15500 1727096245.89624: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15500 1727096245.89685: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' <<< 15500 1727096245.89702: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15500 1727096245.89732: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15500 1727096245.89839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15500 1727096245.91807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15500 1727096245.91823: stdout chunk (state=3): >>><<< 15500 1727096245.91838: stderr chunk (state=3): >>><<< 15500 1727096245.91860: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.125 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.125 originally 10.31.11.125 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/8e98a30b23' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15500 1727096245.91875: handler run complete
15500 1727096245.91904: Evaluated conditional (False): False
15500 1727096245.91931: attempt loop complete, returning result
15500 1727096245.91939: _execute() done
15500 1727096245.91945: dumping result to json
15500 1727096245.92031: done dumping result, returning
15500 1727096245.92035: done running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity [0afff68d-5257-877d-2da0-00000000050c]
15500 1727096245.92037: sending task result for task 0afff68d-5257-877d-2da0-00000000050c
15500 1727096245.92114: done sending task result for task 0afff68d-5257-877d-2da0-00000000050c
15500 1727096245.92118: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.451155",
    "end": "2024-09-23 08:57:25.868255",
    "rc": 0,
    "start": "2024-09-23 08:57:25.417100"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 305 100 305 0 0 1358 0 --:--:-- --:--:-- --:--:-- 1361
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 291 100 291 0 0 1449 0 --:--:-- --:--:-- --:--:-- 1455

15500 1727096245.92191: no more pending results, returning what we have
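The script run by the "Verify DNS and network connectivity" task is visible verbatim in the _raw_params of the ansible.legacy.command invocation above (executed with _uses_shell: true, i.e. a shell task). Pulled out of the JSON escaping, it amounts to the standalone check below; this is a reconstruction from the logged arguments, not the contents of check_network_dns.yml itself.

#!/usr/bin/env bash
# Sketch reconstructed from the logged _raw_params; stops at the first host
# that either does not resolve or cannot be reached over HTTPS.
set -euo pipefail

echo CHECK DNS AND CONNECTIVITY
for host in mirrors.fedoraproject.org mirrors.centos.org; do
    # getent hosts exits non-zero when the name cannot be resolved
    if ! getent hosts "$host"; then
        echo FAILED to lookup host "$host"
        exit 1
    fi
    # curl exits non-zero when the HTTPS connection cannot be established;
    # -o /dev/null discards the response body, only reachability matters
    if ! curl -o /dev/null https://"$host"; then
        echo FAILED to contact host "$host"
        exit 1
    fi
done

Note that the raw module result earlier reports "changed": true while the displayed task result shows "changed": false, which suggests the task sets changed_when: false; that is an inference from the log, not something the log states directly.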
15500 1727096245.92196: results queue empty
15500 1727096245.92197: checking for any_errors_fatal
15500 1727096245.92207: done checking for any_errors_fatal
15500 1727096245.92208: checking for max_fail_percentage
15500 1727096245.92210: done checking for max_fail_percentage
15500 1727096245.92211: checking to see if all hosts have failed and the running result is not ok
15500 1727096245.92212: done checking to see if all hosts have failed
15500 1727096245.92212: getting the remaining hosts for this loop
15500 1727096245.92218: done getting the remaining hosts for this loop
15500 1727096245.92222: getting the next task for host managed_node1
15500 1727096245.92232: done getting next task for host managed_node1
15500 1727096245.92235: ^ task is: TASK: meta (flush_handlers)
15500 1727096245.92238: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15500 1727096245.92243: getting variables
15500 1727096245.92245: in VariableManager get_vars()
15500 1727096245.92276: Calling all_inventory to load vars for managed_node1
15500 1727096245.92279: Calling groups_inventory to load vars for managed_node1
15500 1727096245.92283: Calling all_plugins_inventory to load vars for managed_node1
15500 1727096245.92295: Calling all_plugins_play to load vars for managed_node1
15500 1727096245.92299: Calling groups_plugins_inventory to load vars for managed_node1
15500 1727096245.92302: Calling groups_plugins_play to load vars for managed_node1
15500 1727096245.94273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15500 1727096245.95924: done with get_vars()
15500 1727096245.95950: done getting variables
15500 1727096245.96046: in VariableManager get_vars()
15500 1727096245.96059: Calling all_inventory to load vars for managed_node1
15500 1727096245.96061: Calling groups_inventory to load vars for managed_node1
15500 1727096245.96070: Calling all_plugins_inventory to load vars for managed_node1
15500 1727096245.96078: Calling all_plugins_play to load vars for managed_node1
15500 1727096245.96082: Calling groups_plugins_inventory to load vars for managed_node1
15500 1727096245.96087: Calling groups_plugins_play to load vars for managed_node1
15500 1727096245.97269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15500 1727096245.98835: done with get_vars()
15500 1727096245.98870: done queuing things up, now waiting for results queue to drain
15500 1727096245.98874: results queue empty
15500 1727096245.98875: checking for any_errors_fatal
15500 1727096245.98879: done checking for any_errors_fatal
15500 1727096245.98880: checking for max_fail_percentage
15500 1727096245.98881: done checking for max_fail_percentage
15500 1727096245.98882: checking to see if all hosts have failed and the running result is not ok
15500 1727096245.98883: done checking to see if all hosts have failed
15500 1727096245.98883: getting the remaining hosts for this loop
15500 1727096245.98884: done getting the remaining hosts for this loop
15500 1727096245.98887: getting the next task for host managed_node1
15500 1727096245.98891: done getting next task for host managed_node1
15500 1727096245.98893: ^ task is: TASK: meta (flush_handlers)
15500 1727096245.98894: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15500 1727096245.98899: getting variables
15500 1727096245.98900: in VariableManager get_vars()
15500 1727096245.98909: Calling all_inventory to load vars for managed_node1
15500 1727096245.98911: Calling groups_inventory to load vars for managed_node1
15500 1727096245.98913: Calling all_plugins_inventory to load vars for managed_node1
15500 1727096245.98918: Calling all_plugins_play to load vars for managed_node1
15500 1727096245.98921: Calling groups_plugins_inventory to load vars for managed_node1
15500 1727096245.98923: Calling groups_plugins_play to load vars for managed_node1
15500 1727096246.00772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15500 1727096246.02350: done with get_vars()
15500 1727096246.02379: done getting variables
15500 1727096246.02431: in VariableManager get_vars()
15500 1727096246.02440: Calling all_inventory to load vars for managed_node1
15500 1727096246.02442: Calling groups_inventory to load vars for managed_node1
15500 1727096246.02445: Calling all_plugins_inventory to load vars for managed_node1
15500 1727096246.02450: Calling all_plugins_play to load vars for managed_node1
15500 1727096246.02452: Calling groups_plugins_inventory to load vars for managed_node1
15500 1727096246.02455: Calling groups_plugins_play to load vars for managed_node1
15500 1727096246.03678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15500 1727096246.05379: done with get_vars()
15500 1727096246.05407: done queuing things up, now waiting for results queue to drain
15500 1727096246.05410: results queue empty
15500 1727096246.05410: checking for any_errors_fatal
15500 1727096246.05412: done checking for any_errors_fatal
15500 1727096246.05413: checking for max_fail_percentage
15500 1727096246.05414: done checking for max_fail_percentage
15500 1727096246.05415: checking to see if all hosts have failed and the running result is not ok
15500 1727096246.05415: done checking to see if all hosts have failed
15500 1727096246.05416: getting the remaining hosts for this loop
15500 1727096246.05417: done getting the remaining hosts for this loop
15500 1727096246.05420: getting the next task for host managed_node1
15500 1727096246.05423: done getting next task for host managed_node1
15500 1727096246.05424: ^ task is: None
15500 1727096246.05425: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15500 1727096246.05426: done queuing things up, now waiting for results queue to drain
15500 1727096246.05427: results queue empty
15500 1727096246.05428: checking for any_errors_fatal
15500 1727096246.05428: done checking for any_errors_fatal
15500 1727096246.05429: checking for max_fail_percentage
15500 1727096246.05430: done checking for max_fail_percentage
15500 1727096246.05431: checking to see if all hosts have failed and the running result is not ok
15500 1727096246.05431: done checking to see if all hosts have failed
15500 1727096246.05432: getting the next task for host managed_node1
15500 1727096246.05435: done getting next task for host managed_node1
15500 1727096246.05436: ^ task is: None
15500 1727096246.05437: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node1 : ok=82 changed=3 unreachable=0 failed=0 skipped=71 rescued=0 ignored=2

Monday 23 September 2024 08:57:26 -0400 (0:00:00.979) 0:00:46.098 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.18s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.98s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.92s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.70s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:6
Gather current interface info ------------------------------------------- 1.62s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Gathering Facts --------------------------------------------------------- 1.16s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Gathering Facts --------------------------------------------------------- 1.13s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:3
Gathering Facts --------------------------------------------------------- 1.09s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 1.09s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 1.08s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.07s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 1.02s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 1.00s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 0.98s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:64
Gathering Facts --------------------------------------------------------- 0.98s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Verify DNS and network connectivity ------------------------------------- 0.98s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Gathering Facts --------------------------------------------------------- 0.98s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.95s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gathering Facts --------------------------------------------------------- 0.95s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:17
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.95s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
15500 1727096246.05546: RUNNING CLEANUP
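The per-task timing table above (task name, elapsed seconds, and the file:line the task came from) is the kind of summary printed by a profiling callback such as ansible.posix.profile_tasks; the log does not name the callback, so that attribution is an assumption. A comparable run of the bridge test playbook with such a callback enabled could look roughly like the sketch below (the inventory path is a placeholder for the one used in this run).

# Sketch only: assumes the ansible.posix collection is installed and that the
# collection/test paths still match the temporary ones shown in this log.
ANSIBLE_CALLBACKS_ENABLED=ansible.posix.profile_tasks \
    ansible-playbook -vv \
    -i inventory.yml \
    /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml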