2025-03-22 21:30:39.725144 | Job console starting...
2025-03-22 21:30:39.732842 | Updating repositories
2025-03-22 21:30:39.775277 | Preparing job workspace
2025-03-22 21:30:41.364988 | Running Ansible setup...
2025-03-22 21:30:46.153786 | PRE-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/pre.yaml@main]
2025-03-22 21:30:46.781294 |
2025-03-22 21:30:46.781418 | PLAY [Base pre]
2025-03-22 21:30:46.810271 |
2025-03-22 21:30:46.810397 | TASK [Setup log path fact]
2025-03-22 21:30:46.846801 | orchestrator | ok
2025-03-22 21:30:46.866797 |
2025-03-22 21:30:46.866915 | TASK [set-zuul-log-path-fact : Set log path for a build]
2025-03-22 21:30:46.910302 | orchestrator | ok
2025-03-22 21:30:46.927701 |
2025-03-22 21:30:46.927798 | TASK [emit-job-header : Print job information]
2025-03-22 21:30:46.981143 | # Job Information
2025-03-22 21:30:46.981286 | Ansible Version: 2.15.3
2025-03-22 21:30:46.981319 | Job: testbed-deploy-stable-in-a-nutshell-ubuntu-24.04
2025-03-22 21:30:46.981375 | Pipeline: post
2025-03-22 21:30:46.981401 | Executor: 7d211f194f6a
2025-03-22 21:30:46.981420 | Triggered by: https://github.com/osism/testbed/commit/a3ba64258433d8b872eddcefffe90d03148a7a4f
2025-03-22 21:30:46.981439 | Event ID: e25c8768-0764-11f0-96d2-d3b352d72870
2025-03-22 21:30:46.988414 |
2025-03-22 21:30:46.988508 | LOOP [emit-job-header : Print node information]
2025-03-22 21:30:47.157521 | orchestrator | ok:
2025-03-22 21:30:47.157683 | orchestrator | # Node Information
2025-03-22 21:30:47.157718 | orchestrator | Inventory Hostname: orchestrator
2025-03-22 21:30:47.157743 | orchestrator | Hostname: zuul-static-regiocloud-infra-1
2025-03-22 21:30:47.157765 | orchestrator | Username: zuul-testbed06
2025-03-22 21:30:47.157784 | orchestrator | Distro: Debian 12.10
2025-03-22 21:30:47.157807 | orchestrator | Provider: static-testbed
2025-03-22 21:30:47.157827 | orchestrator | Label: testbed-orchestrator
2025-03-22 21:30:47.157848 | orchestrator | Product Name: OpenStack Nova
2025-03-22 21:30:47.157867 | orchestrator | Interface IP: 81.163.193.140
2025-03-22 21:30:47.182764 |
2025-03-22 21:30:47.183001 | TASK [log-inventory : Ensure Zuul Ansible directory exists]
2025-03-22 21:30:47.719473 | orchestrator -> localhost | changed
2025-03-22 21:30:47.731919 |
2025-03-22 21:30:47.732532 | TASK [log-inventory : Copy ansible inventory to logs dir]
2025-03-22 21:30:48.900494 | orchestrator -> localhost | changed
2025-03-22 21:30:48.916491 |
2025-03-22 21:30:48.916620 | TASK [add-build-sshkey : Check to see if ssh key was already created for this build]
2025-03-22 21:30:49.224764 | orchestrator -> localhost | ok
2025-03-22 21:30:49.246068 |
2025-03-22 21:30:49.246249 | TASK [add-build-sshkey : Create a new key in workspace based on build UUID]
2025-03-22 21:30:49.283069 | orchestrator | ok
2025-03-22 21:30:49.301486 | orchestrator | included: /var/lib/zuul/builds/0ca07ad8f44c4220a2f2e2b88aef6038/trusted/project_1/opendev.org/zuul/zuul-jobs/roles/add-build-sshkey/tasks/create-key-and-replace.yaml
2025-03-22 21:30:49.311151 |
2025-03-22 21:30:49.311252 | TASK [add-build-sshkey : Create Temp SSH key]
2025-03-22 21:30:50.359031 | orchestrator -> localhost | Generating public/private rsa key pair.
2025-03-22 21:30:50.359263 | orchestrator -> localhost | Your identification has been saved in /var/lib/zuul/builds/0ca07ad8f44c4220a2f2e2b88aef6038/work/0ca07ad8f44c4220a2f2e2b88aef6038_id_rsa
2025-03-22 21:30:50.359300 | orchestrator -> localhost | Your public key has been saved in /var/lib/zuul/builds/0ca07ad8f44c4220a2f2e2b88aef6038/work/0ca07ad8f44c4220a2f2e2b88aef6038_id_rsa.pub
2025-03-22 21:30:50.359324 | orchestrator -> localhost | The key fingerprint is:
2025-03-22 21:30:50.359346 | orchestrator -> localhost | SHA256:v4Ve/SUFudjkFa13BOIlR4EetD+S323tLK88JmG/uG0 zuul-build-sshkey
2025-03-22 21:30:50.359408 | orchestrator -> localhost | The key's randomart image is:
2025-03-22 21:30:50.359430 | orchestrator -> localhost | +---[RSA 3072]----+
2025-03-22 21:30:50.359451 | orchestrator -> localhost | | .+o*+.|
2025-03-22 21:30:50.359471 | orchestrator -> localhost | | .o* .+|
2025-03-22 21:30:50.359502 | orchestrator -> localhost | | .o.+o.|
2025-03-22 21:30:50.359522 | orchestrator -> localhost | | .B.+o|
2025-03-22 21:30:50.359541 | orchestrator -> localhost | | S + *.o|
2025-03-22 21:30:50.359560 | orchestrator -> localhost | | . .o+ +o|
2025-03-22 21:30:50.359585 | orchestrator -> localhost | | o.oo+ *|
2025-03-22 21:30:50.359605 | orchestrator -> localhost | | . +.+E*.|
2025-03-22 21:30:50.359624 | orchestrator -> localhost | | o o==*=|
2025-03-22 21:30:50.359643 | orchestrator -> localhost | +----[SHA256]-----+
2025-03-22 21:30:50.359700 | orchestrator -> localhost | ok: Runtime: 0:00:00.522164
2025-03-22 21:30:50.369392 |
2025-03-22 21:30:50.369513 | TASK [add-build-sshkey : Remote setup ssh keys (linux)]
2025-03-22 21:30:50.401767 | orchestrator | ok
2025-03-22 21:30:50.414231 | orchestrator | included: /var/lib/zuul/builds/0ca07ad8f44c4220a2f2e2b88aef6038/trusted/project_1/opendev.org/zuul/zuul-jobs/roles/add-build-sshkey/tasks/remote-linux.yaml
2025-03-22 21:30:50.427109 |
2025-03-22 21:30:50.427251 | TASK [add-build-sshkey : Remove previously added zuul-build-sshkey]
2025-03-22 21:30:50.452453 | orchestrator | skipping: Conditional result was False
2025-03-22 21:30:50.461866 |
2025-03-22 21:30:50.461993 | TASK [add-build-sshkey : Enable access via build key on all nodes]
2025-03-22 21:30:51.119649 | orchestrator | changed
2025-03-22 21:30:51.130748 |
2025-03-22 21:30:51.130879 | TASK [add-build-sshkey : Make sure user has a .ssh]
2025-03-22 21:30:51.427104 | orchestrator | ok
2025-03-22 21:30:51.436841 |
2025-03-22 21:30:51.436974 | TASK [add-build-sshkey : Install build private key as SSH key on all nodes]
2025-03-22 21:30:51.847159 | orchestrator | ok
2025-03-22 21:30:51.856063 |
2025-03-22 21:30:51.856182 | TASK [add-build-sshkey : Install build public key as SSH key on all nodes]
2025-03-22 21:30:52.236878 | orchestrator | ok
2025-03-22 21:30:52.246093 |
2025-03-22 21:30:52.246212 | TASK [add-build-sshkey : Remote setup ssh keys (windows)]
2025-03-22 21:30:52.282051 | orchestrator | skipping: Conditional result was False
2025-03-22 21:30:52.330457 |
2025-03-22 21:30:52.330585 | TASK [remove-zuul-sshkey : Remove master key from local agent]
2025-03-22 21:30:53.124837 | orchestrator -> localhost | changed
2025-03-22 21:30:53.178559 |
2025-03-22 21:30:53.178710 | TASK [add-build-sshkey : Add back temp key]
2025-03-22 21:30:53.739648 | orchestrator -> localhost | Identity added: /var/lib/zuul/builds/0ca07ad8f44c4220a2f2e2b88aef6038/work/0ca07ad8f44c4220a2f2e2b88aef6038_id_rsa (zuul-build-sshkey)
2025-03-22 21:30:53.739911 | orchestrator -> localhost | ok: Runtime:
0:00:00.024733 2025-03-22 21:30:53.753017 | 2025-03-22 21:30:53.753138 | TASK [add-build-sshkey : Verify we can still SSH to all nodes] 2025-03-22 21:30:54.133650 | orchestrator | ok 2025-03-22 21:30:54.151179 | 2025-03-22 21:30:54.151481 | TASK [add-build-sshkey : Verify we can still SSH to all nodes (windows)] 2025-03-22 21:30:54.247404 | orchestrator | skipping: Conditional result was False 2025-03-22 21:30:54.287576 | 2025-03-22 21:30:54.287752 | TASK [start-zuul-console : Start zuul_console daemon.] 2025-03-22 21:30:54.691661 | orchestrator | ok 2025-03-22 21:30:54.721580 | 2025-03-22 21:30:54.721735 | TASK [validate-host : Define zuul_info_dir fact] 2025-03-22 21:30:54.821449 | orchestrator | ok 2025-03-22 21:30:54.831294 | 2025-03-22 21:30:54.831455 | TASK [validate-host : Ensure Zuul Ansible directory exists] 2025-03-22 21:30:55.226099 | orchestrator -> localhost | ok 2025-03-22 21:30:55.234731 | 2025-03-22 21:30:55.234828 | TASK [validate-host : Collect information about the host] 2025-03-22 21:30:56.281890 | orchestrator | ok 2025-03-22 21:30:56.296962 | 2025-03-22 21:30:56.297055 | TASK [validate-host : Sanitize hostname] 2025-03-22 21:30:56.379460 | orchestrator | ok 2025-03-22 21:30:56.386755 | 2025-03-22 21:30:56.386849 | TASK [validate-host : Write out all ansible variables/facts known for each host] 2025-03-22 21:30:56.899122 | orchestrator -> localhost | changed 2025-03-22 21:30:56.907101 | 2025-03-22 21:30:56.907201 | TASK [validate-host : Collect information about zuul worker] 2025-03-22 21:30:57.468681 | orchestrator | ok 2025-03-22 21:30:57.496564 | 2025-03-22 21:30:57.496667 | TASK [validate-host : Write out all zuul information for each host] 2025-03-22 21:30:58.380910 | orchestrator -> localhost | changed 2025-03-22 21:30:58.405201 | 2025-03-22 21:30:58.405304 | TASK [prepare-workspace-log : Start zuul_console daemon.] 2025-03-22 21:30:58.664425 | orchestrator | ok 2025-03-22 21:30:58.671330 | 2025-03-22 21:30:58.671445 | TASK [prepare-workspace-log : Synchronize src repos to workspace directory.] 2025-03-22 21:31:43.589412 | orchestrator | changed: 2025-03-22 21:31:43.589626 | orchestrator | .d..t...... src/ 2025-03-22 21:31:43.589663 | orchestrator | .d..t...... src/github.com/ 2025-03-22 21:31:43.589687 | orchestrator | .d..t...... src/github.com/osism/ 2025-03-22 21:31:43.589708 | orchestrator | .d..t...... src/github.com/osism/ansible-collection-commons/ 2025-03-22 21:31:43.589728 | orchestrator | RedHat.yml 2025-03-22 21:31:43.604587 | orchestrator | .L..t...... src/github.com/osism/ansible-collection-commons/roles/repository/tasks/CentOS.yml -> RedHat.yml 2025-03-22 21:31:43.604604 | orchestrator | RedHat.yml 2025-03-22 21:31:43.604657 | orchestrator | = 1.53.0"... 2025-03-22 21:31:55.760843 | orchestrator | 21:31:55.760 STDOUT terraform: - Finding hashicorp/local versions matching ">= 2.2.0"... 2025-03-22 21:31:56.724728 | orchestrator | 21:31:56.724 STDOUT terraform: - Installing hashicorp/null v3.2.3... 2025-03-22 21:31:57.418379 | orchestrator | 21:31:57.418 STDOUT terraform: - Installed hashicorp/null v3.2.3 (signed, key ID 0C0AF313E5FD9F80) 2025-03-22 21:31:58.597646 | orchestrator | 21:31:58.597 STDOUT terraform: - Installing terraform-provider-openstack/openstack v3.0.0... 2025-03-22 21:31:59.729693 | orchestrator | 21:31:59.729 STDOUT terraform: - Installed terraform-provider-openstack/openstack v3.0.0 (signed, key ID 4F80527A391BEFD2) 2025-03-22 21:32:00.707062 | orchestrator | 21:32:00.706 STDOUT terraform: - Installing hashicorp/local v2.5.2... 
2025-03-22 21:32:01.474097 | orchestrator | 21:32:01.473 STDOUT terraform: - Installed hashicorp/local v2.5.2 (signed, key ID 0C0AF313E5FD9F80) 2025-03-22 21:32:01.474522 | orchestrator | 21:32:01.473 STDOUT terraform: Providers are signed by their developers. 2025-03-22 21:32:01.474536 | orchestrator | 21:32:01.473 STDOUT terraform: If you'd like to know more about provider signing, you can read about it here: 2025-03-22 21:32:01.474543 | orchestrator | 21:32:01.473 STDOUT terraform: https://opentofu.org/docs/cli/plugins/signing/ 2025-03-22 21:32:01.474550 | orchestrator | 21:32:01.473 STDOUT terraform: OpenTofu has created a lock file .terraform.lock.hcl to record the provider 2025-03-22 21:32:01.474556 | orchestrator | 21:32:01.473 STDOUT terraform: selections it made above. Include this file in your version control repository 2025-03-22 21:32:01.474564 | orchestrator | 21:32:01.473 STDOUT terraform: so that OpenTofu can guarantee to make the same selections by default when 2025-03-22 21:32:01.619795 | orchestrator | 21:32:01.474 STDOUT terraform: you run "tofu init" in the future. 2025-03-22 21:32:01.619908 | orchestrator | 21:32:01.474 STDOUT terraform: OpenTofu has been successfully initialized! 2025-03-22 21:32:01.619917 | orchestrator | 21:32:01.474 STDOUT terraform: You may now begin working with OpenTofu. Try running "tofu plan" to see 2025-03-22 21:32:01.619923 | orchestrator | 21:32:01.474 STDOUT terraform: any changes that are required for your infrastructure. All OpenTofu commands 2025-03-22 21:32:01.619929 | orchestrator | 21:32:01.474 STDOUT terraform: should now work. 2025-03-22 21:32:01.619939 | orchestrator | 21:32:01.474 STDOUT terraform: If you ever set or change modules or backend configuration for OpenTofu, 2025-03-22 21:32:01.619945 | orchestrator | 21:32:01.474 STDOUT terraform: rerun this command to reinitialize your working directory. If you forget, other 2025-03-22 21:32:01.619950 | orchestrator | 21:32:01.474 STDOUT terraform: commands will detect it and remind you to do so if necessary. 2025-03-22 21:32:01.619980 | orchestrator | 21:32:01.619 WARN  The `TERRAGRUNT_TFPATH` environment variable is deprecated and will be removed in a future version of Terragrunt. Use `TG_TF_PATH=/home/zuul-testbed06/terraform` instead. 2025-03-22 21:32:01.788938 | orchestrator | 21:32:01.788 STDOUT terraform: Created and switched to workspace "ci"! 2025-03-22 21:32:01.926567 | orchestrator | 21:32:01.788 STDOUT terraform: You're now on a new, empty workspace. Workspaces isolate their state, 2025-03-22 21:32:01.926641 | orchestrator | 21:32:01.788 STDOUT terraform: so if you run "tofu plan" OpenTofu will not see any existing state 2025-03-22 21:32:01.926649 | orchestrator | 21:32:01.788 STDOUT terraform: for this configuration. 2025-03-22 21:32:01.926672 | orchestrator | 21:32:01.926 WARN  The `TERRAGRUNT_TFPATH` environment variable is deprecated and will be removed in a future version of Terragrunt. Use `TG_TF_PATH=/home/zuul-testbed06/terraform` instead. 2025-03-22 21:32:02.024128 | orchestrator | 21:32:02.023 STDOUT terraform: ci.auto.tfvars 2025-03-22 21:32:02.191473 | orchestrator | 21:32:02.191 WARN  The `TERRAGRUNT_TFPATH` environment variable is deprecated and will be removed in a future version of Terragrunt. Use `TG_TF_PATH=/home/zuul-testbed06/terraform` instead. 2025-03-22 21:32:02.951432 | orchestrator | 21:32:02.951 STDOUT terraform: data.openstack_networking_network_v2.public: Reading... 
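The init and workspace output above implies a provider requirements block roughly like the sketch below. This is for orientation only, not the testbed repository's actual configuration; the openstack constraint is inferred from the truncated ">= 1.53.0" fragment earlier in the log, and the other constraints are assumptions consistent with the versions init resolved.

terraform {
  required_providers {
    openstack = {
      source  = "terraform-provider-openstack/openstack"
      version = ">= 1.53.0" # assumed constraint; init resolved v3.0.0
    }
    local = {
      source  = "hashicorp/local"
      version = ">= 2.2.0" # init resolved v2.5.2
    }
    null = {
      source = "hashicorp/null" # constraint not visible in the log; init resolved v3.2.3
    }
  }
}

Terragrunt then creates and switches to the empty "ci" workspace and picks up ci.auto.tfvars before the plan below is generated, which is why no prior state is visible.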
2025-03-22 21:32:03.475460 | orchestrator | 21:32:03.475 STDOUT terraform: data.openstack_networking_network_v2.public: Read complete after 0s [id=e6be7364-bfd8-4de7-8120-8f41c69a139a] 2025-03-22 21:32:03.739454 | orchestrator | 21:32:03.739 STDOUT terraform: OpenTofu used the selected providers to generate the following execution 2025-03-22 21:32:03.739538 | orchestrator | 21:32:03.739 STDOUT terraform: plan. Resource actions are indicated with the following symbols: 2025-03-22 21:32:03.739572 | orchestrator | 21:32:03.739 STDOUT terraform:  + create 2025-03-22 21:32:03.739655 | orchestrator | 21:32:03.739 STDOUT terraform:  <= read (data resources) 2025-03-22 21:32:03.739723 | orchestrator | 21:32:03.739 STDOUT terraform: OpenTofu will perform the following actions: 2025-03-22 21:32:03.739857 | orchestrator | 21:32:03.739 STDOUT terraform:  # data.openstack_images_image_v2.image will be read during apply 2025-03-22 21:32:03.739930 | orchestrator | 21:32:03.739 STDOUT terraform:  # (config refers to values not yet known) 2025-03-22 21:32:03.740011 | orchestrator | 21:32:03.739 STDOUT terraform:  <= data "openstack_images_image_v2" "image" { 2025-03-22 21:32:03.740098 | orchestrator | 21:32:03.739 STDOUT terraform:  + checksum = (known after apply) 2025-03-22 21:32:03.740171 | orchestrator | 21:32:03.740 STDOUT terraform:  + created_at = (known after apply) 2025-03-22 21:32:03.740232 | orchestrator | 21:32:03.740 STDOUT terraform:  + file = (known after apply) 2025-03-22 21:32:03.740303 | orchestrator | 21:32:03.740 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.740398 | orchestrator | 21:32:03.740 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.740496 | orchestrator | 21:32:03.740 STDOUT terraform:  + min_disk_gb = (known after apply) 2025-03-22 21:32:03.740569 | orchestrator | 21:32:03.740 STDOUT terraform:  + min_ram_mb = (known after apply) 2025-03-22 21:32:03.740647 | orchestrator | 21:32:03.740 STDOUT terraform:  + most_recent = true 2025-03-22 21:32:03.740709 | orchestrator | 21:32:03.740 STDOUT terraform:  + name = (known after apply) 2025-03-22 21:32:03.740774 | orchestrator | 21:32:03.740 STDOUT terraform:  + protected = (known after apply) 2025-03-22 21:32:03.740855 | orchestrator | 21:32:03.740 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.740928 | orchestrator | 21:32:03.740 STDOUT terraform:  + schema = (known after apply) 2025-03-22 21:32:03.740998 | orchestrator | 21:32:03.740 STDOUT terraform:  + size_bytes = (known after apply) 2025-03-22 21:32:03.741082 | orchestrator | 21:32:03.740 STDOUT terraform:  + tags = (known after apply) 2025-03-22 21:32:03.741156 | orchestrator | 21:32:03.741 STDOUT terraform:  + updated_at = (known after apply) 2025-03-22 21:32:03.741191 | orchestrator | 21:32:03.741 STDOUT terraform:  } 2025-03-22 21:32:03.741370 | orchestrator | 21:32:03.741 STDOUT terraform:  # data.openstack_images_image_v2.image_node will be read during apply 2025-03-22 21:32:03.741449 | orchestrator | 21:32:03.741 STDOUT terraform:  # (config refers to values not yet known) 2025-03-22 21:32:03.741550 | orchestrator | 21:32:03.741 STDOUT terraform:  <= data "openstack_images_image_v2" "image_node" { 2025-03-22 21:32:03.741628 | orchestrator | 21:32:03.741 STDOUT terraform:  + checksum = (known after apply) 2025-03-22 21:32:03.741693 | orchestrator | 21:32:03.741 STDOUT terraform:  + created_at = (known after apply) 2025-03-22 21:32:03.741771 | orchestrator | 21:32:03.741 STDOUT terraform:  + file = (known 
after apply) 2025-03-22 21:32:03.741857 | orchestrator | 21:32:03.741 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.741928 | orchestrator | 21:32:03.741 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.741985 | orchestrator | 21:32:03.741 STDOUT terraform:  + min_disk_gb = (known after apply) 2025-03-22 21:32:03.742089 | orchestrator | 21:32:03.741 STDOUT terraform:  + min_ram_mb = (known after apply) 2025-03-22 21:32:03.742155 | orchestrator | 21:32:03.742 STDOUT terraform:  + most_recent = true 2025-03-22 21:32:03.742227 | orchestrator | 21:32:03.742 STDOUT terraform:  + name = (known after apply) 2025-03-22 21:32:03.742309 | orchestrator | 21:32:03.742 STDOUT terraform:  + protected = (known after apply) 2025-03-22 21:32:03.742447 | orchestrator | 21:32:03.742 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.742510 | orchestrator | 21:32:03.742 STDOUT terraform:  + schema = (known after apply) 2025-03-22 21:32:03.742587 | orchestrator | 21:32:03.742 STDOUT terraform:  + size_bytes = (known after apply) 2025-03-22 21:32:03.742649 | orchestrator | 21:32:03.742 STDOUT terraform:  + tags = (known after apply) 2025-03-22 21:32:03.742721 | orchestrator | 21:32:03.742 STDOUT terraform:  + updated_at = (known after apply) 2025-03-22 21:32:03.742748 | orchestrator | 21:32:03.742 STDOUT terraform:  } 2025-03-22 21:32:03.742812 | orchestrator | 21:32:03.742 STDOUT terraform:  # local_file.MANAGER_ADDRESS will be created 2025-03-22 21:32:03.742887 | orchestrator | 21:32:03.742 STDOUT terraform:  + resource "local_file" "MANAGER_ADDRESS" { 2025-03-22 21:32:03.750140 | orchestrator | 21:32:03.742 STDOUT terraform:  + content = (known after apply) 2025-03-22 21:32:03.750195 | orchestrator | 21:32:03.742 STDOUT terraform:  + content_base64sha256 = (known after apply) 2025-03-22 21:32:03.750215 | orchestrator | 21:32:03.743 STDOUT terraform:  + content_base64sha512 = (known after apply) 2025-03-22 21:32:03.750221 | orchestrator | 21:32:03.743 STDOUT terraform:  + content_md5 = (known after apply) 2025-03-22 21:32:03.750230 | orchestrator | 21:32:03.743 STDOUT terraform:  + content_sha1 = (known after apply) 2025-03-22 21:32:03.750236 | orchestrator | 21:32:03.743 STDOUT terraform:  + content_sha256 = (known after apply) 2025-03-22 21:32:03.750241 | orchestrator | 21:32:03.743 STDOUT terraform:  + content_sha512 = (known after apply) 2025-03-22 21:32:03.750246 | orchestrator | 21:32:03.743 STDOUT terraform:  + directory_permission = "0777" 2025-03-22 21:32:03.750252 | orchestrator | 21:32:03.743 STDOUT terraform:  + file_permission = "0644" 2025-03-22 21:32:03.750258 | orchestrator | 21:32:03.743 STDOUT terraform:  + filename = ".MANAGER_ADDRESS.ci" 2025-03-22 21:32:03.750264 | orchestrator | 21:32:03.743 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.750269 | orchestrator | 21:32:03.743 STDOUT terraform:  } 2025-03-22 21:32:03.750274 | orchestrator | 21:32:03.743 STDOUT terraform:  # local_file.id_rsa_pub will be created 2025-03-22 21:32:03.750280 | orchestrator | 21:32:03.743 STDOUT terraform:  + resource "local_file" "id_rsa_pub" { 2025-03-22 21:32:03.750284 | orchestrator | 21:32:03.743 STDOUT terraform:  + content = (known after apply) 2025-03-22 21:32:03.750289 | orchestrator | 21:32:03.743 STDOUT terraform:  + content_base64sha256 = (known after apply) 2025-03-22 21:32:03.750294 | orchestrator | 21:32:03.743 STDOUT terraform:  + content_base64sha512 = (known after apply) 2025-03-22 21:32:03.750299 | orchestrator | 
21:32:03.743 STDOUT terraform:  + content_md5 = (known after apply) 2025-03-22 21:32:03.750304 | orchestrator | 21:32:03.744 STDOUT terraform:  + content_sha1 = (known after apply) 2025-03-22 21:32:03.750309 | orchestrator | 21:32:03.744 STDOUT terraform:  + content_sha256 = (known after apply) 2025-03-22 21:32:03.750314 | orchestrator | 21:32:03.744 STDOUT terraform:  + content_sha512 = (known after apply) 2025-03-22 21:32:03.750319 | orchestrator | 21:32:03.744 STDOUT terraform:  + directory_permission = "0777" 2025-03-22 21:32:03.750323 | orchestrator | 21:32:03.744 STDOUT terraform:  + file_permission = "0644" 2025-03-22 21:32:03.750344 | orchestrator | 21:32:03.744 STDOUT terraform:  + filename = ".id_rsa.ci.pub" 2025-03-22 21:32:03.750349 | orchestrator | 21:32:03.744 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.750354 | orchestrator | 21:32:03.744 STDOUT terraform:  } 2025-03-22 21:32:03.750359 | orchestrator | 21:32:03.744 STDOUT terraform:  # local_file.inventory will be created 2025-03-22 21:32:03.750364 | orchestrator | 21:32:03.744 STDOUT terraform:  + resource "local_file" "inventory" { 2025-03-22 21:32:03.750369 | orchestrator | 21:32:03.744 STDOUT terraform:  + content = (known after apply) 2025-03-22 21:32:03.750374 | orchestrator | 21:32:03.744 STDOUT terraform:  + content_base64sha256 = (known after apply) 2025-03-22 21:32:03.750379 | orchestrator | 21:32:03.744 STDOUT terraform:  + content_base64sha512 = (known after apply) 2025-03-22 21:32:03.750384 | orchestrator | 21:32:03.744 STDOUT terraform:  + content_md5 = (known after apply) 2025-03-22 21:32:03.750393 | orchestrator | 21:32:03.744 STDOUT terraform:  + content_sha1 = (known after apply) 2025-03-22 21:32:03.750398 | orchestrator | 21:32:03.744 STDOUT terraform:  + content_sha256 = (known after apply) 2025-03-22 21:32:03.750403 | orchestrator | 21:32:03.745 STDOUT terraform:  + content_sha512 = (known after apply) 2025-03-22 21:32:03.750407 | orchestrator | 21:32:03.745 STDOUT terraform:  + directory_permission = "0777" 2025-03-22 21:32:03.750412 | orchestrator | 21:32:03.745 STDOUT terraform:  + file_permission = "0644" 2025-03-22 21:32:03.750425 | orchestrator | 21:32:03.745 STDOUT terraform:  + filename = "inventory.ci" 2025-03-22 21:32:03.750431 | orchestrator | 21:32:03.745 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.750437 | orchestrator | 21:32:03.746 STDOUT terraform:  } 2025-03-22 21:32:03.750442 | orchestrator | 21:32:03.746 STDOUT terraform:  # local_sensitive_file.id_rsa will be created 2025-03-22 21:32:03.750447 | orchestrator | 21:32:03.746 STDOUT terraform:  + resource "local_sensitive_file" "id_rsa" { 2025-03-22 21:32:03.750453 | orchestrator | 21:32:03.746 STDOUT terraform:  + content = (sensitive value) 2025-03-22 21:32:03.750458 | orchestrator | 21:32:03.746 STDOUT terraform:  + content_base64sha256 = (known after apply) 2025-03-22 21:32:03.750464 | orchestrator | 21:32:03.746 STDOUT terraform:  + content_base64sha512 = (known after apply) 2025-03-22 21:32:03.750470 | orchestrator | 21:32:03.746 STDOUT terraform:  + content_md5 = (known after apply) 2025-03-22 21:32:03.750480 | orchestrator | 21:32:03.746 STDOUT terraform:  + content_sha1 = (known after apply) 2025-03-22 21:32:03.750485 | orchestrator | 21:32:03.746 STDOUT terraform:  + content_sha256 = (known after apply) 2025-03-22 21:32:03.750490 | orchestrator | 21:32:03.746 STDOUT terraform:  + content_sha512 = (known after apply) 2025-03-22 21:32:03.750495 | orchestrator | 21:32:03.746 STDOUT 
terraform:  + directory_permission = "0700" 2025-03-22 21:32:03.750500 | orchestrator | 21:32:03.746 STDOUT terraform:  + file_permission = "0600" 2025-03-22 21:32:03.750505 | orchestrator | 21:32:03.746 STDOUT terraform:  + filename = ".id_rsa.ci" 2025-03-22 21:32:03.750510 | orchestrator | 21:32:03.746 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.750515 | orchestrator | 21:32:03.746 STDOUT terraform:  } 2025-03-22 21:32:03.750520 | orchestrator | 21:32:03.746 STDOUT terraform:  # null_resource.node_semaphore will be created 2025-03-22 21:32:03.750525 | orchestrator | 21:32:03.746 STDOUT terraform:  + resource "null_resource" "node_semaphore" { 2025-03-22 21:32:03.750530 | orchestrator | 21:32:03.746 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.750535 | orchestrator | 21:32:03.746 STDOUT terraform:  } 2025-03-22 21:32:03.750540 | orchestrator | 21:32:03.746 STDOUT terraform:  # openstack_blockstorage_volume_v3.manager_base_volume[0] will be created 2025-03-22 21:32:03.750545 | orchestrator | 21:32:03.747 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "manager_base_volume" { 2025-03-22 21:32:03.750550 | orchestrator | 21:32:03.747 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.750555 | orchestrator | 21:32:03.747 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.750563 | orchestrator | 21:32:03.747 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.750568 | orchestrator | 21:32:03.747 STDOUT terraform:  + image_id = (known after apply) 2025-03-22 21:32:03.750573 | orchestrator | 21:32:03.747 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.750578 | orchestrator | 21:32:03.747 STDOUT terraform:  + name = "testbed-volume-manager-base" 2025-03-22 21:32:03.750583 | orchestrator | 21:32:03.747 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.750588 | orchestrator | 21:32:03.747 STDOUT terraform:  + size = 80 2025-03-22 21:32:03.750593 | orchestrator | 21:32:03.747 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.750598 | orchestrator | 21:32:03.747 STDOUT terraform:  } 2025-03-22 21:32:03.750603 | orchestrator | 21:32:03.747 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_base_volume[0] will be created 2025-03-22 21:32:03.750607 | orchestrator | 21:32:03.747 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_base_volume" { 2025-03-22 21:32:03.750612 | orchestrator | 21:32:03.747 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.750617 | orchestrator | 21:32:03.747 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.750622 | orchestrator | 21:32:03.747 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.750630 | orchestrator | 21:32:03.747 STDOUT terraform:  + image_id = (known after apply) 2025-03-22 21:32:03.750636 | orchestrator | 21:32:03.747 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.750641 | orchestrator | 21:32:03.747 STDOUT terraform:  + name = "testbed-volume-0-node-base" 2025-03-22 21:32:03.750645 | orchestrator | 21:32:03.747 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.750650 | orchestrator | 21:32:03.747 STDOUT terraform:  + size = 80 2025-03-22 21:32:03.750655 | orchestrator | 21:32:03.747 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.750660 | orchestrator | 21:32:03.748 STDOUT terraform:  } 2025-03-22 21:32:03.750665 | orchestrator | 21:32:03.748 STDOUT 
terraform:  # openstack_blockstorage_volume_v3.node_base_volume[1] will be created 2025-03-22 21:32:03.750670 | orchestrator | 21:32:03.748 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_base_volume" { 2025-03-22 21:32:03.750675 | orchestrator | 21:32:03.748 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.750680 | orchestrator | 21:32:03.748 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.750685 | orchestrator | 21:32:03.748 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.750690 | orchestrator | 21:32:03.748 STDOUT terraform:  + image_id = (known after apply) 2025-03-22 21:32:03.750695 | orchestrator | 21:32:03.748 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.750699 | orchestrator | 21:32:03.748 STDOUT terraform:  + name = "testbed-volume-1-node-base" 2025-03-22 21:32:03.750706 | orchestrator | 21:32:03.748 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.750714 | orchestrator | 21:32:03.748 STDOUT terraform:  + size = 80 2025-03-22 21:32:03.750719 | orchestrator | 21:32:03.748 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.750724 | orchestrator | 21:32:03.748 STDOUT terraform:  } 2025-03-22 21:32:03.750729 | orchestrator | 21:32:03.748 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_base_volume[2] will be created 2025-03-22 21:32:03.750734 | orchestrator | 21:32:03.748 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_base_volume" { 2025-03-22 21:32:03.750738 | orchestrator | 21:32:03.748 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.750744 | orchestrator | 21:32:03.748 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.750749 | orchestrator | 21:32:03.748 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.750753 | orchestrator | 21:32:03.748 STDOUT terraform:  + image_id = (known after apply) 2025-03-22 21:32:03.750758 | orchestrator | 21:32:03.748 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.750763 | orchestrator | 21:32:03.748 STDOUT terraform:  + name = "testbed-volume-2-node-base" 2025-03-22 21:32:03.750768 | orchestrator | 21:32:03.748 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.750773 | orchestrator | 21:32:03.749 STDOUT terraform:  + size = 80 2025-03-22 21:32:03.750778 | orchestrator | 21:32:03.749 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.750783 | orchestrator | 21:32:03.749 STDOUT terraform:  } 2025-03-22 21:32:03.750787 | orchestrator | 21:32:03.749 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_base_volume[3] will be created 2025-03-22 21:32:03.750794 | orchestrator | 21:32:03.749 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_base_volume" { 2025-03-22 21:32:03.750799 | orchestrator | 21:32:03.749 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.750804 | orchestrator | 21:32:03.749 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.750809 | orchestrator | 21:32:03.749 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.750814 | orchestrator | 21:32:03.749 STDOUT terraform:  + image_id = (known after apply) 2025-03-22 21:32:03.750821 | orchestrator | 21:32:03.749 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.751093 | orchestrator | 21:32:03.749 STDOUT terraform:  + name = "testbed-volume-3-node-base" 2025-03-22 21:32:03.751101 | orchestrator | 
21:32:03.749 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.751106 | orchestrator | 21:32:03.749 STDOUT terraform:  + size = 80 2025-03-22 21:32:03.751111 | orchestrator | 21:32:03.749 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.751116 | orchestrator | 21:32:03.749 STDOUT terraform:  } 2025-03-22 21:32:03.751122 | orchestrator | 21:32:03.749 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_base_volume[4] will be created 2025-03-22 21:32:03.751130 | orchestrator | 21:32:03.749 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_base_volume" { 2025-03-22 21:32:03.751135 | orchestrator | 21:32:03.749 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.751145 | orchestrator | 21:32:03.749 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.751150 | orchestrator | 21:32:03.749 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.751166 | orchestrator | 21:32:03.749 STDOUT terraform:  + image_id = (known after apply) 2025-03-22 21:32:03.751171 | orchestrator | 21:32:03.749 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.751176 | orchestrator | 21:32:03.749 STDOUT terraform:  + name = "testbed-volume-4-node-base" 2025-03-22 21:32:03.751181 | orchestrator | 21:32:03.750 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.751186 | orchestrator | 21:32:03.750 STDOUT terraform:  + size = 80 2025-03-22 21:32:03.751191 | orchestrator | 21:32:03.750 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.751196 | orchestrator | 21:32:03.750 STDOUT terraform:  } 2025-03-22 21:32:03.751201 | orchestrator | 21:32:03.750 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_base_volume[5] will be created 2025-03-22 21:32:03.751206 | orchestrator | 21:32:03.750 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_base_volume" { 2025-03-22 21:32:03.751210 | orchestrator | 21:32:03.750 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.751215 | orchestrator | 21:32:03.750 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.751220 | orchestrator | 21:32:03.750 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.751234 | orchestrator | 21:32:03.750 STDOUT terraform:  + image_id = (known after apply) 2025-03-22 21:32:03.751239 | orchestrator | 21:32:03.750 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.751244 | orchestrator | 21:32:03.750 STDOUT terraform:  + name = "testbed-volume-5-node-base" 2025-03-22 21:32:03.751249 | orchestrator | 21:32:03.750 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.751254 | orchestrator | 21:32:03.750 STDOUT terraform:  + size = 80 2025-03-22 21:32:03.751259 | orchestrator | 21:32:03.750 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.751264 | orchestrator | 21:32:03.750 STDOUT terraform:  } 2025-03-22 21:32:03.751269 | orchestrator | 21:32:03.750 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[0] will be created 2025-03-22 21:32:03.751273 | orchestrator | 21:32:03.750 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-22 21:32:03.751281 | orchestrator | 21:32:03.750 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.751804 | orchestrator | 21:32:03.750 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.751821 | orchestrator | 21:32:03.750 STDOUT terraform:  + id = (known after apply) 
2025-03-22 21:32:03.751827 | orchestrator | 21:32:03.750 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.751832 | orchestrator | 21:32:03.750 STDOUT terraform:  + name = "testbed-volume-0-node-0" 2025-03-22 21:32:03.751838 | orchestrator | 21:32:03.751 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.751844 | orchestrator | 21:32:03.751 STDOUT terraform:  + size = 20 2025-03-22 21:32:03.751854 | orchestrator | 21:32:03.751 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.751870 | orchestrator | 21:32:03.751 STDOUT terraform:  } 2025-03-22 21:32:03.751875 | orchestrator | 21:32:03.751 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[1] will be created 2025-03-22 21:32:03.751884 | orchestrator | 21:32:03.751 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-22 21:32:03.752529 | orchestrator | 21:32:03.751 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.752577 | orchestrator | 21:32:03.751 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.752585 | orchestrator | 21:32:03.751 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.752591 | orchestrator | 21:32:03.751 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.752597 | orchestrator | 21:32:03.751 STDOUT terraform:  + name = "testbed-volume-1-node-1" 2025-03-22 21:32:03.752603 | orchestrator | 21:32:03.751 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.752610 | orchestrator | 21:32:03.751 STDOUT terraform:  + size = 20 2025-03-22 21:32:03.752616 | orchestrator | 21:32:03.751 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.752622 | orchestrator | 21:32:03.751 STDOUT terraform:  } 2025-03-22 21:32:03.752628 | orchestrator | 21:32:03.751 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[2] will be created 2025-03-22 21:32:03.752647 | orchestrator | 21:32:03.751 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-22 21:32:03.752653 | orchestrator | 21:32:03.751 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.752659 | orchestrator | 21:32:03.751 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.752665 | orchestrator | 21:32:03.751 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.752680 | orchestrator | 21:32:03.751 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.752733 | orchestrator | 21:32:03.751 STDOUT terraform:  + name = "testbed-volume-2-node-2" 2025-03-22 21:32:03.752740 | orchestrator | 21:32:03.751 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.752747 | orchestrator | 21:32:03.752 STDOUT terraform:  + size = 20 2025-03-22 21:32:03.752753 | orchestrator | 21:32:03.752 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.752759 | orchestrator | 21:32:03.752 STDOUT terraform:  } 2025-03-22 21:32:03.752765 | orchestrator | 21:32:03.752 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[3] will be created 2025-03-22 21:32:03.752771 | orchestrator | 21:32:03.752 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-22 21:32:03.752777 | orchestrator | 21:32:03.752 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.752795 | orchestrator | 21:32:03.752 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.752801 | orchestrator | 21:32:03.752 STDOUT terraform:  + id = 
(known after apply) 2025-03-22 21:32:03.752807 | orchestrator | 21:32:03.752 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.752825 | orchestrator | 21:32:03.752 STDOUT terraform:  + name = "testbed-volume-3-node-3" 2025-03-22 21:32:03.752831 | orchestrator | 21:32:03.752 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.752837 | orchestrator | 21:32:03.752 STDOUT terraform:  + size = 20 2025-03-22 21:32:03.752854 | orchestrator | 21:32:03.752 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.752861 | orchestrator | 21:32:03.752 STDOUT terraform:  } 2025-03-22 21:32:03.752871 | orchestrator | 21:32:03.752 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[4] will be created 2025-03-22 21:32:03.752877 | orchestrator | 21:32:03.752 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-22 21:32:03.752886 | orchestrator | 21:32:03.752 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.752893 | orchestrator | 21:32:03.752 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.752899 | orchestrator | 21:32:03.752 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.752905 | orchestrator | 21:32:03.752 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.752913 | orchestrator | 21:32:03.752 STDOUT terraform:  + name = "testbed-volume-4-node-4" 2025-03-22 21:32:03.752948 | orchestrator | 21:32:03.752 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.752958 | orchestrator | 21:32:03.752 STDOUT terraform:  + size = 20 2025-03-22 21:32:03.753002 | orchestrator | 21:32:03.752 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.753064 | orchestrator | 21:32:03.752 STDOUT terraform:  } 2025-03-22 21:32:03.753073 | orchestrator | 21:32:03.752 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[5] will be created 2025-03-22 21:32:03.753129 | orchestrator | 21:32:03.753 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-22 21:32:03.753171 | orchestrator | 21:32:03.753 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.753199 | orchestrator | 21:32:03.753 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.753260 | orchestrator | 21:32:03.753 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.753292 | orchestrator | 21:32:03.753 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.753347 | orchestrator | 21:32:03.753 STDOUT terraform:  + name = "testbed-volume-5-node-5" 2025-03-22 21:32:03.753389 | orchestrator | 21:32:03.753 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.753418 | orchestrator | 21:32:03.753 STDOUT terraform:  + size = 20 2025-03-22 21:32:03.753457 | orchestrator | 21:32:03.753 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.753520 | orchestrator | 21:32:03.753 STDOUT terraform:  } 2025-03-22 21:32:03.753529 | orchestrator | 21:32:03.753 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[6] will be created 2025-03-22 21:32:03.757982 | orchestrator | 21:32:03.753 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-22 21:32:03.758006 | orchestrator | 21:32:03.753 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.758040 | orchestrator | 21:32:03.753 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.758047 | orchestrator | 21:32:03.753 STDOUT 
terraform:  + id = (known after apply) 2025-03-22 21:32:03.758053 | orchestrator | 21:32:03.753 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.758058 | orchestrator | 21:32:03.753 STDOUT terraform:  + name = "testbed-volume-6-node-0" 2025-03-22 21:32:03.758063 | orchestrator | 21:32:03.753 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.758069 | orchestrator | 21:32:03.753 STDOUT terraform:  + size = 20 2025-03-22 21:32:03.758074 | orchestrator | 21:32:03.753 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.758080 | orchestrator | 21:32:03.753 STDOUT terraform:  } 2025-03-22 21:32:03.758086 | orchestrator | 21:32:03.753 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[7] will be created 2025-03-22 21:32:03.758091 | orchestrator | 21:32:03.753 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-22 21:32:03.758097 | orchestrator | 21:32:03.754 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.758102 | orchestrator | 21:32:03.754 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.758107 | orchestrator | 21:32:03.754 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.758113 | orchestrator | 21:32:03.754 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.758118 | orchestrator | 21:32:03.754 STDOUT terraform:  + name = "testbed-volume-7-node-1" 2025-03-22 21:32:03.758124 | orchestrator | 21:32:03.754 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.758132 | orchestrator | 21:32:03.754 STDOUT terraform:  + size = 20 2025-03-22 21:32:03.758138 | orchestrator | 21:32:03.754 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.758144 | orchestrator | 21:32:03.754 STDOUT terraform:  } 2025-03-22 21:32:03.758149 | orchestrator | 21:32:03.754 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[8] will be created 2025-03-22 21:32:03.758154 | orchestrator | 21:32:03.754 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-22 21:32:03.758160 | orchestrator | 21:32:03.754 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.758169 | orchestrator | 21:32:03.754 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.758175 | orchestrator | 21:32:03.754 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.758180 | orchestrator | 21:32:03.754 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.758185 | orchestrator | 21:32:03.754 STDOUT terraform:  + name = "testbed-volume-8-node-2" 2025-03-22 21:32:03.758191 | orchestrator | 21:32:03.754 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.758196 | orchestrator | 21:32:03.754 STDOUT terraform:  + size = 20 2025-03-22 21:32:03.758202 | orchestrator | 21:32:03.754 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.758207 | orchestrator | 21:32:03.754 STDOUT terraform:  } 2025-03-22 21:32:03.758216 | orchestrator | 21:32:03.754 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[9] will be created 2025-03-22 21:32:03.758222 | orchestrator | 21:32:03.754 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-22 21:32:03.758227 | orchestrator | 21:32:03.754 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.758232 | orchestrator | 21:32:03.754 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.758238 | orchestrator | 
21:32:03.754 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.758249 | orchestrator | 21:32:03.755 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.758254 | orchestrator | 21:32:03.755 STDOUT terraform:  + name = "testbed-volume-9-node-3" 2025-03-22 21:32:03.758260 | orchestrator | 21:32:03.755 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.758265 | orchestrator | 21:32:03.755 STDOUT terraform:  + size = 20 2025-03-22 21:32:03.758270 | orchestrator | 21:32:03.755 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.758276 | orchestrator | 21:32:03.755 STDOUT terraform:  } 2025-03-22 21:32:03.758281 | orchestrator | 21:32:03.755 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[10] will be created 2025-03-22 21:32:03.758286 | orchestrator | 21:32:03.755 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-22 21:32:03.758292 | orchestrator | 21:32:03.755 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.758297 | orchestrator | 21:32:03.755 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.758303 | orchestrator | 21:32:03.755 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.758309 | orchestrator | 21:32:03.755 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.758315 | orchestrator | 21:32:03.755 STDOUT terraform:  + name = "testbed-volume-10-node-4" 2025-03-22 21:32:03.758320 | orchestrator | 21:32:03.755 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.758338 | orchestrator | 21:32:03.755 STDOUT terraform:  + size = 20 2025-03-22 21:32:03.758344 | orchestrator | 21:32:03.755 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.758350 | orchestrator | 21:32:03.755 STDOUT terraform:  } 2025-03-22 21:32:03.758355 | orchestrator | 21:32:03.755 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[11] will be created 2025-03-22 21:32:03.758361 | orchestrator | 21:32:03.755 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-22 21:32:03.758366 | orchestrator | 21:32:03.755 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.758371 | orchestrator | 21:32:03.755 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.758376 | orchestrator | 21:32:03.755 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.758381 | orchestrator | 21:32:03.755 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.758387 | orchestrator | 21:32:03.755 STDOUT terraform:  + name = "testbed-volume-11-node-5" 2025-03-22 21:32:03.758396 | orchestrator | 21:32:03.755 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.758401 | orchestrator | 21:32:03.755 STDOUT terraform:  + size = 20 2025-03-22 21:32:03.758407 | orchestrator | 21:32:03.755 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.758412 | orchestrator | 21:32:03.755 STDOUT terraform:  } 2025-03-22 21:32:03.758417 | orchestrator | 21:32:03.755 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[12] will be created 2025-03-22 21:32:03.758422 | orchestrator | 21:32:03.755 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-22 21:32:03.758428 | orchestrator | 21:32:03.756 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.758433 | orchestrator | 21:32:03.756 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 
21:32:03.758438 | orchestrator | 21:32:03.756 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.758444 | orchestrator | 21:32:03.756 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.758449 | orchestrator | 21:32:03.756 STDOUT terraform:  + name = "testbed-volume-12-node-0" 2025-03-22 21:32:03.758454 | orchestrator | 21:32:03.756 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.758460 | orchestrator | 21:32:03.756 STDOUT terraform:  + size = 20 2025-03-22 21:32:03.758465 | orchestrator | 21:32:03.756 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.758470 | orchestrator | 21:32:03.756 STDOUT terraform:  } 2025-03-22 21:32:03.758479 | orchestrator | 21:32:03.756 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[13] will be created 2025-03-22 21:32:03.758485 | orchestrator | 21:32:03.756 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-22 21:32:03.758490 | orchestrator | 21:32:03.756 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.758496 | orchestrator | 21:32:03.756 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.758501 | orchestrator | 21:32:03.756 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.758506 | orchestrator | 21:32:03.756 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.758511 | orchestrator | 21:32:03.756 STDOUT terraform:  + name = "testbed-volume-13-node-1" 2025-03-22 21:32:03.758517 | orchestrator | 21:32:03.756 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.758522 | orchestrator | 21:32:03.756 STDOUT terraform:  + size = 20 2025-03-22 21:32:03.758528 | orchestrator | 21:32:03.756 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.758533 | orchestrator | 21:32:03.756 STDOUT terraform:  } 2025-03-22 21:32:03.758539 | orchestrator | 21:32:03.756 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[14] will be created 2025-03-22 21:32:03.758544 | orchestrator | 21:32:03.756 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-22 21:32:03.758549 | orchestrator | 21:32:03.756 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.758555 | orchestrator | 21:32:03.756 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.758567 | orchestrator | 21:32:03.756 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.758572 | orchestrator | 21:32:03.756 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.758578 | orchestrator | 21:32:03.756 STDOUT terraform:  + name = "testbed-volume-14-node-2" 2025-03-22 21:32:03.758583 | orchestrator | 21:32:03.756 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.758588 | orchestrator | 21:32:03.756 STDOUT terraform:  + size = 20 2025-03-22 21:32:03.758593 | orchestrator | 21:32:03.756 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.758599 | orchestrator | 21:32:03.756 STDOUT terraform:  } 2025-03-22 21:32:03.758604 | orchestrator | 21:32:03.756 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[15] will be created 2025-03-22 21:32:03.758610 | orchestrator | 21:32:03.757 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-22 21:32:03.758615 | orchestrator | 21:32:03.757 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.758620 | orchestrator | 21:32:03.757 STDOUT terraform:  + 
availability_zone = "nova" 2025-03-22 21:32:03.758626 | orchestrator | 21:32:03.757 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.758632 | orchestrator | 21:32:03.757 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.758638 | orchestrator | 21:32:03.757 STDOUT terraform:  + name = "testbed-volume-15-node-3" 2025-03-22 21:32:03.758644 | orchestrator | 21:32:03.757 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.758659 | orchestrator | 21:32:03.757 STDOUT terraform:  + size = 20 2025-03-22 21:32:03.758665 | orchestrator | 21:32:03.757 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.758670 | orchestrator | 21:32:03.757 STDOUT terraform:  } 2025-03-22 21:32:03.758676 | orchestrator | 21:32:03.757 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[16] will be created 2025-03-22 21:32:03.758682 | orchestrator | 21:32:03.757 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-22 21:32:03.758687 | orchestrator | 21:32:03.757 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.758692 | orchestrator | 21:32:03.757 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.758701 | orchestrator | 21:32:03.757 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.758801 | orchestrator | 21:32:03.757 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.758809 | orchestrator | 21:32:03.757 STDOUT terraform:  + name = "testbed-volume-16-node-4" 2025-03-22 21:32:03.758814 | orchestrator | 21:32:03.757 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.758820 | orchestrator | 21:32:03.757 STDOUT terraform:  + size = 20 2025-03-22 21:32:03.758828 | orchestrator | 21:32:03.757 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.758834 | orchestrator | 21:32:03.757 STDOUT terraform:  } 2025-03-22 21:32:03.758839 | orchestrator | 21:32:03.757 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[17] will be created 2025-03-22 21:32:03.758848 | orchestrator | 21:32:03.757 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-03-22 21:32:03.758853 | orchestrator | 21:32:03.757 STDOUT terraform:  + attachment = (known after apply) 2025-03-22 21:32:03.758859 | orchestrator | 21:32:03.757 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.758864 | orchestrator | 21:32:03.757 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.758870 | orchestrator | 21:32:03.757 STDOUT terraform:  + metadata = (known after apply) 2025-03-22 21:32:03.758875 | orchestrator | 21:32:03.757 STDOUT terraform:  + name = "testbed-volume-17-node-5" 2025-03-22 21:32:03.758880 | orchestrator | 21:32:03.757 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.758885 | orchestrator | 21:32:03.757 STDOUT terraform:  + size = 20 2025-03-22 21:32:03.758891 | orchestrator | 21:32:03.757 STDOUT terraform:  + volume_type = "ssd" 2025-03-22 21:32:03.758896 | orchestrator | 21:32:03.757 STDOUT terraform:  } 2025-03-22 21:32:03.758903 | orchestrator | 21:32:03.758 STDOUT terraform:  # openstack_compute_instance_v2.manager_server will be created 2025-03-22 21:32:03.758908 | orchestrator | 21:32:03.758 STDOUT terraform:  + resource "openstack_compute_instance_v2" "manager_server" { 2025-03-22 21:32:03.758914 | orchestrator | 21:32:03.758 STDOUT terraform:  + access_ip_v4 = (known after apply) 2025-03-22 21:32:03.758919 | orchestrator | 21:32:03.758 
STDOUT terraform:  + access_ip_v6 = (known after apply) 2025-03-22 21:32:03.758924 | orchestrator | 21:32:03.758 STDOUT terraform:  + all_metadata = (known after apply) 2025-03-22 21:32:03.758930 | orchestrator | 21:32:03.758 STDOUT terraform:  + all_tags = (known after apply) 2025-03-22 21:32:03.758935 | orchestrator | 21:32:03.758 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.758953 | orchestrator | 21:32:03.758 STDOUT terraform:  + config_drive = true 2025-03-22 21:32:03.758958 | orchestrator | 21:32:03.758 STDOUT terraform:  + created = (known after apply) 2025-03-22 21:32:03.758964 | orchestrator | 21:32:03.758 STDOUT terraform:  + flavor_id = (known after apply) 2025-03-22 21:32:03.758969 | orchestrator | 21:32:03.758 STDOUT terraform:  + flavor_name = "OSISM-4V-16" 2025-03-22 21:32:03.758975 | orchestrator | 21:32:03.758 STDOUT terraform:  + force_delete = false 2025-03-22 21:32:03.758980 | orchestrator | 21:32:03.758 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.758986 | orchestrator | 21:32:03.758 STDOUT terraform:  + image_id = (known after apply) 2025-03-22 21:32:03.758991 | orchestrator | 21:32:03.758 STDOUT terraform:  + image_name = (known after apply) 2025-03-22 21:32:03.758997 | orchestrator | 21:32:03.758 STDOUT terraform:  + key_pair = "testbed" 2025-03-22 21:32:03.759003 | orchestrator | 21:32:03.758 STDOUT terraform:  + name = "testbed-manager" 2025-03-22 21:32:03.759008 | orchestrator | 21:32:03.758 STDOUT terraform:  + power_state = "active" 2025-03-22 21:32:03.759016 | orchestrator | 21:32:03.758 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.760065 | orchestrator | 21:32:03.758 STDOUT terraform:  + security_groups = (known after apply) 2025-03-22 21:32:03.760085 | orchestrator | 21:32:03.758 STDOUT terraform:  + stop_before_destroy = false 2025-03-22 21:32:03.760090 | orchestrator | 21:32:03.758 STDOUT terraform:  + updated = (known after apply) 2025-03-22 21:32:03.760097 | orchestrator | 21:32:03.758 STDOUT terraform:  + user_data = (known after apply) 2025-03-22 21:32:03.760102 | orchestrator | 21:32:03.758 STDOUT terraform:  + block_device { 2025-03-22 21:32:03.760107 | orchestrator | 21:32:03.758 STDOUT terraform:  + boot_index = 0 2025-03-22 21:32:03.760112 | orchestrator | 21:32:03.758 STDOUT terraform:  + delete_on_termination = false 2025-03-22 21:32:03.760116 | orchestrator | 21:32:03.758 STDOUT terraform:  + destination_type = "volume" 2025-03-22 21:32:03.760121 | orchestrator | 21:32:03.758 STDOUT terraform:  + multiattach = false 2025-03-22 21:32:03.760129 | orchestrator | 21:32:03.758 STDOUT terraform:  + source_type = "volume" 2025-03-22 21:32:03.771239 | orchestrator | 21:32:03.759 STDOUT terraform:  + uuid = (known after apply) 2025-03-22 21:32:03.771282 | orchestrator | 21:32:03.759 STDOUT terraform:  } 2025-03-22 21:32:03.771289 | orchestrator | 21:32:03.766 STDOUT terraform:  + network { 2025-03-22 21:32:03.771294 | orchestrator | 21:32:03.766 STDOUT terraform:  + access_network = false 2025-03-22 21:32:03.771300 | orchestrator | 21:32:03.767 STDOUT terraform:  + fixed_ip_v4 = (known after apply) 2025-03-22 21:32:03.771306 | orchestrator | 21:32:03.767 STDOUT terraform:  + fixed_ip_v6 = (known after apply) 2025-03-22 21:32:03.771311 | orchestrator | 21:32:03.767 STDOUT terraform:  + mac = (known after apply) 2025-03-22 21:32:03.771316 | orchestrator | 21:32:03.767 STDOUT terraform:  + name = (known after apply) 2025-03-22 21:32:03.771321 | orchestrator | 21:32:03.767 STDOUT 
terraform:  + port = (known after apply) 2025-03-22 21:32:03.771334 | orchestrator | 21:32:03.767 STDOUT terraform:  + uuid = (known after apply) 2025-03-22 21:32:03.771340 | orchestrator | 21:32:03.767 STDOUT terraform:  } 2025-03-22 21:32:03.771345 | orchestrator | 21:32:03.767 STDOUT terraform:  } 2025-03-22 21:32:03.771350 | orchestrator | 21:32:03.767 STDOUT terraform:  # openstack_compute_instance_v2.node_server[0] will be created 2025-03-22 21:32:03.771354 | orchestrator | 21:32:03.767 STDOUT terraform:  + resource "openstack_compute_instance_v2" "node_server" { 2025-03-22 21:32:03.771360 | orchestrator | 21:32:03.767 STDOUT terraform:  + access_ip_v4 = (known after apply) 2025-03-22 21:32:03.771365 | orchestrator | 21:32:03.767 STDOUT terraform:  + access_ip_v6 = (known after apply) 2025-03-22 21:32:03.771370 | orchestrator | 21:32:03.767 STDOUT terraform:  + all_metadata = (known after apply) 2025-03-22 21:32:03.771374 | orchestrator | 21:32:03.767 STDOUT terraform:  + all_tags = (known after apply) 2025-03-22 21:32:03.771379 | orchestrator | 21:32:03.767 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.771384 | orchestrator | 21:32:03.767 STDOUT terraform:  + config_drive = true 2025-03-22 21:32:03.771389 | orchestrator | 21:32:03.767 STDOUT terraform:  + created = (known after apply) 2025-03-22 21:32:03.771403 | orchestrator | 21:32:03.767 STDOUT terraform:  + flavor_id = (known after apply) 2025-03-22 21:32:03.771408 | orchestrator | 21:32:03.767 STDOUT terraform:  + flavor_name = "OSISM-8V-32" 2025-03-22 21:32:03.771413 | orchestrator | 21:32:03.767 STDOUT terraform:  + force_delete = false 2025-03-22 21:32:03.771417 | orchestrator | 21:32:03.767 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.771422 | orchestrator | 21:32:03.767 STDOUT terraform:  + image_id = (known after apply) 2025-03-22 21:32:03.771427 | orchestrator | 21:32:03.767 STDOUT terraform:  + image_name = (known after apply) 2025-03-22 21:32:03.771432 | orchestrator | 21:32:03.767 STDOUT terraform:  + key_pair = "testbed" 2025-03-22 21:32:03.771437 | orchestrator | 21:32:03.767 STDOUT terraform:  + name = "testbed-node-0" 2025-03-22 21:32:03.771442 | orchestrator | 21:32:03.767 STDOUT terraform:  + power_state = "active" 2025-03-22 21:32:03.771447 | orchestrator | 21:32:03.767 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.771451 | orchestrator | 21:32:03.767 STDOUT terraform:  + security_groups = (known after apply) 2025-03-22 21:32:03.771456 | orchestrator | 21:32:03.767 STDOUT terraform:  + stop_before_destroy = false 2025-03-22 21:32:03.771461 | orchestrator | 21:32:03.767 STDOUT terraform:  + updated = (known after apply) 2025-03-22 21:32:03.771466 | orchestrator | 21:32:03.767 STDOUT terraform:  + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2025-03-22 21:32:03.771471 | orchestrator | 21:32:03.767 STDOUT terraform:  + block_device { 2025-03-22 21:32:03.771476 | orchestrator | 21:32:03.767 STDOUT terraform:  + boot_index = 0 2025-03-22 21:32:03.771483 | orchestrator | 21:32:03.768 STDOUT terraform:  + delete_on_termination = false 2025-03-22 21:32:03.771488 | orchestrator | 21:32:03.768 STDOUT terraform:  + destination_type = "volume" 2025-03-22 21:32:03.771497 | orchestrator | 21:32:03.768 STDOUT terraform:  + multiattach = false 2025-03-22 21:32:03.771502 | orchestrator | 21:32:03.768 STDOUT terraform:  + source_type = "volume" 2025-03-22 21:32:03.771507 | orchestrator | 21:32:03.768 STDOUT terraform:  + uuid = (known after apply) 
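Editor's note: the plan fragments above for the block-storage volumes and the manager/node instances map onto a Terraform configuration roughly like the sketch below. This is a hedged reconstruction from the plan output alone: only the literal values (resource types and names, the 20 GB "ssd" volumes, the OSISM-4V-16 and OSISM-8V-32 flavors, config_drive, power_state, and the boot-from-volume block_device settings) come from the log; the variable names, counts, naming rule, user_data file name, and port/volume wiring are assumptions and may differ from the actual testbed configuration.

# Hedged reconstruction of the instance/volume resources listed in the plan above.
# Literal values are taken from the plan; everything marked "assumed" is not.

variable "manager_boot_volume_id" {}                       # boot volume uuid is "(known after apply)" in the plan
variable "node_boot_volume_ids" { type = list(string) }    # assumed: one boot volume per node
variable "manager_port_id" {}                              # presumably the management port (see networking sketch below)
variable "node_port_ids" { type = list(string) }

resource "openstack_blockstorage_volume_v3" "node_volume" {
  count             = 18                                   # plan shows node_volume[0] .. node_volume[17]
  name              = "testbed-volume-${count.index}-node-${count.index % 6}"  # assumed naming rule (matches indices 15..17)
  availability_zone = "nova"
  size              = 20
  volume_type       = "ssd"
}

resource "openstack_compute_instance_v2" "manager_server" {
  name              = "testbed-manager"
  availability_zone = "nova"
  flavor_name       = "OSISM-4V-16"
  key_pair          = "testbed"
  config_drive      = true
  power_state       = "active"

  block_device {                                           # boot from volume, volume kept on destroy
    uuid                  = var.manager_boot_volume_id
    source_type           = "volume"
    destination_type      = "volume"
    boot_index            = 0
    delete_on_termination = false
    multiattach           = false
  }

  network {
    port = var.manager_port_id
  }
}

resource "openstack_compute_instance_v2" "node_server" {
  count             = 6                                    # plan shows testbed-node-0 .. testbed-node-5
  name              = "testbed-node-${count.index}"
  availability_zone = "nova"
  flavor_name       = "OSISM-8V-32"
  key_pair          = "testbed"
  config_drive      = true
  power_state       = "active"
  user_data         = file("${path.module}/user_data.yml") # plan only shows a content hash; file name assumed

  block_device {
    uuid                  = var.node_boot_volume_ids[count.index]
    source_type           = "volume"
    destination_type      = "volume"
    boot_index            = 0
    delete_on_termination = false
    multiattach           = false
  }

  network {
    port = var.node_port_ids[count.index]
  }
}

This sketch, together with the networking sketch at the end of this section, is meant only to make the flattened plan easier to follow; it is not taken from the testbed repository itself.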
2025-03-22 21:32:03.771512 | orchestrator | 21:32:03.768 STDOUT terraform:  } 2025-03-22 21:32:03.771517 | orchestrator | 21:32:03.768 STDOUT terraform:  + network { 2025-03-22 21:32:03.771521 | orchestrator | 21:32:03.768 STDOUT terraform:  + access_network = false 2025-03-22 21:32:03.771527 | orchestrator | 21:32:03.768 STDOUT terraform:  + fixed_ip_v4 = (known after apply) 2025-03-22 21:32:03.771532 | orchestrator | 21:32:03.768 STDOUT terraform:  + fixed_ip_v6 = (known after apply) 2025-03-22 21:32:03.771537 | orchestrator | 21:32:03.768 STDOUT terraform:  + mac = (known after apply) 2025-03-22 21:32:03.771541 | orchestrator | 21:32:03.768 STDOUT terraform:  + name = (known after apply) 2025-03-22 21:32:03.771549 | orchestrator | 21:32:03.768 STDOUT terraform:  + port = (known after apply) 2025-03-22 21:32:03.771554 | orchestrator | 21:32:03.768 STDOUT terraform:  + uuid = (known after apply) 2025-03-22 21:32:03.771562 | orchestrator | 21:32:03.768 STDOUT terraform:  } 2025-03-22 21:32:03.771567 | orchestrator | 21:32:03.768 STDOUT terraform:  } 2025-03-22 21:32:03.771572 | orchestrator | 21:32:03.768 STDOUT terraform:  # openstack_compute_instance_v2.node_server[1] will be created 2025-03-22 21:32:03.771577 | orchestrator | 21:32:03.768 STDOUT terraform:  + resource "openstack_compute_instance_v2" "node_server" { 2025-03-22 21:32:03.771583 | orchestrator | 21:32:03.768 STDOUT terraform:  + access_ip_v4 = (known after apply) 2025-03-22 21:32:03.771588 | orchestrator | 21:32:03.768 STDOUT terraform:  + access_ip_v6 = (known after apply) 2025-03-22 21:32:03.771593 | orchestrator | 21:32:03.768 STDOUT terraform:  + all_metadata = (known after apply) 2025-03-22 21:32:03.771598 | orchestrator | 21:32:03.768 STDOUT terraform:  + all_tags = (known after apply) 2025-03-22 21:32:03.771603 | orchestrator | 21:32:03.768 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.771608 | orchestrator | 21:32:03.768 STDOUT terraform:  + config_drive = true 2025-03-22 21:32:03.771613 | orchestrator | 21:32:03.768 STDOUT terraform:  + created = (known after apply) 2025-03-22 21:32:03.771618 | orchestrator | 21:32:03.768 STDOUT terraform:  + flavor_id = (known after apply) 2025-03-22 21:32:03.771625 | orchestrator | 21:32:03.768 STDOUT terraform:  + flavor_name = "OSISM-8V-32" 2025-03-22 21:32:03.771630 | orchestrator | 21:32:03.768 STDOUT terraform:  + force_delete = false 2025-03-22 21:32:03.771635 | orchestrator | 21:32:03.768 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.771640 | orchestrator | 21:32:03.768 STDOUT terraform:  + image_id = (known after apply) 2025-03-22 21:32:03.771645 | orchestrator | 21:32:03.768 STDOUT terraform:  + image_name = (known after apply) 2025-03-22 21:32:03.771650 | orchestrator | 21:32:03.768 STDOUT terraform:  + key_pair = "testbed" 2025-03-22 21:32:03.771655 | orchestrator | 21:32:03.768 STDOUT terraform:  + name = "testbed-node-1" 2025-03-22 21:32:03.771659 | orchestrator | 21:32:03.768 STDOUT terraform:  + power_state = "active" 2025-03-22 21:32:03.771664 | orchestrator | 21:32:03.768 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.771669 | orchestrator | 21:32:03.768 STDOUT terraform:  + security_groups = (known after apply) 2025-03-22 21:32:03.771674 | orchestrator | 21:32:03.769 STDOUT terraform:  + stop_before_destroy = false 2025-03-22 21:32:03.771679 | orchestrator | 21:32:03.769 STDOUT terraform:  + updated = (known after apply) 2025-03-22 21:32:03.771684 | orchestrator | 21:32:03.769 STDOUT terraform:  + 
user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2025-03-22 21:32:03.771689 | orchestrator | 21:32:03.769 STDOUT terraform:  + block_device { 2025-03-22 21:32:03.771697 | orchestrator | 21:32:03.769 STDOUT terraform:  + boot_index = 0 2025-03-22 21:32:03.771703 | orchestrator | 21:32:03.769 STDOUT terraform:  + delete_on_termination = false 2025-03-22 21:32:03.771707 | orchestrator | 21:32:03.769 STDOUT terraform:  + destination_type = "volume" 2025-03-22 21:32:03.771712 | orchestrator | 21:32:03.769 STDOUT terraform:  + multiattach = false 2025-03-22 21:32:03.771720 | orchestrator | 21:32:03.769 STDOUT terraform:  + source_type = "volume" 2025-03-22 21:32:03.771725 | orchestrator | 21:32:03.769 STDOUT terraform:  + uuid = (known after apply) 2025-03-22 21:32:03.771730 | orchestrator | 21:32:03.769 STDOUT terraform:  } 2025-03-22 21:32:03.771735 | orchestrator | 21:32:03.769 STDOUT terraform:  + network { 2025-03-22 21:32:03.771740 | orchestrator | 21:32:03.769 STDOUT terraform:  + access_network = false 2025-03-22 21:32:03.771745 | orchestrator | 21:32:03.769 STDOUT terraform:  + fixed_ip_v4 = (known after apply) 2025-03-22 21:32:03.771749 | orchestrator | 21:32:03.769 STDOUT terraform:  + fixed_ip_v6 = (known after apply) 2025-03-22 21:32:03.771754 | orchestrator | 21:32:03.769 STDOUT terraform:  + mac = (known after apply) 2025-03-22 21:32:03.771759 | orchestrator | 21:32:03.769 STDOUT terraform:  + name = (known after apply) 2025-03-22 21:32:03.771764 | orchestrator | 21:32:03.769 STDOUT terraform:  + port = (known after apply) 2025-03-22 21:32:03.771769 | orchestrator | 21:32:03.769 STDOUT terraform:  + uuid = (known after apply) 2025-03-22 21:32:03.771773 | orchestrator | 21:32:03.769 STDOUT terraform:  } 2025-03-22 21:32:03.771779 | orchestrator | 21:32:03.769 STDOUT terraform:  } 2025-03-22 21:32:03.771783 | orchestrator | 21:32:03.769 STDOUT terraform:  # openstack_compute_instance_v2.node_server[2] will be created 2025-03-22 21:32:03.771788 | orchestrator | 21:32:03.769 STDOUT terraform:  + resource "openstack_compute_instance_v2" "node_server" { 2025-03-22 21:32:03.771793 | orchestrator | 21:32:03.769 STDOUT terraform:  + access_ip_v4 = (known after apply) 2025-03-22 21:32:03.771798 | orchestrator | 21:32:03.769 STDOUT terraform:  + access_ip_v6 = (known after apply) 2025-03-22 21:32:03.771805 | orchestrator | 21:32:03.769 STDOUT terraform:  + all_metadata = (known after apply) 2025-03-22 21:32:03.771810 | orchestrator | 21:32:03.769 STDOUT terraform:  + all_tags = (known after apply) 2025-03-22 21:32:03.771815 | orchestrator | 21:32:03.769 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.771820 | orchestrator | 21:32:03.769 STDOUT terraform:  + config_drive = true 2025-03-22 21:32:03.771825 | orchestrator | 21:32:03.769 STDOUT terraform:  + created = (known after apply) 2025-03-22 21:32:03.771830 | orchestrator | 21:32:03.769 STDOUT terraform:  + flavor_id = (known after apply) 2025-03-22 21:32:03.771834 | orchestrator | 21:32:03.769 STDOUT terraform:  + flavor_name = "OSISM-8V-32" 2025-03-22 21:32:03.771839 | orchestrator | 21:32:03.769 STDOUT terraform:  + force_delete = false 2025-03-22 21:32:03.771844 | orchestrator | 21:32:03.769 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.771849 | orchestrator | 21:32:03.769 STDOUT terraform:  + image_id = (known after apply) 2025-03-22 21:32:03.771854 | orchestrator | 21:32:03.769 STDOUT terraform:  + image_name = (known after apply) 2025-03-22 21:32:03.771858 | orchestrator | 21:32:03.770 
STDOUT terraform:  + key_pair = "testbed" 2025-03-22 21:32:03.771863 | orchestrator | 21:32:03.770 STDOUT terraform:  + name = "testbed-node-2" 2025-03-22 21:32:03.771871 | orchestrator | 21:32:03.770 STDOUT terraform:  + power_state = "active" 2025-03-22 21:32:03.771876 | orchestrator | 21:32:03.770 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.771880 | orchestrator | 21:32:03.770 STDOUT terraform:  + security_groups = (known after apply) 2025-03-22 21:32:03.771885 | orchestrator | 21:32:03.770 STDOUT terraform:  + stop_before_destroy = false 2025-03-22 21:32:03.771892 | orchestrator | 21:32:03.770 STDOUT terraform:  + updated = (known after apply) 2025-03-22 21:32:03.771898 | orchestrator | 21:32:03.770 STDOUT terraform:  + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2025-03-22 21:32:03.771903 | orchestrator | 21:32:03.770 STDOUT terraform:  + block_device { 2025-03-22 21:32:03.771908 | orchestrator | 21:32:03.770 STDOUT terraform:  + boot_index = 0 2025-03-22 21:32:03.771912 | orchestrator | 21:32:03.770 STDOUT terraform:  + delete_on_termination = false 2025-03-22 21:32:03.771917 | orchestrator | 21:32:03.770 STDOUT terraform:  + destination_type = "volume" 2025-03-22 21:32:03.771922 | orchestrator | 21:32:03.770 STDOUT terraform:  + multiattach = false 2025-03-22 21:32:03.771927 | orchestrator | 21:32:03.770 STDOUT terraform:  + source_type = "volume" 2025-03-22 21:32:03.771932 | orchestrator | 21:32:03.770 STDOUT terraform:  + uuid = (known after apply) 2025-03-22 21:32:03.771937 | orchestrator | 21:32:03.770 STDOUT terraform:  } 2025-03-22 21:32:03.771941 | orchestrator | 21:32:03.770 STDOUT terraform:  + network { 2025-03-22 21:32:03.771946 | orchestrator | 21:32:03.770 STDOUT terraform:  + access_network = false 2025-03-22 21:32:03.771951 | orchestrator | 21:32:03.770 STDOUT terraform:  + fixed_ip_v4 = (known after apply) 2025-03-22 21:32:03.771956 | orchestrator | 21:32:03.770 STDOUT terraform:  + fixed_ip_v6 = (known after apply) 2025-03-22 21:32:03.771961 | orchestrator | 21:32:03.770 STDOUT terraform:  + mac = (known after apply) 2025-03-22 21:32:03.771966 | orchestrator | 21:32:03.770 STDOUT terraform:  + name = (known after apply) 2025-03-22 21:32:03.771970 | orchestrator | 21:32:03.770 STDOUT terraform:  + port = (known after apply) 2025-03-22 21:32:03.771975 | orchestrator | 21:32:03.770 STDOUT terraform:  + uuid = (known after apply) 2025-03-22 21:32:03.771980 | orchestrator | 21:32:03.770 STDOUT terraform:  } 2025-03-22 21:32:03.771985 | orchestrator | 21:32:03.770 STDOUT terraform:  } 2025-03-22 21:32:03.771990 | orchestrator | 21:32:03.770 STDOUT terraform:  # openstack_compute_instance_v2.node_server[3] will be created 2025-03-22 21:32:03.771995 | orchestrator | 21:32:03.770 STDOUT terraform:  + resource "openstack_compute_instance_v2" "node_server" { 2025-03-22 21:32:03.771999 | orchestrator | 21:32:03.770 STDOUT terraform:  + access_ip_v4 = (known after apply) 2025-03-22 21:32:03.772004 | orchestrator | 21:32:03.770 STDOUT terraform:  + access_ip_v6 = (known after apply) 2025-03-22 21:32:03.772009 | orchestrator | 21:32:03.770 STDOUT terraform:  + all_metadata = (known after apply) 2025-03-22 21:32:03.772014 | orchestrator | 21:32:03.770 STDOUT terraform:  + all_tags = (known after apply) 2025-03-22 21:32:03.772025 | orchestrator | 21:32:03.770 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.772030 | orchestrator | 21:32:03.770 STDOUT terraform:  + config_drive = true 2025-03-22 21:32:03.772035 | orchestrator | 
21:32:03.770 STDOUT terraform:  + created = (known after apply) 2025-03-22 21:32:03.772040 | orchestrator | 21:32:03.771 STDOUT terraform:  + flavor_id = (known after apply) 2025-03-22 21:32:03.772047 | orchestrator | 21:32:03.771 STDOUT terraform:  + flavor_name = "OSISM-8V-32" 2025-03-22 21:32:03.772052 | orchestrator | 21:32:03.771 STDOUT terraform:  + force_delete = false 2025-03-22 21:32:03.772057 | orchestrator | 21:32:03.771 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.772064 | orchestrator | 21:32:03.771 STDOUT terraform:  + image_id = (known after apply) 2025-03-22 21:32:03.772068 | orchestrator | 21:32:03.771 STDOUT terraform:  + image_name = (known after apply) 2025-03-22 21:32:03.772073 | orchestrator | 21:32:03.771 STDOUT terraform:  + key_pair = "testbed" 2025-03-22 21:32:03.772078 | orchestrator | 21:32:03.771 STDOUT terraform:  + name = "testbed-node-3" 2025-03-22 21:32:03.772083 | orchestrator | 21:32:03.771 STDOUT terraform:  + power_state = "active" 2025-03-22 21:32:03.772090 | orchestrator | 21:32:03.771 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.778043 | orchestrator | 21:32:03.771 STDOUT terraform:  + security_groups = (known after apply) 2025-03-22 21:32:03.778064 | orchestrator | 21:32:03.771 STDOUT terraform:  + stop_before_destroy = false 2025-03-22 21:32:03.778069 | orchestrator | 21:32:03.771 STDOUT terraform:  + updated = (known after apply) 2025-03-22 21:32:03.778074 | orchestrator | 21:32:03.771 STDOUT terraform:  + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2025-03-22 21:32:03.778080 | orchestrator | 21:32:03.771 STDOUT terraform:  + block_device { 2025-03-22 21:32:03.778085 | orchestrator | 21:32:03.771 STDOUT terraform:  + boot_index = 0 2025-03-22 21:32:03.778091 | orchestrator | 21:32:03.771 STDOUT terraform:  + delete_on_termination = false 2025-03-22 21:32:03.778096 | orchestrator | 21:32:03.771 STDOUT terraform:  + destination_type = "volume" 2025-03-22 21:32:03.778101 | orchestrator | 21:32:03.771 STDOUT terraform:  + multiattach = false 2025-03-22 21:32:03.778105 | orchestrator | 21:32:03.771 STDOUT terraform:  + source_type = "volume" 2025-03-22 21:32:03.778111 | orchestrator | 21:32:03.771 STDOUT terraform:  + uuid = (known after apply) 2025-03-22 21:32:03.778116 | orchestrator | 21:32:03.771 STDOUT terraform:  } 2025-03-22 21:32:03.778121 | orchestrator | 21:32:03.771 STDOUT terraform:  + network { 2025-03-22 21:32:03.778126 | orchestrator | 21:32:03.771 STDOUT terraform:  + access_network = false 2025-03-22 21:32:03.778130 | orchestrator | 21:32:03.771 STDOUT terraform:  + fixed_ip_v4 = (known after apply) 2025-03-22 21:32:03.778136 | orchestrator | 21:32:03.771 STDOUT terraform:  + fixed_ip_v6 = (known after apply) 2025-03-22 21:32:03.778140 | orchestrator | 21:32:03.771 STDOUT terraform:  + mac = (known after apply) 2025-03-22 21:32:03.778153 | orchestrator | 21:32:03.771 STDOUT terraform:  + name = (known after apply) 2025-03-22 21:32:03.778158 | orchestrator | 21:32:03.771 STDOUT terraform:  + port = (known after apply) 2025-03-22 21:32:03.778163 | orchestrator | 21:32:03.771 STDOUT terraform:  + uuid = (known after apply) 2025-03-22 21:32:03.778168 | orchestrator | 21:32:03.771 STDOUT terraform:  } 2025-03-22 21:32:03.778173 | orchestrator | 21:32:03.771 STDOUT terraform:  } 2025-03-22 21:32:03.778178 | orchestrator | 21:32:03.771 STDOUT terraform:  # openstack_compute_instance_v2.node_server[4] will be created 2025-03-22 21:32:03.778182 | orchestrator | 21:32:03.771 STDOUT 
terraform:  + resource "openstack_compute_instance_v2" "node_server" { 2025-03-22 21:32:03.778187 | orchestrator | 21:32:03.771 STDOUT terraform:  + access_ip_v4 = (known after apply) 2025-03-22 21:32:03.778192 | orchestrator | 21:32:03.771 STDOUT terraform:  + access_ip_v6 = (known after apply) 2025-03-22 21:32:03.778197 | orchestrator | 21:32:03.771 STDOUT terraform:  + all_metadata = (known after apply) 2025-03-22 21:32:03.778202 | orchestrator | 21:32:03.771 STDOUT terraform:  + all_tags = (known after apply) 2025-03-22 21:32:03.778206 | orchestrator | 21:32:03.772 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.778211 | orchestrator | 21:32:03.772 STDOUT terraform:  + config_drive = true 2025-03-22 21:32:03.778216 | orchestrator | 21:32:03.772 STDOUT terraform:  + created = (known after apply) 2025-03-22 21:32:03.778226 | orchestrator | 21:32:03.772 STDOUT terraform:  + flavor_id = (known after apply) 2025-03-22 21:32:03.778231 | orchestrator | 21:32:03.772 STDOUT terraform:  + flavor_name = "OSISM-8V-32" 2025-03-22 21:32:03.778236 | orchestrator | 21:32:03.772 STDOUT terraform:  + force_delete = false 2025-03-22 21:32:03.778241 | orchestrator | 21:32:03.772 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.778245 | orchestrator | 21:32:03.772 STDOUT terraform:  + image_id = (known after apply) 2025-03-22 21:32:03.778250 | orchestrator | 21:32:03.772 STDOUT terraform:  + image_name = (known after apply) 2025-03-22 21:32:03.778255 | orchestrator | 21:32:03.772 STDOUT terraform:  + key_pair = "testbed" 2025-03-22 21:32:03.778260 | orchestrator | 21:32:03.772 STDOUT terraform:  + name = "testbed-node-4" 2025-03-22 21:32:03.778265 | orchestrator | 21:32:03.772 STDOUT terraform:  + power_state = "active" 2025-03-22 21:32:03.778270 | orchestrator | 21:32:03.772 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.778275 | orchestrator | 21:32:03.772 STDOUT terraform:  + security_groups = (known after apply) 2025-03-22 21:32:03.778280 | orchestrator | 21:32:03.772 STDOUT terraform:  + stop_before_destroy = false 2025-03-22 21:32:03.778285 | orchestrator | 21:32:03.772 STDOUT terraform:  + updated = (known after apply) 2025-03-22 21:32:03.778289 | orchestrator | 21:32:03.772 STDOUT terraform:  + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2025-03-22 21:32:03.778294 | orchestrator | 21:32:03.772 STDOUT terraform:  + block_device { 2025-03-22 21:32:03.778299 | orchestrator | 21:32:03.772 STDOUT terraform:  + boot_index = 0 2025-03-22 21:32:03.778308 | orchestrator | 21:32:03.772 STDOUT terraform:  + delete_on_termination = false 2025-03-22 21:32:03.778312 | orchestrator | 21:32:03.772 STDOUT terraform:  + destination_type = "volume" 2025-03-22 21:32:03.778317 | orchestrator | 21:32:03.772 STDOUT terraform:  + multiattach = false 2025-03-22 21:32:03.778322 | orchestrator | 21:32:03.772 STDOUT terraform:  + source_type = "volume" 2025-03-22 21:32:03.778336 | orchestrator | 21:32:03.772 STDOUT terraform:  + uuid = (known after apply) 2025-03-22 21:32:03.778342 | orchestrator | 21:32:03.772 STDOUT terraform:  } 2025-03-22 21:32:03.778346 | orchestrator | 21:32:03.772 STDOUT terraform:  + network { 2025-03-22 21:32:03.778351 | orchestrator | 21:32:03.772 STDOUT terraform:  + access_network = false 2025-03-22 21:32:03.778359 | orchestrator | 21:32:03.772 STDOUT terraform:  + fixed_ip_v4 = (known after apply) 2025-03-22 21:32:03.778364 | orchestrator | 21:32:03.772 STDOUT terraform:  + fixed_ip_v6 = (known after apply) 2025-03-22 
21:32:03.778369 | orchestrator | 21:32:03.772 STDOUT terraform:  + mac = (known after apply) 2025-03-22 21:32:03.778374 | orchestrator | 21:32:03.772 STDOUT terraform:  + name = (known after apply) 2025-03-22 21:32:03.778379 | orchestrator | 21:32:03.772 STDOUT terraform:  + port = (known after apply) 2025-03-22 21:32:03.778383 | orchestrator | 21:32:03.772 STDOUT terraform:  + uuid = (known after apply) 2025-03-22 21:32:03.778388 | orchestrator | 21:32:03.772 STDOUT terraform:  } 2025-03-22 21:32:03.778393 | orchestrator | 21:32:03.772 STDOUT terraform:  } 2025-03-22 21:32:03.778398 | orchestrator | 21:32:03.772 STDOUT terraform:  # openstack_compute_instance_v2.node_server[5] will be created 2025-03-22 21:32:03.778403 | orchestrator | 21:32:03.772 STDOUT terraform:  + resource "openstack_compute_instance_v2" "node_server" { 2025-03-22 21:32:03.778408 | orchestrator | 21:32:03.772 STDOUT terraform:  + access_ip_v4 = (known after apply) 2025-03-22 21:32:03.778413 | orchestrator | 21:32:03.772 STDOUT terraform:  + access_ip_v6 = (known after apply) 2025-03-22 21:32:03.778418 | orchestrator | 21:32:03.772 STDOUT terraform:  + all_metadata = (known after apply) 2025-03-22 21:32:03.778423 | orchestrator | 21:32:03.773 STDOUT terraform:  + all_tags = (known after apply) 2025-03-22 21:32:03.778432 | orchestrator | 21:32:03.773 STDOUT terraform:  + availability_zone = "nova" 2025-03-22 21:32:03.778437 | orchestrator | 21:32:03.773 STDOUT terraform:  + config_drive = true 2025-03-22 21:32:03.778442 | orchestrator | 21:32:03.773 STDOUT terraform:  + created = (known after apply) 2025-03-22 21:32:03.778447 | orchestrator | 21:32:03.773 STDOUT terraform:  + flavor_id = (known after apply) 2025-03-22 21:32:03.778454 | orchestrator | 21:32:03.773 STDOUT terraform:  + flavor_name = "OSISM-8V-32" 2025-03-22 21:32:03.778459 | orchestrator | 21:32:03.773 STDOUT terraform:  + force_delete = false 2025-03-22 21:32:03.778464 | orchestrator | 21:32:03.773 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.778469 | orchestrator | 21:32:03.773 STDOUT terraform:  + image_id = (known after apply) 2025-03-22 21:32:03.778478 | orchestrator | 21:32:03.773 STDOUT terraform:  + image_name = (known after apply) 2025-03-22 21:32:03.778483 | orchestrator | 21:32:03.773 STDOUT terraform:  + key_pair = "testbed" 2025-03-22 21:32:03.778488 | orchestrator | 21:32:03.773 STDOUT terraform:  + name = "testbed-node-5" 2025-03-22 21:32:03.778493 | orchestrator | 21:32:03.773 STDOUT terraform:  + power_state = "active" 2025-03-22 21:32:03.778497 | orchestrator | 21:32:03.773 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.778502 | orchestrator | 21:32:03.773 STDOUT terraform:  + security_groups = (known after apply) 2025-03-22 21:32:03.778507 | orchestrator | 21:32:03.773 STDOUT terraform:  + stop_before_destroy = false 2025-03-22 21:32:03.778512 | orchestrator | 21:32:03.773 STDOUT terraform:  + updated = (known after apply) 2025-03-22 21:32:03.778517 | orchestrator | 21:32:03.773 STDOUT terraform:  + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2025-03-22 21:32:03.778521 | orchestrator | 21:32:03.773 STDOUT terraform:  + block_device { 2025-03-22 21:32:03.778526 | orchestrator | 21:32:03.773 STDOUT terraform:  + boot_index = 0 2025-03-22 21:32:03.778532 | orchestrator | 21:32:03.773 STDOUT terraform:  + delete_on_termination = false 2025-03-22 21:32:03.778536 | orchestrator | 21:32:03.773 STDOUT terraform:  + destination_type = "volume" 2025-03-22 21:32:03.778541 | orchestrator | 
21:32:03.773 STDOUT terraform:  + multiattach = false 2025-03-22 21:32:03.778546 | orchestrator | 21:32:03.773 STDOUT terraform:  + source_type = "volume" 2025-03-22 21:32:03.778552 | orchestrator | 21:32:03.773 STDOUT terraform:  + uuid = (known after apply) 2025-03-22 21:32:03.778557 | orchestrator | 21:32:03.773 STDOUT terraform:  } 2025-03-22 21:32:03.778562 | orchestrator | 21:32:03.773 STDOUT terraform:  + network { 2025-03-22 21:32:03.778567 | orchestrator | 21:32:03.773 STDOUT terraform:  + access_network = false 2025-03-22 21:32:03.778572 | orchestrator | 21:32:03.773 STDOUT terraform:  + fixed_ip_v4 = (known after apply) 2025-03-22 21:32:03.778577 | orchestrator | 21:32:03.773 STDOUT terraform:  + fixed_ip_v6 = (known after apply) 2025-03-22 21:32:03.778582 | orchestrator | 21:32:03.773 STDOUT terraform:  + mac = (known after apply) 2025-03-22 21:32:03.778587 | orchestrator | 21:32:03.773 STDOUT terraform:  + name = (known after apply) 2025-03-22 21:32:03.778591 | orchestrator | 21:32:03.773 STDOUT terraform:  + port = (known after apply) 2025-03-22 21:32:03.778597 | orchestrator | 21:32:03.773 STDOUT terraform:  + uuid = (known after apply) 2025-03-22 21:32:03.778602 | orchestrator | 21:32:03.773 STDOUT terraform:  } 2025-03-22 21:32:03.778606 | orchestrator | 21:32:03.773 STDOUT terraform:  } 2025-03-22 21:32:03.778615 | orchestrator | 21:32:03.773 STDOUT terraform:  # openstack_compute_keypair_v2.key will be created 2025-03-22 21:32:03.778620 | orchestrator | 21:32:03.773 STDOUT terraform:  + resource "openstack_compute_keypair_v2" "key" { 2025-03-22 21:32:03.778625 | orchestrator | 21:32:03.773 STDOUT terraform:  + fingerprint = (known after apply) 2025-03-22 21:32:03.778632 | orchestrator | 21:32:03.773 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.778641 | orchestrator | 21:32:03.774 STDOUT terraform:  + name = "testbed" 2025-03-22 21:32:03.778646 | orchestrator | 21:32:03.774 STDOUT terraform:  + private_key = (sensitive value) 2025-03-22 21:32:03.778650 | orchestrator | 21:32:03.774 STDOUT terraform:  + public_key = (known after apply) 2025-03-22 21:32:03.778656 | orchestrator | 21:32:03.774 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.778660 | orchestrator | 21:32:03.774 STDOUT terraform:  + user_id = (known after apply) 2025-03-22 21:32:03.778665 | orchestrator | 21:32:03.774 STDOUT terraform:  } 2025-03-22 21:32:03.778670 | orchestrator | 21:32:03.774 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[0] will be created 2025-03-22 21:32:03.778675 | orchestrator | 21:32:03.774 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-22 21:32:03.778680 | orchestrator | 21:32:03.774 STDOUT terraform:  + device = (known after apply) 2025-03-22 21:32:03.778685 | orchestrator | 21:32:03.774 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.778690 | orchestrator | 21:32:03.774 STDOUT terraform:  + instance_id = (known after apply) 2025-03-22 21:32:03.778695 | orchestrator | 21:32:03.774 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.778700 | orchestrator | 21:32:03.774 STDOUT terraform:  + volume_id = (known after apply) 2025-03-22 21:32:03.778705 | orchestrator | 21:32:03.774 STDOUT terraform:  } 2025-03-22 21:32:03.778709 | orchestrator | 21:32:03.774 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[1] will be created 2025-03-22 21:32:03.778715 | orchestrator | 21:32:03.774 STDOUT 
terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-22 21:32:03.778719 | orchestrator | 21:32:03.774 STDOUT terraform:  + device = (known after apply) 2025-03-22 21:32:03.778724 | orchestrator | 21:32:03.774 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.778729 | orchestrator | 21:32:03.774 STDOUT terraform:  + instance_id = (known after apply) 2025-03-22 21:32:03.778734 | orchestrator | 21:32:03.774 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.778741 | orchestrator | 21:32:03.774 STDOUT terraform:  + volume_id = (known after apply) 2025-03-22 21:32:03.778746 | orchestrator | 21:32:03.774 STDOUT terraform:  } 2025-03-22 21:32:03.778751 | orchestrator | 21:32:03.774 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[2] will be created 2025-03-22 21:32:03.778756 | orchestrator | 21:32:03.774 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-22 21:32:03.778761 | orchestrator | 21:32:03.774 STDOUT terraform:  + device = (known after apply) 2025-03-22 21:32:03.778766 | orchestrator | 21:32:03.774 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.778771 | orchestrator | 21:32:03.774 STDOUT terraform:  + instance_id = (known after apply) 2025-03-22 21:32:03.778775 | orchestrator | 21:32:03.774 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.778781 | orchestrator | 21:32:03.774 STDOUT terraform:  + volume_id = (known after apply) 2025-03-22 21:32:03.778789 | orchestrator | 21:32:03.774 STDOUT terraform:  } 2025-03-22 21:32:03.778794 | orchestrator | 21:32:03.774 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[3] will be created 2025-03-22 21:32:03.778799 | orchestrator | 21:32:03.774 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-22 21:32:03.778804 | orchestrator | 21:32:03.774 STDOUT terraform:  + device = (known after apply) 2025-03-22 21:32:03.778809 | orchestrator | 21:32:03.774 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.778813 | orchestrator | 21:32:03.774 STDOUT terraform:  + instance_id = (known after apply) 2025-03-22 21:32:03.778818 | orchestrator | 21:32:03.774 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.778823 | orchestrator | 21:32:03.775 STDOUT terraform:  + volume_id = (known after apply) 2025-03-22 21:32:03.778831 | orchestrator | 21:32:03.775 STDOUT terraform:  } 2025-03-22 21:32:03.778847 | orchestrator | 21:32:03.775 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[4] will be created 2025-03-22 21:32:03.778853 | orchestrator | 21:32:03.775 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-22 21:32:03.778859 | orchestrator | 21:32:03.775 STDOUT terraform:  + device = (known after apply) 2025-03-22 21:32:03.778864 | orchestrator | 21:32:03.775 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.778869 | orchestrator | 21:32:03.775 STDOUT terraform:  + instance_id = (known after apply) 2025-03-22 21:32:03.778873 | orchestrator | 21:32:03.775 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.778878 | orchestrator | 21:32:03.775 STDOUT terraform:  + volume_id = (known after apply) 2025-03-22 21:32:03.778883 | orchestrator | 21:32:03.775 STDOUT terraform:  } 2025-03-22 21:32:03.778888 | orchestrator | 21:32:03.775 STDOUT terraform:  # 
openstack_compute_volume_attach_v2.node_volume_attachment[5] will be created 2025-03-22 21:32:03.778893 | orchestrator | 21:32:03.775 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-22 21:32:03.778898 | orchestrator | 21:32:03.775 STDOUT terraform:  + device = (known after apply) 2025-03-22 21:32:03.778903 | orchestrator | 21:32:03.775 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.778908 | orchestrator | 21:32:03.775 STDOUT terraform:  + instance_id = (known after apply) 2025-03-22 21:32:03.778913 | orchestrator | 21:32:03.775 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.778918 | orchestrator | 21:32:03.775 STDOUT terraform:  + volume_id = (known after apply) 2025-03-22 21:32:03.778923 | orchestrator | 21:32:03.775 STDOUT terraform:  } 2025-03-22 21:32:03.778928 | orchestrator | 21:32:03.775 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[6] will be created 2025-03-22 21:32:03.778935 | orchestrator | 21:32:03.775 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-22 21:32:03.778941 | orchestrator | 21:32:03.775 STDOUT terraform:  + device = (known after apply) 2025-03-22 21:32:03.778946 | orchestrator | 21:32:03.775 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.778950 | orchestrator | 21:32:03.775 STDOUT terraform:  + instance_id = (known after apply) 2025-03-22 21:32:03.778958 | orchestrator | 21:32:03.775 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.778963 | orchestrator | 21:32:03.775 STDOUT terraform:  + volume_id = (known after apply) 2025-03-22 21:32:03.778968 | orchestrator | 21:32:03.775 STDOUT terraform:  } 2025-03-22 21:32:03.778973 | orchestrator | 21:32:03.775 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[7] will be created 2025-03-22 21:32:03.778978 | orchestrator | 21:32:03.775 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-22 21:32:03.778983 | orchestrator | 21:32:03.775 STDOUT terraform:  + device = (known after apply) 2025-03-22 21:32:03.778988 | orchestrator | 21:32:03.775 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.778992 | orchestrator | 21:32:03.775 STDOUT terraform:  + instance_id = (known after apply) 2025-03-22 21:32:03.779000 | orchestrator | 21:32:03.775 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.779005 | orchestrator | 21:32:03.775 STDOUT terraform:  + volume_id = (known after apply) 2025-03-22 21:32:03.779010 | orchestrator | 21:32:03.775 STDOUT terraform:  } 2025-03-22 21:32:03.779015 | orchestrator | 21:32:03.775 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[8] will be created 2025-03-22 21:32:03.779020 | orchestrator | 21:32:03.775 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-22 21:32:03.779025 | orchestrator | 21:32:03.776 STDOUT terraform:  + device = (known after apply) 2025-03-22 21:32:03.779033 | orchestrator | 21:32:03.776 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.779038 | orchestrator | 21:32:03.776 STDOUT terraform:  + instance_id = (known after apply) 2025-03-22 21:32:03.779043 | orchestrator | 21:32:03.776 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.779047 | orchestrator | 21:32:03.776 STDOUT terraform:  + volume_id = (known after apply) 2025-03-22 
21:32:03.779052 | orchestrator | 21:32:03.776 STDOUT terraform:  } 2025-03-22 21:32:03.779057 | orchestrator | 21:32:03.776 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[9] will be created 2025-03-22 21:32:03.779062 | orchestrator | 21:32:03.776 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-22 21:32:03.779067 | orchestrator | 21:32:03.776 STDOUT terraform:  + device = (known after apply) 2025-03-22 21:32:03.779072 | orchestrator | 21:32:03.776 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.779077 | orchestrator | 21:32:03.776 STDOUT terraform:  + instance_id = (known after apply) 2025-03-22 21:32:03.779082 | orchestrator | 21:32:03.776 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.779087 | orchestrator | 21:32:03.776 STDOUT terraform:  + volume_id = (known after apply) 2025-03-22 21:32:03.779091 | orchestrator | 21:32:03.776 STDOUT terraform:  } 2025-03-22 21:32:03.779097 | orchestrator | 21:32:03.776 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[10] will be created 2025-03-22 21:32:03.779102 | orchestrator | 21:32:03.776 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-22 21:32:03.779111 | orchestrator | 21:32:03.776 STDOUT terraform:  + device = (known after apply) 2025-03-22 21:32:03.779116 | orchestrator | 21:32:03.776 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.779121 | orchestrator | 21:32:03.776 STDOUT terraform:  + instance_id = (known after apply) 2025-03-22 21:32:03.779126 | orchestrator | 21:32:03.776 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.779131 | orchestrator | 21:32:03.776 STDOUT terraform:  + volume_id = (known after apply) 2025-03-22 21:32:03.779136 | orchestrator | 21:32:03.776 STDOUT terraform:  } 2025-03-22 21:32:03.779141 | orchestrator | 21:32:03.776 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[11] will be created 2025-03-22 21:32:03.779146 | orchestrator | 21:32:03.776 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-22 21:32:03.779151 | orchestrator | 21:32:03.776 STDOUT terraform:  + device = (known after apply) 2025-03-22 21:32:03.779156 | orchestrator | 21:32:03.776 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.779161 | orchestrator | 21:32:03.776 STDOUT terraform:  + instance_id = (known after apply) 2025-03-22 21:32:03.779165 | orchestrator | 21:32:03.776 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.779170 | orchestrator | 21:32:03.776 STDOUT terraform:  + volume_id = (known after apply) 2025-03-22 21:32:03.779175 | orchestrator | 21:32:03.776 STDOUT terraform:  } 2025-03-22 21:32:03.779180 | orchestrator | 21:32:03.776 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[12] will be created 2025-03-22 21:32:03.779185 | orchestrator | 21:32:03.776 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-22 21:32:03.779191 | orchestrator | 21:32:03.776 STDOUT terraform:  + device = (known after apply) 2025-03-22 21:32:03.779196 | orchestrator | 21:32:03.776 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.779201 | orchestrator | 21:32:03.776 STDOUT terraform:  + instance_id = (known after apply) 2025-03-22 21:32:03.779206 | orchestrator | 21:32:03.777 STDOUT terraform:  + region = 
(known after apply) 2025-03-22 21:32:03.779211 | orchestrator | 21:32:03.777 STDOUT terraform:  + volume_id = (known after apply) 2025-03-22 21:32:03.779216 | orchestrator | 21:32:03.777 STDOUT terraform:  } 2025-03-22 21:32:03.779221 | orchestrator | 21:32:03.777 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[13] will be created 2025-03-22 21:32:03.779231 | orchestrator | 21:32:03.777 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-22 21:32:03.779236 | orchestrator | 21:32:03.777 STDOUT terraform:  + device = (known after apply) 2025-03-22 21:32:03.779243 | orchestrator | 21:32:03.777 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.779248 | orchestrator | 21:32:03.777 STDOUT terraform:  + instance_id = (known after apply) 2025-03-22 21:32:03.779253 | orchestrator | 21:32:03.777 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.779259 | orchestrator | 21:32:03.777 STDOUT terraform:  + volume_id = (known after apply) 2025-03-22 21:32:03.779267 | orchestrator | 21:32:03.777 STDOUT terraform:  } 2025-03-22 21:32:03.779272 | orchestrator | 21:32:03.777 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[14] will be created 2025-03-22 21:32:03.779277 | orchestrator | 21:32:03.777 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-22 21:32:03.779281 | orchestrator | 21:32:03.777 STDOUT terraform:  + device = (known after apply) 2025-03-22 21:32:03.779286 | orchestrator | 21:32:03.777 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.779291 | orchestrator | 21:32:03.777 STDOUT terraform:  + instance_id = (known after apply) 2025-03-22 21:32:03.779296 | orchestrator | 21:32:03.777 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.779301 | orchestrator | 21:32:03.777 STDOUT terraform:  + volume_id = (known after apply) 2025-03-22 21:32:03.779308 | orchestrator | 21:32:03.777 STDOUT terraform:  } 2025-03-22 21:32:03.779313 | orchestrator | 21:32:03.777 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[15] will be created 2025-03-22 21:32:03.779319 | orchestrator | 21:32:03.777 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-22 21:32:03.779324 | orchestrator | 21:32:03.777 STDOUT terraform:  + device = (known after apply) 2025-03-22 21:32:03.779355 | orchestrator | 21:32:03.777 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.779361 | orchestrator | 21:32:03.777 STDOUT terraform:  + instance_id = (known after apply) 2025-03-22 21:32:03.779366 | orchestrator | 21:32:03.777 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.779371 | orchestrator | 21:32:03.777 STDOUT terraform:  + volume_id = (known after apply) 2025-03-22 21:32:03.779375 | orchestrator | 21:32:03.777 STDOUT terraform:  } 2025-03-22 21:32:03.779380 | orchestrator | 21:32:03.777 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[16] will be created 2025-03-22 21:32:03.779385 | orchestrator | 21:32:03.777 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-22 21:32:03.779390 | orchestrator | 21:32:03.777 STDOUT terraform:  + device = (known after apply) 2025-03-22 21:32:03.779395 | orchestrator | 21:32:03.777 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.779400 | orchestrator | 21:32:03.777 
STDOUT terraform:  + instance_id = (known after apply) 2025-03-22 21:32:03.779404 | orchestrator | 21:32:03.777 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.779410 | orchestrator | 21:32:03.777 STDOUT terraform:  + volume_id = (known after apply) 2025-03-22 21:32:03.779414 | orchestrator | 21:32:03.777 STDOUT terraform:  } 2025-03-22 21:32:03.779419 | orchestrator | 21:32:03.777 STDOUT terraform:  # openstack_compute_volume_attach_v2.node_volume_attachment[17] will be created 2025-03-22 21:32:03.779424 | orchestrator | 21:32:03.777 STDOUT terraform:  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" { 2025-03-22 21:32:03.779429 | orchestrator | 21:32:03.778 STDOUT terraform:  + device = (known after apply) 2025-03-22 21:32:03.779434 | orchestrator | 21:32:03.778 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.779442 | orchestrator | 21:32:03.778 STDOUT terraform:  + instance_id = (known after apply) 2025-03-22 21:32:03.779449 | orchestrator | 21:32:03.778 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.779454 | orchestrator | 21:32:03.778 STDOUT terraform:  + volume_id = (known after apply) 2025-03-22 21:32:03.779459 | orchestrator | 21:32:03.778 STDOUT terraform:  } 2025-03-22 21:32:03.779464 | orchestrator | 21:32:03.778 STDOUT terraform:  # openstack_networking_floatingip_associate_v2.manager_floating_ip_association will be created 2025-03-22 21:32:03.779469 | orchestrator | 21:32:03.778 STDOUT terraform:  + resource "openstack_networking_floatingip_associate_v2" "manager_floating_ip_association" { 2025-03-22 21:32:03.779474 | orchestrator | 21:32:03.778 STDOUT terraform:  + fixed_ip = (known after apply) 2025-03-22 21:32:03.779479 | orchestrator | 21:32:03.778 STDOUT terraform:  + floating_ip = (known after apply) 2025-03-22 21:32:03.779484 | orchestrator | 21:32:03.778 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.779489 | orchestrator | 21:32:03.778 STDOUT terraform:  + port_id = (known after apply) 2025-03-22 21:32:03.779496 | orchestrator | 21:32:03.778 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.779501 | orchestrator | 21:32:03.778 STDOUT terraform:  } 2025-03-22 21:32:03.779506 | orchestrator | 21:32:03.778 STDOUT terraform:  # openstack_networking_floatingip_v2.manager_floating_ip will be created 2025-03-22 21:32:03.779511 | orchestrator | 21:32:03.778 STDOUT terraform:  + resource "openstack_networking_floatingip_v2" "manager_floating_ip" { 2025-03-22 21:32:03.779515 | orchestrator | 21:32:03.778 STDOUT terraform:  + address = (known after apply) 2025-03-22 21:32:03.779520 | orchestrator | 21:32:03.778 STDOUT terraform:  + all_tags = (known after apply) 2025-03-22 21:32:03.779525 | orchestrator | 21:32:03.778 STDOUT terraform:  + dns_domain = (known after apply) 2025-03-22 21:32:03.779530 | orchestrator | 21:32:03.778 STDOUT terraform:  + dns_name = (known after apply) 2025-03-22 21:32:03.779535 | orchestrator | 21:32:03.778 STDOUT terraform:  + fixed_ip = (known after apply) 2025-03-22 21:32:03.779540 | orchestrator | 21:32:03.778 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.779544 | orchestrator | 21:32:03.778 STDOUT terraform:  + pool = "public" 2025-03-22 21:32:03.779549 | orchestrator | 21:32:03.778 STDOUT terraform:  + port_id = (known after apply) 2025-03-22 21:32:03.779554 | orchestrator | 21:32:03.778 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.779559 | orchestrator | 21:32:03.778 STDOUT 
terraform:  + subnet_id = (known after apply) 2025-03-22 21:32:03.779563 | orchestrator | 21:32:03.778 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-22 21:32:03.779569 | orchestrator | 21:32:03.778 STDOUT terraform:  } 2025-03-22 21:32:03.779573 | orchestrator | 21:32:03.778 STDOUT terraform:  # openstack_networking_network_v2.net_management will be created 2025-03-22 21:32:03.779578 | orchestrator | 21:32:03.778 STDOUT terraform:  + resource "openstack_networking_network_v2" "net_management" { 2025-03-22 21:32:03.779583 | orchestrator | 21:32:03.778 STDOUT terraform:  + admin_state_up = (known after apply) 2025-03-22 21:32:03.779588 | orchestrator | 21:32:03.778 STDOUT terraform:  + all_tags = (known after apply) 2025-03-22 21:32:03.779595 | orchestrator | 21:32:03.778 STDOUT terraform:  + availability_zone_hints = [ 2025-03-22 21:32:03.779601 | orchestrator | 21:32:03.778 STDOUT terraform:  + "nova", 2025-03-22 21:32:03.779605 | orchestrator | 21:32:03.778 STDOUT terraform:  ] 2025-03-22 21:32:03.779610 | orchestrator | 21:32:03.778 STDOUT terraform:  + dns_domain = (known after apply) 2025-03-22 21:32:03.779615 | orchestrator | 21:32:03.778 STDOUT terraform:  + external = (known after apply) 2025-03-22 21:32:03.779620 | orchestrator | 21:32:03.778 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.779625 | orchestrator | 21:32:03.779 STDOUT terraform:  + mtu = (known after apply) 2025-03-22 21:32:03.779630 | orchestrator | 21:32:03.779 STDOUT terraform:  + name = "net-testbed-management" 2025-03-22 21:32:03.779635 | orchestrator | 21:32:03.779 STDOUT terraform:  + port_security_enabled = (known after apply) 2025-03-22 21:32:03.779642 | orchestrator | 21:32:03.779 STDOUT terraform:  + qos_policy_id = (known after apply) 2025-03-22 21:32:03.781299 | orchestrator | 21:32:03.779 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.781309 | orchestrator | 21:32:03.779 STDOUT terraform:  + shared = (known after apply) 2025-03-22 21:32:03.781314 | orchestrator | 21:32:03.779 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-22 21:32:03.781319 | orchestrator | 21:32:03.779 STDOUT terraform:  + transparent_vlan = (known after apply) 2025-03-22 21:32:03.781324 | orchestrator | 21:32:03.779 STDOUT terraform:  + segments (known after apply) 2025-03-22 21:32:03.781350 | orchestrator | 21:32:03.779 STDOUT terraform:  } 2025-03-22 21:32:03.781355 | orchestrator | 21:32:03.779 STDOUT terraform:  # openstack_networking_port_v2.manager_port_management will be created 2025-03-22 21:32:03.781360 | orchestrator | 21:32:03.779 STDOUT terraform:  + resource "openstack_networking_port_v2" "manager_port_management" { 2025-03-22 21:32:03.781365 | orchestrator | 21:32:03.779 STDOUT terraform:  + admin_state_up = (known after apply) 2025-03-22 21:32:03.781370 | orchestrator | 21:32:03.779 STDOUT terraform:  + all_fixed_ips = (known after apply) 2025-03-22 21:32:03.781375 | orchestrator | 21:32:03.779 STDOUT terraform:  + all_security_group_ids = (known after apply) 2025-03-22 21:32:03.781379 | orchestrator | 21:32:03.779 STDOUT terraform:  + all_tags = (known after apply) 2025-03-22 21:32:03.781384 | orchestrator | 21:32:03.779 STDOUT terraform:  + device_id = (known after apply) 2025-03-22 21:32:03.781389 | orchestrator | 21:32:03.779 STDOUT terraform:  + device_owner = (known after apply) 2025-03-22 21:32:03.781394 | orchestrator | 21:32:03.779 STDOUT terraform:  + dns_assignment = (known after apply) 2025-03-22 21:32:03.781402 | orchestrator | 
21:32:03.779 STDOUT terraform:  + dns_name = (known after apply) 2025-03-22 21:32:03.781407 | orchestrator | 21:32:03.779 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.781412 | orchestrator | 21:32:03.779 STDOUT terraform:  + mac_address = (known after apply) 2025-03-22 21:32:03.781420 | orchestrator | 21:32:03.779 STDOUT terraform:  + network_id = (known after apply) 2025-03-22 21:32:03.781431 | orchestrator | 21:32:03.779 STDOUT terraform:  + port_security_enabled = (known after apply) 2025-03-22 21:32:03.781435 | orchestrator | 21:32:03.779 STDOUT terraform:  + qos_policy_id = (known after apply) 2025-03-22 21:32:03.781440 | orchestrator | 21:32:03.779 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.781445 | orchestrator | 21:32:03.779 STDOUT terraform:  + security_group_ids = (known after apply) 2025-03-22 21:32:03.781450 | orchestrator | 21:32:03.779 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-22 21:32:03.781455 | orchestrator | 21:32:03.779 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.781460 | orchestrator | 21:32:03.779 STDOUT terraform:  + ip_address = "192.168.112.0/20" 2025-03-22 21:32:03.781465 | orchestrator | 21:32:03.779 STDOUT terraform:  } 2025-03-22 21:32:03.781470 | orchestrator | 21:32:03.779 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.781475 | orchestrator | 21:32:03.779 STDOUT terraform:  + ip_address = "192.168.16.8/20" 2025-03-22 21:32:03.781479 | orchestrator | 21:32:03.780 STDOUT terraform:  } 2025-03-22 21:32:03.781484 | orchestrator | 21:32:03.780 STDOUT terraform:  + binding (known after apply) 2025-03-22 21:32:03.781489 | orchestrator | 21:32:03.780 STDOUT terraform:  + fixed_ip { 2025-03-22 21:32:03.781494 | orchestrator | 21:32:03.780 STDOUT terraform:  + ip_address = "192.168.16.5" 2025-03-22 21:32:03.781499 | orchestrator | 21:32:03.780 STDOUT terraform:  + subnet_id = (known after apply) 2025-03-22 21:32:03.781504 | orchestrator | 21:32:03.780 STDOUT terraform:  } 2025-03-22 21:32:03.781509 | orchestrator | 21:32:03.780 STDOUT terraform:  } 2025-03-22 21:32:03.781514 | orchestrator | 21:32:03.780 STDOUT terraform:  # openstack_networking_port_v2.node_port_management[0] will be created 2025-03-22 21:32:03.781519 | orchestrator | 21:32:03.780 STDOUT terraform:  + resource "openstack_networking_port_v2" "node_port_management" { 2025-03-22 21:32:03.781524 | orchestrator | 21:32:03.780 STDOUT terraform:  + admin_state_up = (known after apply) 2025-03-22 21:32:03.781529 | orchestrator | 21:32:03.780 STDOUT terraform:  + all_fixed_ips = (known after apply) 2025-03-22 21:32:03.781534 | orchestrator | 21:32:03.780 STDOUT terraform:  + all_security_group_ids = (known after apply) 2025-03-22 21:32:03.781539 | orchestrator | 21:32:03.780 STDOUT terraform:  + all_tags = (known after apply) 2025-03-22 21:32:03.781544 | orchestrator | 21:32:03.780 STDOUT terraform:  + device_id = (known after apply) 2025-03-22 21:32:03.781549 | orchestrator | 21:32:03.780 STDOUT terraform:  + device_owner = (known after apply) 2025-03-22 21:32:03.781553 | orchestrator | 21:32:03.780 STDOUT terraform:  + dns_assignment = (known after apply) 2025-03-22 21:32:03.781558 | orchestrator | 21:32:03.780 STDOUT terraform:  + dns_name = (known after apply) 2025-03-22 21:32:03.781563 | orchestrator | 21:32:03.780 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.781568 | orchestrator | 21:32:03.780 STDOUT terraform:  + mac_address = (known after apply) 2025-03-22 21:32:03.781573 | 
orchestrator | 21:32:03.780 STDOUT terraform:  + network_id = (known after apply) 2025-03-22 21:32:03.781581 | orchestrator | 21:32:03.780 STDOUT terraform:  + port_security_enabled = (known after apply) 2025-03-22 21:32:03.781586 | orchestrator | 21:32:03.780 STDOUT terraform:  + qos_policy_id = (known after apply) 2025-03-22 21:32:03.781591 | orchestrator | 21:32:03.780 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.781599 | orchestrator | 21:32:03.780 STDOUT terraform:  + security_group_ids = (known after apply) 2025-03-22 21:32:03.786038 | orchestrator | 21:32:03.780 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-22 21:32:03.786054 | orchestrator | 21:32:03.780 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.786060 | orchestrator | 21:32:03.780 STDOUT terraform:  + ip_address = "192.168.112.0/20" 2025-03-22 21:32:03.786066 | orchestrator | 21:32:03.780 STDOUT terraform:  } 2025-03-22 21:32:03.786071 | orchestrator | 21:32:03.780 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.786075 | orchestrator | 21:32:03.781 STDOUT terraform:  + ip_address = "192.168.16.254/20" 2025-03-22 21:32:03.786081 | orchestrator | 21:32:03.781 STDOUT terraform:  } 2025-03-22 21:32:03.786086 | orchestrator | 21:32:03.781 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.786090 | orchestrator | 21:32:03.781 STDOUT terraform:  + ip_address = "192.168.16.8/20" 2025-03-22 21:32:03.786095 | orchestrator | 21:32:03.781 STDOUT terraform:  } 2025-03-22 21:32:03.786105 | orchestrator | 21:32:03.781 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.786110 | orchestrator | 21:32:03.781 STDOUT terraform:  + ip_address = "192.168.16.9/20" 2025-03-22 21:32:03.786115 | orchestrator | 21:32:03.781 STDOUT terraform:  } 2025-03-22 21:32:03.786123 | orchestrator | 21:32:03.781 STDOUT terraform:  + binding (known after apply) 2025-03-22 21:32:03.786128 | orchestrator | 21:32:03.781 STDOUT terraform:  + fixed_ip { 2025-03-22 21:32:03.786133 | orchestrator | 21:32:03.781 STDOUT terraform:  + ip_address = "192.168.16.10" 2025-03-22 21:32:03.786138 | orchestrator | 21:32:03.781 STDOUT terraform:  + subnet_id = (known after apply) 2025-03-22 21:32:03.786143 | orchestrator | 21:32:03.781 STDOUT terraform:  } 2025-03-22 21:32:03.786148 | orchestrator | 21:32:03.781 STDOUT terraform:  } 2025-03-22 21:32:03.786153 | orchestrator | 21:32:03.781 STDOUT terraform:  # openstack_networking_port_v2.node_port_management[1] will be created 2025-03-22 21:32:03.786159 | orchestrator | 21:32:03.781 STDOUT terraform:  + resource "openstack_networking_port_v2" "node_port_management" { 2025-03-22 21:32:03.786164 | orchestrator | 21:32:03.781 STDOUT terraform:  + admin_state_up = (known after apply) 2025-03-22 21:32:03.786169 | orchestrator | 21:32:03.781 STDOUT terraform:  + all_fixed_ips = (known after apply) 2025-03-22 21:32:03.786173 | orchestrator | 21:32:03.781 STDOUT terraform:  + all_security_group_ids = (known after apply) 2025-03-22 21:32:03.786183 | orchestrator | 21:32:03.781 STDOUT terraform:  + all_tags = (known after apply) 2025-03-22 21:32:03.786188 | orchestrator | 21:32:03.781 STDOUT terraform:  + device_id = (known after apply) 2025-03-22 21:32:03.786193 | orchestrator | 21:32:03.781 STDOUT terraform:  + device_owner = (known after apply) 2025-03-22 21:32:03.786204 | orchestrator | 21:32:03.782 STDOUT terraform:  + dns_assignment = (known after apply) 2025-03-22 21:32:03.786210 | orchestrator | 21:32:03.782 STDOUT terraform:  + 
dns_name = (known after apply) 2025-03-22 21:32:03.786215 | orchestrator | 21:32:03.782 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.786220 | orchestrator | 21:32:03.782 STDOUT terraform:  + mac_address = (known after apply) 2025-03-22 21:32:03.786225 | orchestrator | 21:32:03.782 STDOUT terraform:  + network_id = (known after apply) 2025-03-22 21:32:03.786230 | orchestrator | 21:32:03.782 STDOUT terraform:  + port_security_enabled = (known after apply) 2025-03-22 21:32:03.786234 | orchestrator | 21:32:03.782 STDOUT terraform:  + qos_policy_id = (known after apply) 2025-03-22 21:32:03.786239 | orchestrator | 21:32:03.782 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.786244 | orchestrator | 21:32:03.782 STDOUT terraform:  + security_group_ids = (known after apply) 2025-03-22 21:32:03.786249 | orchestrator | 21:32:03.782 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-22 21:32:03.786254 | orchestrator | 21:32:03.782 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.786259 | orchestrator | 21:32:03.782 STDOUT terraform:  + ip_address = "192.168.112.0/20" 2025-03-22 21:32:03.786264 | orchestrator | 21:32:03.782 STDOUT terraform:  } 2025-03-22 21:32:03.786269 | orchestrator | 21:32:03.782 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.786274 | orchestrator | 21:32:03.782 STDOUT terraform:  + ip_address = "192.168.16.254/20" 2025-03-22 21:32:03.786279 | orchestrator | 21:32:03.782 STDOUT terraform:  } 2025-03-22 21:32:03.786284 | orchestrator | 21:32:03.782 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.786289 | orchestrator | 21:32:03.782 STDOUT terraform:  + ip_address = "192.168.16.8/20" 2025-03-22 21:32:03.786293 | orchestrator | 21:32:03.782 STDOUT terraform:  } 2025-03-22 21:32:03.786298 | orchestrator | 21:32:03.782 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.786303 | orchestrator | 21:32:03.782 STDOUT terraform:  + ip_address = "192.168.16.9/20" 2025-03-22 21:32:03.786308 | orchestrator | 21:32:03.782 STDOUT terraform:  } 2025-03-22 21:32:03.786317 | orchestrator | 21:32:03.782 STDOUT terraform:  + binding (known after apply) 2025-03-22 21:32:03.786322 | orchestrator | 21:32:03.782 STDOUT terraform:  + fixed_ip { 2025-03-22 21:32:03.786334 | orchestrator | 21:32:03.782 STDOUT terraform:  + ip_address = "192.168.16.11" 2025-03-22 21:32:03.786339 | orchestrator | 21:32:03.782 STDOUT terraform:  + subnet_id = (known after apply) 2025-03-22 21:32:03.786344 | orchestrator | 21:32:03.782 STDOUT terraform:  } 2025-03-22 21:32:03.786349 | orchestrator | 21:32:03.782 STDOUT terraform:  } 2025-03-22 21:32:03.786354 | orchestrator | 21:32:03.782 STDOUT terraform:  # openstack_networking_port_v2.node_port_management[2] will be created 2025-03-22 21:32:03.786359 | orchestrator | 21:32:03.782 STDOUT terraform:  + resource "openstack_networking_port_v2" "node_port_management" { 2025-03-22 21:32:03.786366 | orchestrator | 21:32:03.782 STDOUT terraform:  + admin_state_up = (known after apply) 2025-03-22 21:32:03.786371 | orchestrator | 21:32:03.782 STDOUT terraform:  + all_fixed_ips = (known after apply) 2025-03-22 21:32:03.786376 | orchestrator | 21:32:03.782 STDOUT terraform:  + all_security_group_ids = (known after apply) 2025-03-22 21:32:03.786381 | orchestrator | 21:32:03.782 STDOUT terraform:  + all_tags = (known after apply) 2025-03-22 21:32:03.786386 | orchestrator | 21:32:03.782 STDOUT terraform:  + device_id = (known after apply) 2025-03-22 21:32:03.786390 | 
orchestrator | 21:32:03.782 STDOUT terraform:  + device_owner = (known after apply) 2025-03-22 21:32:03.786395 | orchestrator | 21:32:03.782 STDOUT terraform:  + dns_assignment = (known after apply) 2025-03-22 21:32:03.786400 | orchestrator | 21:32:03.783 STDOUT terraform:  + dns_name = (known after apply) 2025-03-22 21:32:03.786406 | orchestrator | 21:32:03.783 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.786411 | orchestrator | 21:32:03.783 STDOUT terraform:  + mac_address = (known after apply) 2025-03-22 21:32:03.786416 | orchestrator | 21:32:03.783 STDOUT terraform:  + network_id = (known after apply) 2025-03-22 21:32:03.786421 | orchestrator | 21:32:03.783 STDOUT terraform:  + port_security_enabled = (known after apply) 2025-03-22 21:32:03.786426 | orchestrator | 21:32:03.783 STDOUT terraform:  + qos_policy_id = (known after apply) 2025-03-22 21:32:03.786433 | orchestrator | 21:32:03.783 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.786438 | orchestrator | 21:32:03.783 STDOUT terraform:  + security_group_ids = (known after apply) 2025-03-22 21:32:03.786443 | orchestrator | 21:32:03.783 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-22 21:32:03.786447 | orchestrator | 21:32:03.783 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.786452 | orchestrator | 21:32:03.783 STDOUT terraform:  + ip_address = "192.168.112.0/20" 2025-03-22 21:32:03.786457 | orchestrator | 21:32:03.783 STDOUT terraform:  } 2025-03-22 21:32:03.786462 | orchestrator | 21:32:03.783 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.786467 | orchestrator | 21:32:03.783 STDOUT terraform:  + ip_address = "192.168.16.254/20" 2025-03-22 21:32:03.786472 | orchestrator | 21:32:03.783 STDOUT terraform:  } 2025-03-22 21:32:03.786477 | orchestrator | 21:32:03.783 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.786482 | orchestrator | 21:32:03.783 STDOUT terraform:  + ip_address = "192.168.16.8/20" 2025-03-22 21:32:03.786486 | orchestrator | 21:32:03.783 STDOUT terraform:  } 2025-03-22 21:32:03.786491 | orchestrator | 21:32:03.783 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.786496 | orchestrator | 21:32:03.783 STDOUT terraform:  + ip_address = "192.168.16.9/20" 2025-03-22 21:32:03.786501 | orchestrator | 21:32:03.783 STDOUT terraform:  } 2025-03-22 21:32:03.786506 | orchestrator | 21:32:03.783 STDOUT terraform:  + binding (known after apply) 2025-03-22 21:32:03.786511 | orchestrator | 21:32:03.783 STDOUT terraform:  + fixed_ip { 2025-03-22 21:32:03.786516 | orchestrator | 21:32:03.783 STDOUT terraform:  + ip_address = "192.168.16.12" 2025-03-22 21:32:03.786528 | orchestrator | 21:32:03.783 STDOUT terraform:  + subnet_id = (known after apply) 2025-03-22 21:32:03.786534 | orchestrator | 21:32:03.783 STDOUT terraform:  } 2025-03-22 21:32:03.786539 | orchestrator | 21:32:03.783 STDOUT terraform:  } 2025-03-22 21:32:03.786543 | orchestrator | 21:32:03.783 STDOUT terraform:  # openstack_networking_port_v2.node_port_management[3] will be created 2025-03-22 21:32:03.786548 | orchestrator | 21:32:03.783 STDOUT terraform:  + resource "openstack_networking_port_v2" "node_port_management" { 2025-03-22 21:32:03.786553 | orchestrator | 21:32:03.783 STDOUT terraform:  + admin_state_up = (known after apply) 2025-03-22 21:32:03.786558 | orchestrator | 21:32:03.783 STDOUT terraform:  + all_fixed_ips = (known after apply) 2025-03-22 21:32:03.786563 | orchestrator | 21:32:03.783 STDOUT terraform:  + all_security_group_ids = 
(known after apply) 2025-03-22 21:32:03.786568 | orchestrator | 21:32:03.783 STDOUT terraform:  + all_tags = (known after apply) 2025-03-22 21:32:03.786572 | orchestrator | 21:32:03.783 STDOUT terraform:  + device_id = (known after apply) 2025-03-22 21:32:03.786577 | orchestrator | 21:32:03.783 STDOUT terraform:  + device_owner = (known after apply) 2025-03-22 21:32:03.786582 | orchestrator | 21:32:03.783 STDOUT terraform:  + dns_assignment = (known after apply) 2025-03-22 21:32:03.786587 | orchestrator | 21:32:03.783 STDOUT terraform:  + dns_name = (known after apply) 2025-03-22 21:32:03.786591 | orchestrator | 21:32:03.783 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.786596 | orchestrator | 21:32:03.783 STDOUT terraform:  + mac_address = (known after apply) 2025-03-22 21:32:03.786601 | orchestrator | 21:32:03.783 STDOUT terraform:  + network_id = (known after apply) 2025-03-22 21:32:03.786606 | orchestrator | 21:32:03.784 STDOUT terraform:  + port_security_enabled = (known after apply) 2025-03-22 21:32:03.786611 | orchestrator | 21:32:03.784 STDOUT terraform:  + qos_policy_id = (known after apply) 2025-03-22 21:32:03.786615 | orchestrator | 21:32:03.784 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.786620 | orchestrator | 21:32:03.784 STDOUT terraform:  + security_group_ids = (known after apply) 2025-03-22 21:32:03.786625 | orchestrator | 21:32:03.784 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-22 21:32:03.786630 | orchestrator | 21:32:03.784 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.786635 | orchestrator | 21:32:03.784 STDOUT terraform:  2025-03-22 21:32:03.786640 | orchestrator | 21:32:03.784 STDOUT terraform:  + ip_address = "192.168.112.0/20" 2025-03-22 21:32:03.786644 | orchestrator | 21:32:03.784 STDOUT terraform:  } 2025-03-22 21:32:03.786649 | orchestrator | 21:32:03.784 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.786654 | orchestrator | 21:32:03.784 STDOUT terraform:  + ip_address = "192.168.16.254/20" 2025-03-22 21:32:03.786659 | orchestrator | 21:32:03.784 STDOUT terraform:  } 2025-03-22 21:32:03.786682 | orchestrator | 21:32:03.784 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.786687 | orchestrator | 21:32:03.784 STDOUT terraform:  + ip_address = "192.168.16.8/20" 2025-03-22 21:32:03.786695 | orchestrator | 21:32:03.784 STDOUT terraform:  } 2025-03-22 21:32:03.786700 | orchestrator | 21:32:03.784 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.786705 | orchestrator | 21:32:03.784 STDOUT terraform:  + ip_address = "192.168.16.9/20" 2025-03-22 21:32:03.786710 | orchestrator | 21:32:03.784 STDOUT terraform:  } 2025-03-22 21:32:03.786714 | orchestrator | 21:32:03.784 STDOUT terraform:  + binding (known after apply) 2025-03-22 21:32:03.786719 | orchestrator | 21:32:03.784 STDOUT terraform:  + fixed_ip { 2025-03-22 21:32:03.786724 | orchestrator | 21:32:03.784 STDOUT terraform:  + ip_address = "192.168.16.13" 2025-03-22 21:32:03.786729 | orchestrator | 21:32:03.784 STDOUT terraform:  + subnet_id = (known after apply) 2025-03-22 21:32:03.786734 | orchestrator | 21:32:03.784 STDOUT terraform:  } 2025-03-22 21:32:03.786741 | orchestrator | 21:32:03.784 STDOUT terraform:  } 2025-03-22 21:32:03.786749 | orchestrator | 21:32:03.784 STDOUT terraform:  # openstack_networking_port_v2.node_port_management[4] will be created 2025-03-22 21:32:03.786754 | orchestrator | 21:32:03.784 STDOUT terraform:  + resource "openstack_networking_port_v2" 
"node_port_management" { 2025-03-22 21:32:03.786759 | orchestrator | 21:32:03.784 STDOUT terraform:  + admin_state_up = (known after apply) 2025-03-22 21:32:03.786764 | orchestrator | 21:32:03.784 STDOUT terraform:  + all_fixed_ips = (known after apply) 2025-03-22 21:32:03.786769 | orchestrator | 21:32:03.784 STDOUT terraform:  + all_security_group_ids = (known after apply) 2025-03-22 21:32:03.786773 | orchestrator | 21:32:03.784 STDOUT terraform:  + all_tags = (known after apply) 2025-03-22 21:32:03.786778 | orchestrator | 21:32:03.784 STDOUT terraform:  + device_id = (known after apply) 2025-03-22 21:32:03.786783 | orchestrator | 21:32:03.784 STDOUT terraform:  + device_owner = (known after apply) 2025-03-22 21:32:03.786788 | orchestrator | 21:32:03.784 STDOUT terraform:  + dns_assignment = (known after apply) 2025-03-22 21:32:03.786795 | orchestrator | 21:32:03.784 STDOUT terraform:  + dns_name = (known after apply) 2025-03-22 21:32:03.786800 | orchestrator | 21:32:03.784 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.786805 | orchestrator | 21:32:03.784 STDOUT terraform:  + mac_address = (known after apply) 2025-03-22 21:32:03.786810 | orchestrator | 21:32:03.784 STDOUT terraform:  + network_id = (known after apply) 2025-03-22 21:32:03.786814 | orchestrator | 21:32:03.784 STDOUT terraform:  + port_security_enabled = (known after apply) 2025-03-22 21:32:03.786819 | orchestrator | 21:32:03.784 STDOUT terraform:  + qos_policy_id = (known after apply) 2025-03-22 21:32:03.786824 | orchestrator | 21:32:03.785 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.786829 | orchestrator | 21:32:03.785 STDOUT terraform:  + security_group_ids = (known after apply) 2025-03-22 21:32:03.786833 | orchestrator | 21:32:03.785 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-22 21:32:03.786838 | orchestrator | 21:32:03.785 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.786843 | orchestrator | 21:32:03.785 STDOUT terraform:  + ip_address = "192.168.112.0/20" 2025-03-22 21:32:03.786854 | orchestrator | 21:32:03.785 STDOUT terraform:  } 2025-03-22 21:32:03.786859 | orchestrator | 21:32:03.785 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.786864 | orchestrator | 21:32:03.785 STDOUT terraform:  + ip_address = "192.168.16.254/20" 2025-03-22 21:32:03.786869 | orchestrator | 21:32:03.785 STDOUT terraform:  } 2025-03-22 21:32:03.786874 | orchestrator | 21:32:03.785 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.786878 | orchestrator | 21:32:03.785 STDOUT terraform:  + ip_address = "192.168.16.8/20" 2025-03-22 21:32:03.786883 | orchestrator | 21:32:03.785 STDOUT terraform:  } 2025-03-22 21:32:03.786888 | orchestrator | 21:32:03.785 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.786893 | orchestrator | 21:32:03.785 STDOUT terraform:  + ip_address = "192.168.16.9/20" 2025-03-22 21:32:03.786898 | orchestrator | 21:32:03.785 STDOUT terraform:  } 2025-03-22 21:32:03.786903 | orchestrator | 21:32:03.785 STDOUT terraform:  + binding (known after apply) 2025-03-22 21:32:03.786907 | orchestrator | 21:32:03.785 STDOUT terraform:  + fixed_ip { 2025-03-22 21:32:03.786912 | orchestrator | 21:32:03.785 STDOUT terraform:  + ip_address = "192.168.16.14" 2025-03-22 21:32:03.786917 | orchestrator | 21:32:03.785 STDOUT terraform:  + subnet_id = (known after apply) 2025-03-22 21:32:03.786922 | orchestrator | 21:32:03.785 STDOUT terraform:  } 2025-03-22 21:32:03.786926 | orchestrator | 21:32:03.785 STDOUT terraform: 
 } 2025-03-22 21:32:03.786931 | orchestrator | 21:32:03.785 STDOUT terraform:  # openstack_networking_port_v2.node_port_management[5] will be created 2025-03-22 21:32:03.786936 | orchestrator | 21:32:03.785 STDOUT terraform:  + resource "openstack_networking_port_v2" "node_port_management" { 2025-03-22 21:32:03.786944 | orchestrator | 21:32:03.785 STDOUT terraform:  + admin_state_up = (known after apply) 2025-03-22 21:32:03.786949 | orchestrator | 21:32:03.785 STDOUT terraform:  + all_fixed_ips = (known after apply) 2025-03-22 21:32:03.786955 | orchestrator | 21:32:03.785 STDOUT terraform:  + all_security_group_ids = (known after apply) 2025-03-22 21:32:03.786959 | orchestrator | 21:32:03.785 STDOUT terraform:  + all_tags = (known after apply) 2025-03-22 21:32:03.786964 | orchestrator | 21:32:03.785 STDOUT terraform:  + device_id = (known after apply) 2025-03-22 21:32:03.786969 | orchestrator | 21:32:03.785 STDOUT terraform:  + device_owner = (known after apply) 2025-03-22 21:32:03.786974 | orchestrator | 21:32:03.785 STDOUT terraform:  + dns_assignment = (known after apply) 2025-03-22 21:32:03.786979 | orchestrator | 21:32:03.785 STDOUT terraform:  + dns_name = (known after apply) 2025-03-22 21:32:03.786983 | orchestrator | 21:32:03.785 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.786988 | orchestrator | 21:32:03.785 STDOUT terraform:  + mac_address = (known after apply) 2025-03-22 21:32:03.786993 | orchestrator | 21:32:03.785 STDOUT terraform:  + network_id = (known after apply) 2025-03-22 21:32:03.786998 | orchestrator | 21:32:03.785 STDOUT terraform:  + port_security_enabled = (known after apply) 2025-03-22 21:32:03.787002 | orchestrator | 21:32:03.785 STDOUT terraform:  + qos_policy_id = (known after apply) 2025-03-22 21:32:03.787010 | orchestrator | 21:32:03.785 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.787016 | orchestrator | 21:32:03.785 STDOUT terraform:  + security_group_ids = (known after apply) 2025-03-22 21:32:03.787020 | orchestrator | 21:32:03.785 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-22 21:32:03.787025 | orchestrator | 21:32:03.785 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.787032 | orchestrator | 21:32:03.785 STDOUT terraform:  + ip_address = "192.168.112.0/20" 2025-03-22 21:32:03.787037 | orchestrator | 21:32:03.786 STDOUT terraform:  } 2025-03-22 21:32:03.787042 | orchestrator | 21:32:03.786 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.787047 | orchestrator | 21:32:03.786 STDOUT terraform:  + ip_address = "192.168.16.254/20" 2025-03-22 21:32:03.787052 | orchestrator | 21:32:03.786 STDOUT terraform:  } 2025-03-22 21:32:03.787057 | orchestrator | 21:32:03.786 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.787062 | orchestrator | 21:32:03.786 STDOUT terraform:  + ip_address = "192.168.16.8/20" 2025-03-22 21:32:03.787066 | orchestrator | 21:32:03.786 STDOUT terraform:  } 2025-03-22 21:32:03.787071 | orchestrator | 21:32:03.786 STDOUT terraform:  + allowed_address_pairs { 2025-03-22 21:32:03.787076 | orchestrator | 21:32:03.786 STDOUT terraform:  + ip_address = "192.168.16.9/20" 2025-03-22 21:32:03.787081 | orchestrator | 21:32:03.786 STDOUT terraform:  } 2025-03-22 21:32:03.787086 | orchestrator | 21:32:03.786 STDOUT terraform:  + binding (known after apply) 2025-03-22 21:32:03.787091 | orchestrator | 21:32:03.786 STDOUT terraform:  + fixed_ip { 2025-03-22 21:32:03.787096 | orchestrator | 21:32:03.786 STDOUT terraform:  + ip_address = 
"192.168.16.15" 2025-03-22 21:32:03.787103 | orchestrator | 21:32:03.786 STDOUT terraform:  + subnet_id = (known after apply) 2025-03-22 21:32:03.787108 | orchestrator | 21:32:03.786 STDOUT terraform:  } 2025-03-22 21:32:03.787114 | orchestrator | 21:32:03.786 STDOUT terraform:  } 2025-03-22 21:32:03.787119 | orchestrator | 21:32:03.786 STDOUT terraform:  # openstack_networking_router_interface_v2.router_interface will be created 2025-03-22 21:32:03.787126 | orchestrator | 21:32:03.786 STDOUT terraform:  + resource "openstack_networking_router_interface_v2" "router_interface" { 2025-03-22 21:32:03.787131 | orchestrator | 21:32:03.786 STDOUT terraform:  + force_destroy = false 2025-03-22 21:32:03.787136 | orchestrator | 21:32:03.786 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.787141 | orchestrator | 21:32:03.786 STDOUT terraform:  + port_id = (known after apply) 2025-03-22 21:32:03.787148 | orchestrator | 21:32:03.786 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.790046 | orchestrator | 21:32:03.786 STDOUT terraform:  + router_id = (known after apply) 2025-03-22 21:32:03.790074 | orchestrator | 21:32:03.786 STDOUT terraform:  + subnet_id = (known after apply) 2025-03-22 21:32:03.790081 | orchestrator | 21:32:03.786 STDOUT terraform:  } 2025-03-22 21:32:03.790087 | orchestrator | 21:32:03.786 STDOUT terraform:  # openstack_networking_router_v2.router will be created 2025-03-22 21:32:03.790099 | orchestrator | 21:32:03.786 STDOUT terraform:  + resource "openstack_networking_router_v2" "router" { 2025-03-22 21:32:03.790104 | orchestrator | 21:32:03.786 STDOUT terraform:  + admin_state_up = (known after apply) 2025-03-22 21:32:03.790109 | orchestrator | 21:32:03.786 STDOUT terraform:  + all_tags = (known after apply) 2025-03-22 21:32:03.790115 | orchestrator | 21:32:03.786 STDOUT terraform:  + availability_zone_hints = [ 2025-03-22 21:32:03.790120 | orchestrator | 21:32:03.786 STDOUT terraform:  + "nova", 2025-03-22 21:32:03.790125 | orchestrator | 21:32:03.786 STDOUT terraform:  ] 2025-03-22 21:32:03.790130 | orchestrator | 21:32:03.786 STDOUT terraform:  + distributed = (known after apply) 2025-03-22 21:32:03.790135 | orchestrator | 21:32:03.786 STDOUT terraform:  + enable_snat = (known after apply) 2025-03-22 21:32:03.790140 | orchestrator | 21:32:03.786 STDOUT terraform:  + external_network_id = "e6be7364-bfd8-4de7-8120-8f41c69a139a" 2025-03-22 21:32:03.790145 | orchestrator | 21:32:03.786 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.790150 | orchestrator | 21:32:03.786 STDOUT terraform:  + name = "testbed" 2025-03-22 21:32:03.790155 | orchestrator | 21:32:03.786 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.790160 | orchestrator | 21:32:03.786 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-22 21:32:03.790165 | orchestrator | 21:32:03.786 STDOUT terraform:  + external_fixed_ip (known after apply) 2025-03-22 21:32:03.790170 | orchestrator | 21:32:03.786 STDOUT terraform:  } 2025-03-22 21:32:03.790175 | orchestrator | 21:32:03.786 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_management_rule1 will be created 2025-03-22 21:32:03.790181 | orchestrator | 21:32:03.787 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule1" { 2025-03-22 21:32:03.790186 | orchestrator | 21:32:03.787 STDOUT terraform:  + description = "ssh" 2025-03-22 21:32:03.790191 | orchestrator | 21:32:03.787 STDOUT terraform:  + direction = "ingress" 
2025-03-22 21:32:03.790196 | orchestrator | 21:32:03.787 STDOUT terraform:  + ethertype = "IPv4" 2025-03-22 21:32:03.790200 | orchestrator | 21:32:03.787 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.790211 | orchestrator | 21:32:03.787 STDOUT terraform:  + port_range_max = 22 2025-03-22 21:32:03.790217 | orchestrator | 21:32:03.787 STDOUT terraform:  + port_range_min = 22 2025-03-22 21:32:03.790222 | orchestrator | 21:32:03.787 STDOUT terraform:  + protocol = "tcp" 2025-03-22 21:32:03.790226 | orchestrator | 21:32:03.787 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.790231 | orchestrator | 21:32:03.787 STDOUT terraform:  + remote_group_id = (known after apply) 2025-03-22 21:32:03.790236 | orchestrator | 21:32:03.787 STDOUT terraform:  + remote_ip_prefix = "0.0.0.0/0" 2025-03-22 21:32:03.790241 | orchestrator | 21:32:03.787 STDOUT terraform:  + security_group_id = (known after apply) 2025-03-22 21:32:03.790246 | orchestrator | 21:32:03.787 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-22 21:32:03.790251 | orchestrator | 21:32:03.787 STDOUT terraform:  } 2025-03-22 21:32:03.790260 | orchestrator | 21:32:03.787 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_management_rule2 will be created 2025-03-22 21:32:03.790265 | orchestrator | 21:32:03.787 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule2" { 2025-03-22 21:32:03.790269 | orchestrator | 21:32:03.787 STDOUT terraform:  + description = "wireguard" 2025-03-22 21:32:03.790274 | orchestrator | 21:32:03.787 STDOUT terraform:  + direction = "ingress" 2025-03-22 21:32:03.790280 | orchestrator | 21:32:03.787 STDOUT terraform:  + ethertype = "IPv4" 2025-03-22 21:32:03.790284 | orchestrator | 21:32:03.787 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.790289 | orchestrator | 21:32:03.787 STDOUT terraform:  + port_range_max = 51820 2025-03-22 21:32:03.790294 | orchestrator | 21:32:03.787 STDOUT terraform:  + port_range_min = 51820 2025-03-22 21:32:03.790299 | orchestrator | 21:32:03.787 STDOUT terraform:  + protocol = "udp" 2025-03-22 21:32:03.790304 | orchestrator | 21:32:03.787 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.790309 | orchestrator | 21:32:03.787 STDOUT terraform:  + remote_group_id = (known after apply) 2025-03-22 21:32:03.790313 | orchestrator | 21:32:03.787 STDOUT terraform:  + remote_ip_prefix = "0.0.0.0/0" 2025-03-22 21:32:03.790318 | orchestrator | 21:32:03.787 STDOUT terraform:  + security_group_id = (known after apply) 2025-03-22 21:32:03.790323 | orchestrator | 21:32:03.787 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-22 21:32:03.790353 | orchestrator | 21:32:03.787 STDOUT terraform:  } 2025-03-22 21:32:03.790359 | orchestrator | 21:32:03.787 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_management_rule3 will be created 2025-03-22 21:32:03.790364 | orchestrator | 21:32:03.787 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule3" { 2025-03-22 21:32:03.790369 | orchestrator | 21:32:03.787 STDOUT terraform:  + direction = "ingress" 2025-03-22 21:32:03.790373 | orchestrator | 21:32:03.787 STDOUT terraform:  + ethertype = "IPv4" 2025-03-22 21:32:03.790378 | orchestrator | 21:32:03.787 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.790383 | orchestrator | 21:32:03.787 STDOUT terraform:  + protocol = "tcp" 2025-03-22 21:32:03.790388 | 
orchestrator | 21:32:03.787 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.790393 | orchestrator | 21:32:03.787 STDOUT terraform:  + remote_group_id = (known after apply) 2025-03-22 21:32:03.790398 | orchestrator | 21:32:03.787 STDOUT terraform:  + remote_ip_prefix = "192.168.16.0/20" 2025-03-22 21:32:03.790403 | orchestrator | 21:32:03.787 STDOUT terraform:  + security_group_id = (known after apply) 2025-03-22 21:32:03.790408 | orchestrator | 21:32:03.787 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-22 21:32:03.790413 | orchestrator | 21:32:03.787 STDOUT terraform:  } 2025-03-22 21:32:03.790418 | orchestrator | 21:32:03.788 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_management_rule4 will be created 2025-03-22 21:32:03.790428 | orchestrator | 21:32:03.788 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule4" { 2025-03-22 21:32:03.790437 | orchestrator | 21:32:03.788 STDOUT terraform:  + direction = "ingress" 2025-03-22 21:32:03.790442 | orchestrator | 21:32:03.788 STDOUT terraform:  + ethertype = "IPv4" 2025-03-22 21:32:03.790446 | orchestrator | 21:32:03.788 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.790451 | orchestrator | 21:32:03.788 STDOUT terraform:  + protocol = "udp" 2025-03-22 21:32:03.790462 | orchestrator | 21:32:03.788 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.790467 | orchestrator | 21:32:03.788 STDOUT terraform:  + remote_group_id = (known after apply) 2025-03-22 21:32:03.790474 | orchestrator | 21:32:03.788 STDOUT terraform:  + remote_ip_prefix = "192.168.16.0/20" 2025-03-22 21:32:03.790479 | orchestrator | 21:32:03.788 STDOUT terraform:  + security_group_id = (known after apply) 2025-03-22 21:32:03.790484 | orchestrator | 21:32:03.788 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-22 21:32:03.790489 | orchestrator | 21:32:03.788 STDOUT terraform:  } 2025-03-22 21:32:03.790494 | orchestrator | 21:32:03.788 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_management_rule5 will be created 2025-03-22 21:32:03.790498 | orchestrator | 21:32:03.788 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule5" { 2025-03-22 21:32:03.790506 | orchestrator | 21:32:03.788 STDOUT terraform:  + direction = "ingress" 2025-03-22 21:32:03.790511 | orchestrator | 21:32:03.788 STDOUT terraform:  + ethertype = "IPv4" 2025-03-22 21:32:03.790516 | orchestrator | 21:32:03.788 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.790521 | orchestrator | 21:32:03.788 STDOUT terraform:  + protocol = "icmp" 2025-03-22 21:32:03.790526 | orchestrator | 21:32:03.788 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.790530 | orchestrator | 21:32:03.788 STDOUT terraform:  + remote_group_id = (known after apply) 2025-03-22 21:32:03.790535 | orchestrator | 21:32:03.788 STDOUT terraform:  + remote_ip_prefix = "0.0.0.0/0" 2025-03-22 21:32:03.790540 | orchestrator | 21:32:03.788 STDOUT terraform:  + security_group_id = (known after apply) 2025-03-22 21:32:03.790545 | orchestrator | 21:32:03.788 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-22 21:32:03.790550 | orchestrator | 21:32:03.788 STDOUT terraform:  } 2025-03-22 21:32:03.790555 | orchestrator | 21:32:03.788 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_node_rule1 will be created 2025-03-22 21:32:03.790560 | orchestrator | 
21:32:03.788 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_node_rule1" { 2025-03-22 21:32:03.790565 | orchestrator | 21:32:03.788 STDOUT terraform:  + direction = "ingress" 2025-03-22 21:32:03.790571 | orchestrator | 21:32:03.788 STDOUT terraform:  + ethertype = "IPv4" 2025-03-22 21:32:03.790576 | orchestrator | 21:32:03.788 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.790581 | orchestrator | 21:32:03.788 STDOUT terraform:  + protocol = "tcp" 2025-03-22 21:32:03.790586 | orchestrator | 21:32:03.788 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.790593 | orchestrator | 21:32:03.788 STDOUT terraform:  + remote_group_id = (known after apply) 2025-03-22 21:32:03.790599 | orchestrator | 21:32:03.788 STDOUT terraform:  + remote_ip_prefix = "0.0.0.0/0" 2025-03-22 21:32:03.790603 | orchestrator | 21:32:03.788 STDOUT terraform:  + security_group_id = (known after apply) 2025-03-22 21:32:03.790608 | orchestrator | 21:32:03.788 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-22 21:32:03.790613 | orchestrator | 21:32:03.788 STDOUT terraform:  } 2025-03-22 21:32:03.790618 | orchestrator | 21:32:03.788 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_node_rule2 will be created 2025-03-22 21:32:03.790626 | orchestrator | 21:32:03.788 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_node_rule2" { 2025-03-22 21:32:03.790631 | orchestrator | 21:32:03.789 STDOUT terraform:  + direction = "ingress" 2025-03-22 21:32:03.790636 | orchestrator | 21:32:03.789 STDOUT terraform:  + ethertype = "IPv4" 2025-03-22 21:32:03.790641 | orchestrator | 21:32:03.789 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.790646 | orchestrator | 21:32:03.789 STDOUT terraform:  + protocol = "udp" 2025-03-22 21:32:03.790650 | orchestrator | 21:32:03.789 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.790655 | orchestrator | 21:32:03.789 STDOUT terraform:  + remote_group_id = (known after apply) 2025-03-22 21:32:03.790660 | orchestrator | 21:32:03.789 STDOUT terraform:  + remote_ip_prefix = "0.0.0.0/0" 2025-03-22 21:32:03.790665 | orchestrator | 21:32:03.789 STDOUT terraform:  + security_group_id = (known after apply) 2025-03-22 21:32:03.790670 | orchestrator | 21:32:03.789 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-22 21:32:03.790675 | orchestrator | 21:32:03.789 STDOUT terraform:  } 2025-03-22 21:32:03.790680 | orchestrator | 21:32:03.789 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_node_rule3 will be created 2025-03-22 21:32:03.790685 | orchestrator | 21:32:03.789 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_node_rule3" { 2025-03-22 21:32:03.790690 | orchestrator | 21:32:03.789 STDOUT terraform:  + direction = "ingress" 2025-03-22 21:32:03.790694 | orchestrator | 21:32:03.789 STDOUT terraform:  + ethertype = "IPv4" 2025-03-22 21:32:03.790699 | orchestrator | 21:32:03.789 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.790704 | orchestrator | 21:32:03.789 STDOUT terraform:  + protocol = "icmp" 2025-03-22 21:32:03.790709 | orchestrator | 21:32:03.789 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.790714 | orchestrator | 21:32:03.789 STDOUT terraform:  + remote_group_id = (known after apply) 2025-03-22 21:32:03.790719 | orchestrator | 21:32:03.789 STDOUT terraform:  + remote_ip_prefix = "0.0.0.0/0" 2025-03-22 
21:32:03.790723 | orchestrator | 21:32:03.789 STDOUT terraform:  + security_group_id = (known after apply) 2025-03-22 21:32:03.790728 | orchestrator | 21:32:03.789 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-22 21:32:03.790733 | orchestrator | 21:32:03.789 STDOUT terraform:  } 2025-03-22 21:32:03.790738 | orchestrator | 21:32:03.789 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_rule_vrrp will be created 2025-03-22 21:32:03.790746 | orchestrator | 21:32:03.789 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_rule_vrrp" { 2025-03-22 21:32:03.790751 | orchestrator | 21:32:03.789 STDOUT terraform:  + description = "vrrp" 2025-03-22 21:32:03.790756 | orchestrator | 21:32:03.789 STDOUT terraform:  + direction = "ingress" 2025-03-22 21:32:03.790761 | orchestrator | 21:32:03.789 STDOUT terraform:  + ethertype = "IPv4" 2025-03-22 21:32:03.790765 | orchestrator | 21:32:03.789 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.790770 | orchestrator | 21:32:03.789 STDOUT terraform:  + protocol = "112" 2025-03-22 21:32:03.790775 | orchestrator | 21:32:03.789 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.790782 | orchestrator | 21:32:03.789 STDOUT terraform:  + remote_group_id = (known after apply) 2025-03-22 21:32:03.790787 | orchestrator | 21:32:03.789 STDOUT terraform:  + remote_ip_prefix = "0.0.0.0/0" 2025-03-22 21:32:03.790791 | orchestrator | 21:32:03.789 STDOUT terraform:  + security_group_id = (known after apply) 2025-03-22 21:32:03.790796 | orchestrator | 21:32:03.789 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-22 21:32:03.790801 | orchestrator | 21:32:03.789 STDOUT terraform:  } 2025-03-22 21:32:03.790806 | orchestrator | 21:32:03.789 STDOUT terraform:  # openstack_networking_secgroup_v2.security_group_management will be created 2025-03-22 21:32:03.790811 | orchestrator | 21:32:03.789 STDOUT terraform:  + resource "openstack_networking_secgroup_v2" "security_group_management" { 2025-03-22 21:32:03.790818 | orchestrator | 21:32:03.789 STDOUT terraform:  + all_tags = (known after apply) 2025-03-22 21:32:03.800058 | orchestrator | 21:32:03.789 STDOUT terraform:  + description = "management security group" 2025-03-22 21:32:03.800091 | orchestrator | 21:32:03.790 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.800097 | orchestrator | 21:32:03.790 STDOUT terraform:  + name = "testbed-management" 2025-03-22 21:32:03.800103 | orchestrator | 21:32:03.790 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.800108 | orchestrator | 21:32:03.790 STDOUT terraform:  + stateful = (known after apply) 2025-03-22 21:32:03.800113 | orchestrator | 21:32:03.790 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-22 21:32:03.800118 | orchestrator | 21:32:03.790 STDOUT terraform:  } 2025-03-22 21:32:03.800129 | orchestrator | 21:32:03.790 STDOUT terraform:  # openstack_networking_secgroup_v2.security_group_node will be created 2025-03-22 21:32:03.800135 | orchestrator | 21:32:03.790 STDOUT terraform:  + resource "openstack_networking_secgroup_v2" "security_group_node" { 2025-03-22 21:32:03.800140 | orchestrator | 21:32:03.790 STDOUT terraform:  + all_tags = (known after apply) 2025-03-22 21:32:03.800145 | orchestrator | 21:32:03.790 STDOUT terraform:  + description = "node security group" 2025-03-22 21:32:03.800150 | orchestrator | 21:32:03.790 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.800155 | orchestrator | 
21:32:03.790 STDOUT terraform:  + name = "testbed-node" 2025-03-22 21:32:03.800160 | orchestrator | 21:32:03.790 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.800173 | orchestrator | 21:32:03.790 STDOUT terraform:  + stateful = (known after apply) 2025-03-22 21:32:03.800178 | orchestrator | 21:32:03.790 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-22 21:32:03.800183 | orchestrator | 21:32:03.790 STDOUT terraform:  } 2025-03-22 21:32:03.800188 | orchestrator | 21:32:03.790 STDOUT terraform:  # openstack_networking_subnet_v2.subnet_management will be created 2025-03-22 21:32:03.800193 | orchestrator | 21:32:03.790 STDOUT terraform:  + resource "openstack_networking_subnet_v2" "subnet_management" { 2025-03-22 21:32:03.800198 | orchestrator | 21:32:03.790 STDOUT terraform:  + all_tags = (known after apply) 2025-03-22 21:32:03.800203 | orchestrator | 21:32:03.790 STDOUT terraform:  + cidr = "192.168.16.0/20" 2025-03-22 21:32:03.800208 | orchestrator | 21:32:03.790 STDOUT terraform:  + dns_nameservers = [ 2025-03-22 21:32:03.800214 | orchestrator | 21:32:03.790 STDOUT terraform:  + "8.8.8.8", 2025-03-22 21:32:03.800218 | orchestrator | 21:32:03.790 STDOUT terraform:  + "9.9.9.9", 2025-03-22 21:32:03.800223 | orchestrator | 21:32:03.790 STDOUT terraform:  ] 2025-03-22 21:32:03.800228 | orchestrator | 21:32:03.790 STDOUT terraform:  + enable_dhcp = true 2025-03-22 21:32:03.800233 | orchestrator | 21:32:03.790 STDOUT terraform:  + gateway_ip = (known after apply) 2025-03-22 21:32:03.800245 | orchestrator | 21:32:03.790 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.944249 | orchestrator | 21:32:03.795 STDOUT terraform:  + ip_version = 4 2025-03-22 21:32:03.944335 | orchestrator | 21:32:03.795 STDOUT terraform:  + ipv6_address_mode = (known after apply) 2025-03-22 21:32:03.944344 | orchestrator | 21:32:03.795 STDOUT terraform:  + ipv6_ra_mode = (known after apply) 2025-03-22 21:32:03.944350 | orchestrator | 21:32:03.795 STDOUT terraform:  + name = "subnet-testbed-management" 2025-03-22 21:32:03.944356 | orchestrator | 21:32:03.795 STDOUT terraform:  + network_id = (known after apply) 2025-03-22 21:32:03.944361 | orchestrator | 21:32:03.795 STDOUT terraform:  + no_gateway = false 2025-03-22 21:32:03.944367 | orchestrator | 21:32:03.795 STDOUT terraform:  + region = (known after apply) 2025-03-22 21:32:03.944372 | orchestrator | 21:32:03.795 STDOUT terraform:  + service_types = (known after apply) 2025-03-22 21:32:03.944377 | orchestrator | 21:32:03.795 STDOUT terraform:  + tenant_id = (known after apply) 2025-03-22 21:32:03.944382 | orchestrator | 21:32:03.795 STDOUT terraform:  + allocation_pool { 2025-03-22 21:32:03.944387 | orchestrator | 21:32:03.795 STDOUT terraform:  + end = "192.168.31.250" 2025-03-22 21:32:03.944392 | orchestrator | 21:32:03.795 STDOUT terraform:  + start = "192.168.31.200" 2025-03-22 21:32:03.944398 | orchestrator | 21:32:03.795 STDOUT terraform:  } 2025-03-22 21:32:03.944403 | orchestrator | 21:32:03.795 STDOUT terraform:  } 2025-03-22 21:32:03.944408 | orchestrator | 21:32:03.795 STDOUT terraform:  # terraform_data.image will be created 2025-03-22 21:32:03.944416 | orchestrator | 21:32:03.795 STDOUT terraform:  + resource "terraform_data" "image" { 2025-03-22 21:32:03.944421 | orchestrator | 21:32:03.795 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.944447 | orchestrator | 21:32:03.795 STDOUT terraform:  + input = "Ubuntu 24.04" 2025-03-22 21:32:03.944452 | orchestrator | 21:32:03.796 STDOUT 
terraform:  + output = (known after apply) 2025-03-22 21:32:03.944457 | orchestrator | 21:32:03.796 STDOUT terraform:  } 2025-03-22 21:32:03.944462 | orchestrator | 21:32:03.796 STDOUT terraform:  # terraform_data.image_node will be created 2025-03-22 21:32:03.944467 | orchestrator | 21:32:03.796 STDOUT terraform:  + resource "terraform_data" "image_node" { 2025-03-22 21:32:03.944472 | orchestrator | 21:32:03.796 STDOUT terraform:  + id = (known after apply) 2025-03-22 21:32:03.944476 | orchestrator | 21:32:03.796 STDOUT terraform:  + input = "Ubuntu 24.04" 2025-03-22 21:32:03.944481 | orchestrator | 21:32:03.796 STDOUT terraform:  + output = (known after apply) 2025-03-22 21:32:03.944488 | orchestrator | 21:32:03.796 STDOUT terraform:  } 2025-03-22 21:32:03.944493 | orchestrator | 21:32:03.796 STDOUT terraform: Plan: 82 to add, 0 to change, 0 to destroy. 2025-03-22 21:32:03.944499 | orchestrator | 21:32:03.796 STDOUT terraform: Changes to Outputs: 2025-03-22 21:32:03.944504 | orchestrator | 21:32:03.796 STDOUT terraform:  + manager_address = (sensitive value) 2025-03-22 21:32:03.944509 | orchestrator | 21:32:03.796 STDOUT terraform:  + private_key = (sensitive value) 2025-03-22 21:32:03.944522 | orchestrator | 21:32:03.944 STDOUT terraform: terraform_data.image_node: Creating... 2025-03-22 21:32:03.944669 | orchestrator | 21:32:03.944 STDOUT terraform: terraform_data.image: Creating... 2025-03-22 21:32:03.944829 | orchestrator | 21:32:03.944 STDOUT terraform: terraform_data.image_node: Creation complete after 0s [id=c395724f-63f8-6cb9-62de-9054810392f9] 2025-03-22 21:32:03.944842 | orchestrator | 21:32:03.944 STDOUT terraform: terraform_data.image: Creation complete after 0s [id=acfa16bc-2e7f-3e7a-9ac1-b69fc53e3460] 2025-03-22 21:32:03.958876 | orchestrator | 21:32:03.958 STDOUT terraform: data.openstack_images_image_v2.image: Reading... 2025-03-22 21:32:03.966230 | orchestrator | 21:32:03.966 STDOUT terraform: data.openstack_images_image_v2.image_node: Reading... 2025-03-22 21:32:03.975051 | orchestrator | 21:32:03.974 STDOUT terraform: openstack_networking_network_v2.net_management: Creating... 2025-03-22 21:32:03.976305 | orchestrator | 21:32:03.976 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[0]: Creating... 2025-03-22 21:32:03.980543 | orchestrator | 21:32:03.978 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[6]: Creating... 2025-03-22 21:32:03.983010 | orchestrator | 21:32:03.981 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[17]: Creating... 2025-03-22 21:32:03.990592 | orchestrator | 21:32:03.982 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[15]: Creating... 2025-03-22 21:32:03.990646 | orchestrator | 21:32:03.990 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[4]: Creating... 2025-03-22 21:32:04.002524 | orchestrator | 21:32:04.002 STDOUT terraform: openstack_compute_keypair_v2.key: Creating... 2025-03-22 21:32:04.002696 | orchestrator | 21:32:04.002 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[2]: Creating... 2025-03-22 21:32:04.408611 | orchestrator | 21:32:04.408 STDOUT terraform: data.openstack_images_image_v2.image: Read complete after 0s [id=cd9ae1ce-c4eb-4380-9087-2aa040df6990] 2025-03-22 21:32:04.412751 | orchestrator | 21:32:04.412 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[8]: Creating... 
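The apply phase above begins by resolving the "Ubuntu 24.04" image name through the two terraform_data passthrough resources and the Glance image data sources. A minimal HCL sketch of that lookup, assuming the data source simply reads the terraform_data output (the variable name and the most_recent flag are assumptions; only the image name and the resource addresses appear in the log):

# Sketch only: image name passthrough and Glance lookup as suggested by the plan.
variable "image" {
  type    = string
  default = "Ubuntu 24.04"             # matches input = "Ubuntu 24.04" in the plan
}

resource "terraform_data" "image" {
  input = var.image                    # output equals input once applied
}

data "openstack_images_image_v2" "image" {
  name        = terraform_data.image.output
  most_recent = true                   # assumption, not visible in the log
}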
2025-03-22 21:32:04.428729 | orchestrator | 21:32:04.428 STDOUT terraform: data.openstack_images_image_v2.image_node: Read complete after 0s [id=cd9ae1ce-c4eb-4380-9087-2aa040df6990] 2025-03-22 21:32:04.434670 | orchestrator | 21:32:04.434 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[13]: Creating... 2025-03-22 21:32:04.668642 | orchestrator | 21:32:04.668 STDOUT terraform: openstack_compute_keypair_v2.key: Creation complete after 1s [id=testbed] 2025-03-22 21:32:04.679179 | orchestrator | 21:32:04.679 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[11]: Creating... 2025-03-22 21:32:09.983929 | orchestrator | 21:32:09.983 STDOUT terraform: openstack_networking_network_v2.net_management: Creation complete after 6s [id=8739686f-ff62-4fbc-9de1-739cd27a757e] 2025-03-22 21:32:09.991434 | orchestrator | 21:32:09.991 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[14]: Creating... 2025-03-22 21:32:13.978004 | orchestrator | 21:32:13.977 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[0]: Still creating... [10s elapsed] 2025-03-22 21:32:13.979029 | orchestrator | 21:32:13.978 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[6]: Still creating... [10s elapsed] 2025-03-22 21:32:13.984531 | orchestrator | 21:32:13.984 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[17]: Still creating... [10s elapsed] 2025-03-22 21:32:13.984654 | orchestrator | 21:32:13.984 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[15]: Still creating... [10s elapsed] 2025-03-22 21:32:13.991711 | orchestrator | 21:32:13.991 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[4]: Still creating... [10s elapsed] 2025-03-22 21:32:14.004016 | orchestrator | 21:32:14.003 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[2]: Still creating... [10s elapsed] 2025-03-22 21:32:14.413884 | orchestrator | 21:32:14.413 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[8]: Still creating... [10s elapsed] 2025-03-22 21:32:14.435145 | orchestrator | 21:32:14.434 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[13]: Still creating... [10s elapsed] 2025-03-22 21:32:14.575453 | orchestrator | 21:32:14.575 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[0]: Creation complete after 11s [id=15af2a05-72b4-4b27-81ae-caefd85e3c33] 2025-03-22 21:32:14.583518 | orchestrator | 21:32:14.583 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[9]: Creating... 2025-03-22 21:32:14.597153 | orchestrator | 21:32:14.596 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[17]: Creation complete after 11s [id=0f58cf45-cc6c-41c9-84ae-96e36ead1340] 2025-03-22 21:32:14.604569 | orchestrator | 21:32:14.604 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[7]: Creating... 2025-03-22 21:32:14.629116 | orchestrator | 21:32:14.628 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[15]: Creation complete after 11s [id=961657b8-7922-4be7-b7ea-8a6546d88057] 2025-03-22 21:32:14.633898 | orchestrator | 21:32:14.633 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[4]: Creation complete after 11s [id=65c63fa5-5e6f-4c5d-b367-79c528cb404f] 2025-03-22 21:32:14.640779 | orchestrator | 21:32:14.640 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[16]: Creating... 
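Eighteen node data volumes (node_volume[0] through node_volume[17]) are created in parallel here. A hedged sketch of what that resource plausibly looks like; the count is taken from the indices in the log, while the name pattern and size are placeholders:

# Sketch only: per-node extra volumes, indices 0..17 as seen in the log.
variable "volume_size" {
  type    = number
  default = 20                                   # placeholder; the real size is not shown in this excerpt
}

resource "openstack_blockstorage_volume_v3" "node_volume" {
  count = 18
  name  = "testbed-node-volume-${count.index}"   # naming is an assumption
  size  = var.volume_size
}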
2025-03-22 21:32:14.641789 | orchestrator | 21:32:14.641 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[2]: Creation complete after 11s [id=a55e76ec-664a-47dd-9adc-a4537455a8c3] 2025-03-22 21:32:14.646388 | orchestrator | 21:32:14.645 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[3]: Creating... 2025-03-22 21:32:14.647311 | orchestrator | 21:32:14.647 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[12]: Creating... 2025-03-22 21:32:14.668391 | orchestrator | 21:32:14.668 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[6]: Creation complete after 11s [id=3a7b81a7-d924-4fb7-939f-2fb7c18bb110] 2025-03-22 21:32:14.671523 | orchestrator | 21:32:14.671 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[5]: Creating... 2025-03-22 21:32:14.680293 | orchestrator | 21:32:14.680 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[11]: Still creating... [10s elapsed] 2025-03-22 21:32:14.701051 | orchestrator | 21:32:14.700 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[8]: Creation complete after 11s [id=1ccc95e9-18f2-4ccc-b703-599850fcc056] 2025-03-22 21:32:14.706835 | orchestrator | 21:32:14.706 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[1]: Creating... 2025-03-22 21:32:14.736526 | orchestrator | 21:32:14.736 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[13]: Creation complete after 11s [id=623f22ba-8f30-45a0-86a9-9fa0a67da68a] 2025-03-22 21:32:14.741981 | orchestrator | 21:32:14.741 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[10]: Creating... 2025-03-22 21:32:14.974004 | orchestrator | 21:32:14.973 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[11]: Creation complete after 10s [id=681c10dc-f1f8-4703-92fb-54cdfa604000] 2025-03-22 21:32:14.983648 | orchestrator | 21:32:14.983 STDOUT terraform: openstack_blockstorage_volume_v3.manager_base_volume[0]: Creating... 2025-03-22 21:32:19.994627 | orchestrator | 21:32:19.994 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[14]: Still creating... [10s elapsed] 2025-03-22 21:32:20.198288 | orchestrator | 21:32:20.197 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[14]: Creation complete after 10s [id=b44243bb-e871-4c49-9391-421f3d942d35] 2025-03-22 21:32:20.205591 | orchestrator | 21:32:20.205 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[2]: Creating... 2025-03-22 21:32:24.584420 | orchestrator | 21:32:24.584 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[9]: Still creating... [10s elapsed] 2025-03-22 21:32:24.605601 | orchestrator | 21:32:24.605 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[7]: Still creating... [10s elapsed] 2025-03-22 21:32:24.704818 | orchestrator | 21:32:24.641 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[16]: Still creating... [10s elapsed] 2025-03-22 21:32:24.708202 | orchestrator | 21:32:24.646 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[3]: Still creating... [10s elapsed] 2025-03-22 21:32:24.708242 | orchestrator | 21:32:24.649 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[12]: Still creating... [10s elapsed] 2025-03-22 21:32:24.708257 | orchestrator | 21:32:24.672 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[5]: Still creating... [10s elapsed] 2025-03-22 21:32:24.708278 | orchestrator | 21:32:24.708 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[1]: Still creating... 
[10s elapsed] 2025-03-22 21:32:24.743523 | orchestrator | 21:32:24.743 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[10]: Still creating... [10s elapsed] 2025-03-22 21:32:24.827416 | orchestrator | 21:32:24.827 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[7]: Creation complete after 10s [id=57715f6b-5cd5-487b-a330-51d69b0efe09] 2025-03-22 21:32:24.985527 | orchestrator | 21:32:24.985 STDOUT terraform: openstack_blockstorage_volume_v3.manager_base_volume[0]: Still creating... [10s elapsed] 2025-03-22 21:32:25.226894 | orchestrator | 21:32:25.226 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[9]: Creation complete after 10s [id=ffd44099-c386-47c0-8dc0-30cf9a71e0b5] 2025-03-22 21:32:25.228556 | orchestrator | 21:32:25.228 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[3]: Creation complete after 10s [id=637f4fb2-465d-4aa6-a08d-716b8ef59fde] 2025-03-22 21:32:25.228655 | orchestrator | 21:32:25.228 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[12]: Creation complete after 10s [id=e7ab856b-7bea-4042-bcdb-2564fd7799a3] 2025-03-22 21:32:25.231588 | orchestrator | 21:32:25.230 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[10]: Creation complete after 10s [id=6454730d-d769-486b-8e2e-775b81470741] 2025-03-22 21:32:25.232421 | orchestrator | 21:32:25.230 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[5]: Creation complete after 10s [id=9647d6dd-7a49-4e0c-bf6a-16b92e91fe66] 2025-03-22 21:32:25.232511 | orchestrator | 21:32:25.231 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[1]: Creation complete after 10s [id=31a214af-07d5-4ac8-a8e1-81b91b42d6ba] 2025-03-22 21:32:25.234271 | orchestrator | 21:32:25.233 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[16]: Creation complete after 10s [id=d6bc4934-4c34-4893-bb71-8a867393eb36] 2025-03-22 21:32:25.243440 | orchestrator | 21:32:25.243 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[3]: Creating... 2025-03-22 21:32:25.250609 | orchestrator | 21:32:25.250 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[4]: Creating... 2025-03-22 21:32:25.251281 | orchestrator | 21:32:25.251 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[0]: Creating... 2025-03-22 21:32:25.253052 | orchestrator | 21:32:25.252 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[1]: Creating... 2025-03-22 21:32:25.267993 | orchestrator | 21:32:25.265 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[5]: Creating... 2025-03-22 21:32:25.277320 | orchestrator | 21:32:25.265 STDOUT terraform: openstack_networking_subnet_v2.subnet_management: Creating... 2025-03-22 21:32:25.277409 | orchestrator | 21:32:25.265 STDOUT terraform: local_sensitive_file.id_rsa: Creating... 2025-03-22 21:32:25.277432 | orchestrator | 21:32:25.277 STDOUT terraform: local_file.id_rsa_pub: Creating... 
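local_sensitive_file.id_rsa and local_file.id_rsa_pub write the deployment SSH keypair to disk. A sketch under the assumption that the key material is exported by the openstack_compute_keypair_v2.key resource created earlier (id=testbed in the log); the file paths are hypothetical:

resource "openstack_compute_keypair_v2" "key" {
  name = "testbed"                               # matches [id=testbed] above
  # no public_key given, so the provider generates the pair and exports it
}

resource "local_sensitive_file" "id_rsa" {
  content  = openstack_compute_keypair_v2.key.private_key
  filename = "${path.module}/.id_rsa.testbed"    # path is an assumption
}

resource "local_file" "id_rsa_pub" {
  content  = openstack_compute_keypair_v2.key.public_key
  filename = "${path.module}/.id_rsa.testbed.pub"
}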
2025-03-22 21:32:25.286526 | orchestrator | 21:32:25.286 STDOUT terraform: local_sensitive_file.id_rsa: Creation complete after 0s [id=4a319edd46628a0aa2d6d13ab9ab2ef0cd27b80d] 2025-03-22 21:32:25.286830 | orchestrator | 21:32:25.286 STDOUT terraform: local_file.id_rsa_pub: Creation complete after 0s [id=55fa2fd144c8ba52def64afd67c6093e55496cfe] 2025-03-22 21:32:25.389440 | orchestrator | 21:32:25.389 STDOUT terraform: openstack_blockstorage_volume_v3.manager_base_volume[0]: Creation complete after 10s [id=7e6891cd-4286-464f-b7ae-c733703cc022] 2025-03-22 21:32:30.207070 | orchestrator | 21:32:30.206 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[2]: Still creating... [10s elapsed] 2025-03-22 21:32:30.569308 | orchestrator | 21:32:30.569 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[2]: Creation complete after 11s [id=3d5567ec-4515-456e-abea-12b86672470c] 2025-03-22 21:32:31.080871 | orchestrator | 21:32:31.080 STDOUT terraform: openstack_networking_subnet_v2.subnet_management: Creation complete after 6s [id=50da14c9-20d9-4660-8c1a-b577fda1c4ac] 2025-03-22 21:32:31.088775 | orchestrator | 21:32:31.088 STDOUT terraform: openstack_networking_router_v2.router: Creating... 2025-03-22 21:32:35.247015 | orchestrator | 21:32:35.246 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[3]: Still creating... [10s elapsed] 2025-03-22 21:32:35.251724 | orchestrator | 21:32:35.251 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[0]: Still creating... [10s elapsed] 2025-03-22 21:32:35.253992 | orchestrator | 21:32:35.253 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[1]: Still creating... [10s elapsed] 2025-03-22 21:32:35.258369 | orchestrator | 21:32:35.258 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[4]: Still creating... [10s elapsed] 2025-03-22 21:32:35.265769 | orchestrator | 21:32:35.265 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[5]: Still creating... [10s elapsed] 2025-03-22 21:32:35.648047 | orchestrator | 21:32:35.647 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[3]: Creation complete after 11s [id=f0064d2e-b937-487e-87eb-c5dccc0148b3] 2025-03-22 21:32:35.705812 | orchestrator | 21:32:35.705 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[4]: Creation complete after 11s [id=d5017ffd-d0d5-431d-84a2-17c0a06b39b8] 2025-03-22 21:32:35.716530 | orchestrator | 21:32:35.716 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[0]: Creation complete after 11s [id=2497086c-bea7-41d7-87d9-2d5542cb5404] 2025-03-22 21:32:35.721679 | orchestrator | 21:32:35.721 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[1]: Creation complete after 11s [id=4834743c-886d-4eac-bb1c-63811c466f92] 2025-03-22 21:32:35.740896 | orchestrator | 21:32:35.740 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[5]: Creation complete after 11s [id=b23a3080-52c6-488b-8c5e-d7619a688699] 2025-03-22 21:32:38.684772 | orchestrator | 21:32:38.684 STDOUT terraform: openstack_networking_router_v2.router: Creation complete after 8s [id=b15686d6-a5aa-434c-9c0b-5b0275ec677b] 2025-03-22 21:32:38.695131 | orchestrator | 21:32:38.694 STDOUT terraform: openstack_networking_secgroup_v2.security_group_management: Creating... 2025-03-22 21:32:38.695915 | orchestrator | 21:32:38.695 STDOUT terraform: openstack_networking_secgroup_v2.security_group_node: Creating... 
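The management subnet, router, and router interface created above can be reconstructed almost verbatim from the plan output; only the cross-resource references (network_id, router_id, subnet_id) are assumed:

resource "openstack_networking_subnet_v2" "subnet_management" {
  name            = "subnet-testbed-management"
  network_id      = openstack_networking_network_v2.net_management.id   # defined elsewhere in the config
  cidr            = "192.168.16.0/20"
  ip_version      = 4
  enable_dhcp     = true
  dns_nameservers = ["8.8.8.8", "9.9.9.9"]

  allocation_pool {
    start = "192.168.31.200"
    end   = "192.168.31.250"
  }
}

resource "openstack_networking_router_v2" "router" {
  name                    = "testbed"
  external_network_id     = "e6be7364-bfd8-4de7-8120-8f41c69a139a"      # value shown in the plan
  availability_zone_hints = ["nova"]
}

resource "openstack_networking_router_interface_v2" "router_interface" {
  router_id = openstack_networking_router_v2.router.id
  subnet_id = openstack_networking_subnet_v2.subnet_management.id
}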
2025-03-22 21:32:38.696669 | orchestrator | 21:32:38.696 STDOUT terraform: openstack_networking_router_interface_v2.router_interface: Creating... 2025-03-22 21:32:38.812564 | orchestrator | 21:32:38.811 STDOUT terraform: openstack_networking_secgroup_v2.security_group_management: Creation complete after 0s [id=d8713ed5-f3b3-448f-a098-5502b48f24d1] 2025-03-22 21:32:38.819290 | orchestrator | 21:32:38.819 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule3: Creating... 2025-03-22 21:32:38.819752 | orchestrator | 21:32:38.819 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule5: Creating... 2025-03-22 21:32:38.822871 | orchestrator | 21:32:38.822 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule2: Creating... 2025-03-22 21:32:38.822963 | orchestrator | 21:32:38.822 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule4: Creating... 2025-03-22 21:32:38.823019 | orchestrator | 21:32:38.822 STDOUT terraform: openstack_networking_secgroup_v2.security_group_node: Creation complete after 0s [id=1b223cba-17fc-4f58-b04d-1aaf3682d73a] 2025-03-22 21:32:38.823441 | orchestrator | 21:32:38.823 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule1: Creating... 2025-03-22 21:32:38.837042 | orchestrator | 21:32:38.836 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_node_rule1: Creating... 2025-03-22 21:32:38.837518 | orchestrator | 21:32:38.837 STDOUT terraform: openstack_networking_port_v2.manager_port_management: Creating... 2025-03-22 21:32:38.839235 | orchestrator | 21:32:38.839 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_rule_vrrp: Creating... 2025-03-22 21:32:38.842007 | orchestrator | 21:32:38.841 STDOUT terraform: openstack_networking_port_v2.node_port_management[1]: Creating... 2025-03-22 21:32:38.998766 | orchestrator | 21:32:38.998 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_node_rule1: Creation complete after 0s [id=957b0c43-19b5-495b-9121-3d8b9b38af21] 2025-03-22 21:32:39.006633 | orchestrator | 21:32:39.006 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_node_rule3: Creating... 2025-03-22 21:32:39.052820 | orchestrator | 21:32:39.052 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule3: Creation complete after 0s [id=75cf988f-e375-48da-b086-165d53293826] 2025-03-22 21:32:39.058612 | orchestrator | 21:32:39.058 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_node_rule2: Creating... 2025-03-22 21:32:39.158101 | orchestrator | 21:32:39.157 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_rule_vrrp: Creation complete after 0s [id=c3303db9-9fa2-4062-b294-2437f487432e] 2025-03-22 21:32:39.172613 | orchestrator | 21:32:39.172 STDOUT terraform: openstack_networking_port_v2.node_port_management[3]: Creating... 2025-03-22 21:32:39.207982 | orchestrator | 21:32:39.207 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule5: Creation complete after 0s [id=80dfe9ff-079e-4548-a096-348d102077e5] 2025-03-22 21:32:39.220146 | orchestrator | 21:32:39.219 STDOUT terraform: openstack_networking_port_v2.node_port_management[4]: Creating... 
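The security groups and their ingress rules follow directly from the plan: SSH (tcp/22) and WireGuard (udp/51820) from anywhere, unrestricted tcp and udp from 192.168.16.0/20, ICMP, and VRRP (IP protocol 112). Two representative rules as a sketch; which group the VRRP rule belongs to is not visible in the log, so that reference is an assumption:

resource "openstack_networking_secgroup_v2" "security_group_management" {
  name        = "testbed-management"
  description = "management security group"
}

resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule1" {
  description       = "ssh"
  direction         = "ingress"
  ethertype         = "IPv4"
  protocol          = "tcp"
  port_range_min    = 22
  port_range_max    = 22
  remote_ip_prefix  = "0.0.0.0/0"
  security_group_id = openstack_networking_secgroup_v2.security_group_management.id
}

resource "openstack_networking_secgroup_rule_v2" "security_group_rule_vrrp" {
  description       = "vrrp"
  direction         = "ingress"
  ethertype         = "IPv4"
  protocol          = "112"                      # VRRP is IP protocol number 112
  remote_ip_prefix  = "0.0.0.0/0"
  security_group_id = openstack_networking_secgroup_v2.security_group_management.id   # assumed target group
}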
2025-03-22 21:32:39.317229 | orchestrator | 21:32:39.316 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_node_rule3: Creation complete after 0s [id=2b233922-a58d-43a3-87f5-6d3fbe259f64] 2025-03-22 21:32:39.329705 | orchestrator | 21:32:39.329 STDOUT terraform: openstack_networking_port_v2.node_port_management[5]: Creating... 2025-03-22 21:32:39.404442 | orchestrator | 21:32:39.404 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule2: Creation complete after 0s [id=07444011-81fc-4704-9925-ce79963b5fea] 2025-03-22 21:32:39.418125 | orchestrator | 21:32:39.417 STDOUT terraform: openstack_networking_port_v2.node_port_management[2]: Creating... 2025-03-22 21:32:39.473454 | orchestrator | 21:32:39.472 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_node_rule2: Creation complete after 0s [id=a619db98-9370-48ca-9de0-9ab87867e921] 2025-03-22 21:32:39.486650 | orchestrator | 21:32:39.486 STDOUT terraform: openstack_networking_port_v2.node_port_management[0]: Creating... 2025-03-22 21:32:39.536682 | orchestrator | 21:32:39.536 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule1: Creation complete after 1s [id=1269bc90-8b17-442f-8e55-4404c4fa4f2e] 2025-03-22 21:32:39.648109 | orchestrator | 21:32:39.647 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule4: Creation complete after 1s [id=2bafd477-f5da-412d-8157-a4103a604c66] 2025-03-22 21:32:44.633167 | orchestrator | 21:32:44.632 STDOUT terraform: openstack_networking_port_v2.manager_port_management: Creation complete after 6s [id=dea9137d-e420-47df-b702-87c71c47428d] 2025-03-22 21:32:44.968733 | orchestrator | 21:32:44.968 STDOUT terraform: openstack_networking_port_v2.node_port_management[4]: Creation complete after 6s [id=5bbd6255-0712-4443-b855-56173551c8a6] 2025-03-22 21:32:45.054749 | orchestrator | 21:32:45.054 STDOUT terraform: openstack_networking_port_v2.node_port_management[5]: Creation complete after 6s [id=2aab26da-cbfa-478c-b70e-9d26083f2a32] 2025-03-22 21:32:45.166064 | orchestrator | 21:32:45.165 STDOUT terraform: openstack_networking_port_v2.node_port_management[2]: Creation complete after 6s [id=9a373260-f4a9-48a3-8e17-72a8fd156a46] 2025-03-22 21:32:45.279239 | orchestrator | 21:32:45.278 STDOUT terraform: openstack_networking_port_v2.node_port_management[3]: Creation complete after 6s [id=8288d4ef-f8ca-45e5-9861-b259c653c846] 2025-03-22 21:32:45.336095 | orchestrator | 21:32:45.335 STDOUT terraform: openstack_networking_port_v2.node_port_management[1]: Creation complete after 6s [id=008a2b55-616b-48c1-97c9-c2abcfcd2feb] 2025-03-22 21:32:45.442117 | orchestrator | 21:32:45.441 STDOUT terraform: openstack_networking_router_interface_v2.router_interface: Creation complete after 6s [id=10567004-35af-408c-93f4-19e5a91f848f] 2025-03-22 21:32:45.448356 | orchestrator | 21:32:45.448 STDOUT terraform: openstack_networking_floatingip_v2.manager_floating_ip: Creating... 2025-03-22 21:32:45.792468 | orchestrator | 21:32:45.792 STDOUT terraform: openstack_networking_port_v2.node_port_management[0]: Creation complete after 7s [id=e8504ac8-6108-487e-9fab-d053bc6b73f7] 2025-03-22 21:32:45.824692 | orchestrator | 21:32:45.824 STDOUT terraform: openstack_compute_instance_v2.node_server[1]: Creating... 2025-03-22 21:32:45.827879 | orchestrator | 21:32:45.827 STDOUT terraform: openstack_compute_instance_v2.node_server[4]: Creating... 
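Each of the six node management ports planned earlier carries a fixed IP in 192.168.16.10 through .15 plus the same four allowed_address_pairs entries. A sketch that reproduces those attributes; the count and the network and subnet wiring are assumptions:

resource "openstack_networking_port_v2" "node_port_management" {
  count      = 6                                                        # node_port_management[0..5]
  network_id = openstack_networking_network_v2.net_management.id        # wiring assumed

  fixed_ip {
    subnet_id  = openstack_networking_subnet_v2.subnet_management.id
    ip_address = "192.168.16.${10 + count.index}"                       # .10 through .15 as planned
  }

  # The same four pairs appear on every port in the plan output.
  dynamic "allowed_address_pairs" {
    for_each = ["192.168.112.0/20", "192.168.16.254/20", "192.168.16.8/20", "192.168.16.9/20"]
    content {
      ip_address = allowed_address_pairs.value
    }
  }
}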
2025-03-22 21:32:45.833216 | orchestrator | 21:32:45.833 STDOUT terraform: openstack_compute_instance_v2.node_server[5]: Creating... 2025-03-22 21:32:45.833838 | orchestrator | 21:32:45.833 STDOUT terraform: openstack_compute_instance_v2.node_server[2]: Creating... 2025-03-22 21:32:45.836155 | orchestrator | 21:32:45.835 STDOUT terraform: openstack_compute_instance_v2.node_server[3]: Creating... 2025-03-22 21:32:45.842353 | orchestrator | 21:32:45.842 STDOUT terraform: openstack_compute_instance_v2.node_server[0]: Creating... 2025-03-22 21:32:51.901421 | orchestrator | 21:32:51.901 STDOUT terraform: openstack_networking_floatingip_v2.manager_floating_ip: Creation complete after 7s [id=9b05d685-55c2-40eb-bc76-6d0e4f4d7a1b] 2025-03-22 21:32:51.914743 | orchestrator | 21:32:51.914 STDOUT terraform: openstack_networking_floatingip_associate_v2.manager_floating_ip_association: Creating... 2025-03-22 21:32:51.915156 | orchestrator | 21:32:51.915 STDOUT terraform: local_file.inventory: Creating... 2025-03-22 21:32:51.916205 | orchestrator | 21:32:51.916 STDOUT terraform: local_file.MANAGER_ADDRESS: Creating... 2025-03-22 21:32:51.918925 | orchestrator | 21:32:51.918 STDOUT terraform: local_file.inventory: Creation complete after 0s [id=89822f4adbda33a6eb642287127f3daf5038d4ec] 2025-03-22 21:32:51.920231 | orchestrator | 21:32:51.920 STDOUT terraform: local_file.MANAGER_ADDRESS: Creation complete after 0s [id=7de503adc88806ba5182cd4fbde093caa76cdfd1] 2025-03-22 21:32:52.509035 | orchestrator | 21:32:52.508 STDOUT terraform: openstack_networking_floatingip_associate_v2.manager_floating_ip_association: Creation complete after 1s [id=9b05d685-55c2-40eb-bc76-6d0e4f4d7a1b] 2025-03-22 21:32:55.825952 | orchestrator | 21:32:55.825 STDOUT terraform: openstack_compute_instance_v2.node_server[1]: Still creating... [10s elapsed] 2025-03-22 21:32:55.835007 | orchestrator | 21:32:55.834 STDOUT terraform: openstack_compute_instance_v2.node_server[2]: Still creating... [10s elapsed] 2025-03-22 21:32:55.835069 | orchestrator | 21:32:55.834 STDOUT terraform: openstack_compute_instance_v2.node_server[5]: Still creating... [10s elapsed] 2025-03-22 21:32:55.838327 | orchestrator | 21:32:55.838 STDOUT terraform: openstack_compute_instance_v2.node_server[4]: Still creating... [10s elapsed] 2025-03-22 21:32:55.843508 | orchestrator | 21:32:55.843 STDOUT terraform: openstack_compute_instance_v2.node_server[3]: Still creating... [10s elapsed] 2025-03-22 21:32:55.843648 | orchestrator | 21:32:55.843 STDOUT terraform: openstack_compute_instance_v2.node_server[0]: Still creating... [10s elapsed] 2025-03-22 21:33:05.826608 | orchestrator | 21:33:05.826 STDOUT terraform: openstack_compute_instance_v2.node_server[1]: Still creating... [20s elapsed] 2025-03-22 21:33:05.835605 | orchestrator | 21:33:05.835 STDOUT terraform: openstack_compute_instance_v2.node_server[5]: Still creating... [20s elapsed] 2025-03-22 21:33:05.835720 | orchestrator | 21:33:05.835 STDOUT terraform: openstack_compute_instance_v2.node_server[2]: Still creating... [20s elapsed] 2025-03-22 21:33:05.838795 | orchestrator | 21:33:05.838 STDOUT terraform: openstack_compute_instance_v2.node_server[4]: Still creating... [20s elapsed] 2025-03-22 21:33:05.844089 | orchestrator | 21:33:05.843 STDOUT terraform: openstack_compute_instance_v2.node_server[0]: Still creating... [20s elapsed] 2025-03-22 21:33:05.844211 | orchestrator | 21:33:05.844 STDOUT terraform: openstack_compute_instance_v2.node_server[3]: Still creating... 
[20s elapsed] 2025-03-22 21:33:06.425537 | orchestrator | 21:33:06.425 STDOUT terraform: openstack_compute_instance_v2.node_server[3]: Creation complete after 20s [id=05833788-530d-488f-86c4-46a1785844ea] 2025-03-22 21:33:07.338517 | orchestrator | 21:33:07.337 STDOUT terraform: openstack_compute_instance_v2.node_server[0]: Creation complete after 21s [id=0b5a2b55-7ee4-439e-a160-e29609461027] 2025-03-22 21:33:15.830007 | orchestrator | 21:33:15.829 STDOUT terraform: openstack_compute_instance_v2.node_server[1]: Still creating... [30s elapsed] 2025-03-22 21:33:15.836506 | orchestrator | 21:33:15.836 STDOUT terraform: openstack_compute_instance_v2.node_server[2]: Still creating... [30s elapsed] 2025-03-22 21:33:15.836638 | orchestrator | 21:33:15.836 STDOUT terraform: openstack_compute_instance_v2.node_server[5]: Still creating... [30s elapsed] 2025-03-22 21:33:15.839122 | orchestrator | 21:33:15.838 STDOUT terraform: openstack_compute_instance_v2.node_server[4]: Still creating... [30s elapsed] 2025-03-22 21:33:16.395169 | orchestrator | 21:33:16.394 STDOUT terraform: openstack_compute_instance_v2.node_server[5]: Creation complete after 30s [id=5d77a14a-54ec-402c-b409-1d16a8d54867] 2025-03-22 21:33:16.399564 | orchestrator | 21:33:16.399 STDOUT terraform: openstack_compute_instance_v2.node_server[2]: Creation complete after 30s [id=1644a613-4421-4861-8a5b-f9e3c9d7f9d1] 2025-03-22 21:33:16.409661 | orchestrator | 21:33:16.409 STDOUT terraform: openstack_compute_instance_v2.node_server[1]: Creation complete after 30s [id=83440160-7c90-420f-8331-aacbf5dc9ba6] 2025-03-22 21:33:16.455793 | orchestrator | 21:33:16.455 STDOUT terraform: openstack_compute_instance_v2.node_server[4]: Creation complete after 30s [id=3404138a-5aeb-444b-b6ca-c8ff3a9e6349] 2025-03-22 21:33:16.463770 | orchestrator | 21:33:16.463 STDOUT terraform: null_resource.node_semaphore: Creating... 2025-03-22 21:33:16.469745 | orchestrator | 21:33:16.469 STDOUT terraform: null_resource.node_semaphore: Creation complete after 0s [id=6392430373645277085] 2025-03-22 21:33:16.480215 | orchestrator | 21:33:16.480 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[8]: Creating... 2025-03-22 21:33:16.482677 | orchestrator | 21:33:16.482 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[14]: Creating... 2025-03-22 21:33:16.487334 | orchestrator | 21:33:16.487 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[3]: Creating... 2025-03-22 21:33:16.489206 | orchestrator | 21:33:16.489 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[12]: Creating... 2025-03-22 21:33:16.502193 | orchestrator | 21:33:16.501 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[5]: Creating... 2025-03-22 21:33:16.507235 | orchestrator | 21:33:16.502 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[9]: Creating... 2025-03-22 21:33:16.507293 | orchestrator | 21:33:16.507 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[15]: Creating... 2025-03-22 21:33:16.514539 | orchestrator | 21:33:16.514 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[7]: Creating... 2025-03-22 21:33:16.521849 | orchestrator | 21:33:16.521 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[13]: Creating... 2025-03-22 21:33:16.521891 | orchestrator | 21:33:16.521 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[0]: Creating... 
2025-03-22 21:33:21.943147 | orchestrator | 21:33:21.942 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[14]: Creation complete after 6s [id=1644a613-4421-4861-8a5b-f9e3c9d7f9d1/b44243bb-e871-4c49-9391-421f3d942d35] 2025-03-22 21:33:21.953612 | orchestrator | 21:33:21.953 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[5]: Creation complete after 6s [id=5d77a14a-54ec-402c-b409-1d16a8d54867/9647d6dd-7a49-4e0c-bf6a-16b92e91fe66] 2025-03-22 21:33:21.958773 | orchestrator | 21:33:21.958 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[17]: Creating... 2025-03-22 21:33:21.965979 | orchestrator | 21:33:21.965 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[4]: Creating... 2025-03-22 21:33:21.975171 | orchestrator | 21:33:21.974 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[9]: Creation complete after 5s [id=05833788-530d-488f-86c4-46a1785844ea/ffd44099-c386-47c0-8dc0-30cf9a71e0b5] 2025-03-22 21:33:21.977897 | orchestrator | 21:33:21.977 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[12]: Creation complete after 6s [id=0b5a2b55-7ee4-439e-a160-e29609461027/e7ab856b-7bea-4042-bcdb-2564fd7799a3] 2025-03-22 21:33:21.983816 | orchestrator | 21:33:21.983 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[11]: Creating... 2025-03-22 21:33:21.989438 | orchestrator | 21:33:21.989 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[6]: Creating... 2025-03-22 21:33:21.999231 | orchestrator | 21:33:21.999 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[8]: Creation complete after 6s [id=1644a613-4421-4861-8a5b-f9e3c9d7f9d1/1ccc95e9-18f2-4ccc-b703-599850fcc056] 2025-03-22 21:33:22.011013 | orchestrator | 21:33:22.010 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[13]: Creation complete after 5s [id=83440160-7c90-420f-8331-aacbf5dc9ba6/623f22ba-8f30-45a0-86a9-9fa0a67da68a] 2025-03-22 21:33:22.013949 | orchestrator | 21:33:22.013 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[2]: Creating... 2025-03-22 21:33:22.026913 | orchestrator | 21:33:22.026 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[10]: Creating... 2025-03-22 21:33:22.031308 | orchestrator | 21:33:22.031 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[15]: Creation complete after 5s [id=05833788-530d-488f-86c4-46a1785844ea/961657b8-7922-4be7-b7ea-8a6546d88057] 2025-03-22 21:33:22.037584 | orchestrator | 21:33:22.037 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[0]: Creation complete after 5s [id=0b5a2b55-7ee4-439e-a160-e29609461027/15af2a05-72b4-4b27-81ae-caefd85e3c33] 2025-03-22 21:33:22.038060 | orchestrator | 21:33:22.037 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[16]: Creating... 2025-03-22 21:33:22.047861 | orchestrator | 21:33:22.047 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[1]: Creating... 2025-03-22 21:33:22.049130 | orchestrator | 21:33:22.048 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[7]: Creation complete after 5s [id=83440160-7c90-420f-8331-aacbf5dc9ba6/57715f6b-5cd5-487b-a330-51d69b0efe09] 2025-03-22 21:33:22.064161 | orchestrator | 21:33:22.064 STDOUT terraform: openstack_compute_instance_v2.manager_server: Creating... 
2025-03-22 21:33:22.098803 | orchestrator | 21:33:22.098 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[3]: Creation complete after 6s [id=05833788-530d-488f-86c4-46a1785844ea/637f4fb2-465d-4aa6-a08d-716b8ef59fde] 2025-03-22 21:33:27.400129 | orchestrator | 21:33:27.399 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[4]: Creation complete after 5s [id=3404138a-5aeb-444b-b6ca-c8ff3a9e6349/65c63fa5-5e6f-4c5d-b367-79c528cb404f] 2025-03-22 21:33:27.429641 | orchestrator | 21:33:27.429 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[6]: Creation complete after 5s [id=0b5a2b55-7ee4-439e-a160-e29609461027/3a7b81a7-d924-4fb7-939f-2fb7c18bb110] 2025-03-22 21:33:27.431526 | orchestrator | 21:33:27.431 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[2]: Creation complete after 5s [id=1644a613-4421-4861-8a5b-f9e3c9d7f9d1/a55e76ec-664a-47dd-9adc-a4537455a8c3] 2025-03-22 21:33:27.448312 | orchestrator | 21:33:27.447 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[17]: Creation complete after 5s [id=5d77a14a-54ec-402c-b409-1d16a8d54867/0f58cf45-cc6c-41c9-84ae-96e36ead1340] 2025-03-22 21:33:27.467010 | orchestrator | 21:33:27.466 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[10]: Creation complete after 5s [id=3404138a-5aeb-444b-b6ca-c8ff3a9e6349/6454730d-d769-486b-8e2e-775b81470741] 2025-03-22 21:33:27.472000 | orchestrator | 21:33:27.471 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[11]: Creation complete after 5s [id=5d77a14a-54ec-402c-b409-1d16a8d54867/681c10dc-f1f8-4703-92fb-54cdfa604000] 2025-03-22 21:33:27.484648 | orchestrator | 21:33:27.484 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[16]: Creation complete after 5s [id=3404138a-5aeb-444b-b6ca-c8ff3a9e6349/d6bc4934-4c34-4893-bb71-8a867393eb36] 2025-03-22 21:33:27.496256 | orchestrator | 21:33:27.495 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[1]: Creation complete after 5s [id=83440160-7c90-420f-8331-aacbf5dc9ba6/31a214af-07d5-4ac8-a8e1-81b91b42d6ba] 2025-03-22 21:33:32.066565 | orchestrator | 21:33:32.066 STDOUT terraform: openstack_compute_instance_v2.manager_server: Still creating... [10s elapsed] 2025-03-22 21:33:42.071725 | orchestrator | 21:33:42.071 STDOUT terraform: openstack_compute_instance_v2.manager_server: Still creating... [20s elapsed] 2025-03-22 21:33:42.595537 | orchestrator | 21:33:42.595 STDOUT terraform: openstack_compute_instance_v2.manager_server: Creation complete after 21s [id=79fdb12d-c880-410d-a164-7b82a413a120] 2025-03-22 21:33:42.612473 | orchestrator | 21:33:42.612 STDOUT terraform: Apply complete! Resources: 82 added, 0 changed, 0 destroyed. 
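The apply converges here: 82 resources added, the last of them being the six node servers, the manager instance, and the 18 volume attachments (three data volumes per node). The job itself only relies on Terraform's own status output, but the result can be cross-checked by hand with the OpenStack CLI. A minimal sketch, assuming an authenticated CLI session against the same project, which is not something the job does:

# Sketch only; assumes OS_* credentials for the project used by the Terraform run above.
openstack server list -f value -c Name -c Status                          # manager plus six nodes, all ACTIVE
openstack volume list --status in-use -f value -c Name -c "Attached to"   # 18 attachments, three per node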
2025-03-22 21:33:42.612677 | orchestrator | 21:33:42.612 STDOUT terraform: Outputs: 2025-03-22 21:33:42.612846 | orchestrator | 21:33:42.612 STDOUT terraform: manager_address = 2025-03-22 21:33:42.621737 | orchestrator | 21:33:42.612 STDOUT terraform: private_key = 2025-03-22 21:33:52.760222 | orchestrator | changed 2025-03-22 21:33:52.797523 | 2025-03-22 21:33:52.797640 | TASK [Fetch manager address] 2025-03-22 21:33:53.197511 | orchestrator | ok 2025-03-22 21:33:53.210433 | 2025-03-22 21:33:53.210565 | TASK [Set manager_host address] 2025-03-22 21:33:53.310707 | orchestrator | ok 2025-03-22 21:33:53.326474 | 2025-03-22 21:33:53.326582 | LOOP [Update ansible collections] 2025-03-22 21:33:54.079259 | orchestrator | changed 2025-03-22 21:33:54.806351 | orchestrator | changed 2025-03-22 21:33:54.833868 | 2025-03-22 21:33:54.834014 | TASK [Wait up to 300 seconds for port 22 to become open and contain "OpenSSH"] 2025-03-22 21:34:05.337826 | orchestrator | ok 2025-03-22 21:34:05.352548 | 2025-03-22 21:34:05.352673 | TASK [Wait a little longer for the manager so that everything is ready] 2025-03-22 21:35:05.407175 | orchestrator | ok 2025-03-22 21:35:05.418275 | 2025-03-22 21:35:05.418426 | TASK [Fetch manager ssh hostkey] 2025-03-22 21:35:06.462763 | orchestrator | Output suppressed because no_log was given 2025-03-22 21:35:06.476275 | 2025-03-22 21:35:06.476507 | TASK [Get ssh keypair from terraform environment] 2025-03-22 21:35:07.052152 | orchestrator | changed 2025-03-22 21:35:07.065986 | 2025-03-22 21:35:07.066111 | TASK [Point out that the following task takes some time and does not give any output] 2025-03-22 21:35:07.108738 | orchestrator | ok: The task 'Run manager part 0' runs an Ansible playbook on the manager. There is no further output of this here. It takes a few minutes for this task to complete. 
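After the apply, the play reads the manager_address and private_key outputs (values suppressed above), waits up to 300 seconds for port 22 on the manager to present an OpenSSH banner, and then records the host key. Outside of Zuul, roughly the same wait can be scripted as below; this is a sketch that assumes the Terraform state of the run above is available locally and that netcat (nc) is installed:

# Rough shell analogue of the "wait for port 22 ... contain OpenSSH" task.
MANAGER_ADDRESS=$(terraform output -raw manager_address)    # output name as printed above
for i in $(seq 1 60); do                                    # up to ~300 s in 5 s steps
    echo | nc -w 5 "$MANAGER_ADDRESS" 22 | grep -q OpenSSH && break
    sleep 5
done
ssh-keyscan -H "$MANAGER_ADDRESS" >> ~/.ssh/known_hosts     # counterpart of "Fetch manager ssh hostkey"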
2025-03-22 21:35:07.120153 | 2025-03-22 21:35:07.120267 | TASK [Run manager part 0] 2025-03-22 21:35:07.949984 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.2 2025-03-22 21:35:07.990203 | orchestrator | 2025-03-22 21:35:09.826987 | orchestrator | PLAY [Wait for cloud-init to finish] ******************************************* 2025-03-22 21:35:09.827039 | orchestrator | 2025-03-22 21:35:09.827059 | orchestrator | TASK [Check /var/lib/cloud/instance/boot-finished] ***************************** 2025-03-22 21:35:09.827075 | orchestrator | ok: [testbed-manager] 2025-03-22 21:35:11.713955 | orchestrator | 2025-03-22 21:35:11.713997 | orchestrator | PLAY [Run manager part 0] ****************************************************** 2025-03-22 21:35:11.714007 | orchestrator | 2025-03-22 21:35:11.714032 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-03-22 21:35:11.714043 | orchestrator | ok: [testbed-manager] 2025-03-22 21:35:12.333025 | orchestrator | 2025-03-22 21:35:12.333093 | orchestrator | TASK [Get home directory of ansible user] ************************************** 2025-03-22 21:35:12.333122 | orchestrator | ok: [testbed-manager] 2025-03-22 21:35:12.373262 | orchestrator | 2025-03-22 21:35:12.373299 | orchestrator | TASK [Set repo_path fact] ****************************************************** 2025-03-22 21:35:12.373312 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:35:12.392399 | orchestrator | 2025-03-22 21:35:12.392427 | orchestrator | TASK [Update package cache] **************************************************** 2025-03-22 21:35:12.392437 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:35:12.409730 | orchestrator | 2025-03-22 21:35:12.409754 | orchestrator | TASK [Install required packages] *********************************************** 2025-03-22 21:35:12.409764 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:35:12.427010 | orchestrator | 2025-03-22 21:35:12.427033 | orchestrator | TASK [Remove some python packages] ********************************************* 2025-03-22 21:35:12.427044 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:35:12.444914 | orchestrator | 2025-03-22 21:35:12.444936 | orchestrator | TASK [Set venv_command fact (RedHat)] ****************************************** 2025-03-22 21:35:12.444946 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:35:12.464008 | orchestrator | 2025-03-22 21:35:12.464032 | orchestrator | TASK [Fail if Ubuntu version is lower than 22.04] ****************************** 2025-03-22 21:35:12.464042 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:35:12.483258 | orchestrator | 2025-03-22 21:35:12.483280 | orchestrator | TASK [Fail if Debian version is lower than 12] ********************************* 2025-03-22 21:35:12.483291 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:35:13.334110 | orchestrator | 2025-03-22 21:35:13.334145 | orchestrator | TASK [Set APT options on manager] ********************************************** 2025-03-22 21:35:13.334159 | orchestrator | changed: [testbed-manager] 2025-03-22 21:37:53.170926 | orchestrator | 2025-03-22 21:37:53.171042 | orchestrator | TASK [Update APT cache and run dist-upgrade] *********************************** 2025-03-22 21:37:53.171092 | orchestrator | changed: [testbed-manager] 2025-03-22 21:39:18.447744 | orchestrator | 2025-03-22 21:39:18.447804 | orchestrator | TASK [Install HWE kernel package on Ubuntu] 
************************************ 2025-03-22 21:39:18.447823 | orchestrator | changed: [testbed-manager] 2025-03-22 21:39:40.316700 | orchestrator | 2025-03-22 21:39:40.316816 | orchestrator | TASK [Install required packages] *********************************************** 2025-03-22 21:39:40.316854 | orchestrator | changed: [testbed-manager] 2025-03-22 21:39:50.038470 | orchestrator | 2025-03-22 21:39:50.038581 | orchestrator | TASK [Remove some python packages] ********************************************* 2025-03-22 21:39:50.038615 | orchestrator | changed: [testbed-manager] 2025-03-22 21:39:50.085612 | orchestrator | 2025-03-22 21:39:50.085688 | orchestrator | TASK [Set venv_command fact (Debian)] ****************************************** 2025-03-22 21:39:50.085725 | orchestrator | ok: [testbed-manager] 2025-03-22 21:39:50.897222 | orchestrator | 2025-03-22 21:39:50.897319 | orchestrator | TASK [Get current user] ******************************************************** 2025-03-22 21:39:50.897352 | orchestrator | ok: [testbed-manager] 2025-03-22 21:39:51.645455 | orchestrator | 2025-03-22 21:39:51.645537 | orchestrator | TASK [Create venv directory] *************************************************** 2025-03-22 21:39:51.645575 | orchestrator | changed: [testbed-manager] 2025-03-22 21:39:59.472081 | orchestrator | 2025-03-22 21:39:59.472194 | orchestrator | TASK [Install netaddr in venv] ************************************************* 2025-03-22 21:39:59.472232 | orchestrator | changed: [testbed-manager] 2025-03-22 21:40:06.514471 | orchestrator | 2025-03-22 21:40:06.514604 | orchestrator | TASK [Install ansible-core in venv] ******************************************** 2025-03-22 21:40:06.514661 | orchestrator | changed: [testbed-manager] 2025-03-22 21:40:09.641959 | orchestrator | 2025-03-22 21:40:09.642084 | orchestrator | TASK [Install requests >= 2.32.2] ********************************************** 2025-03-22 21:40:09.642116 | orchestrator | changed: [testbed-manager] 2025-03-22 21:40:11.653372 | orchestrator | 2025-03-22 21:40:11.653481 | orchestrator | TASK [Install docker >= 7.1.0] ************************************************* 2025-03-22 21:40:11.653514 | orchestrator | changed: [testbed-manager] 2025-03-22 21:40:12.834375 | orchestrator | 2025-03-22 21:40:12.834492 | orchestrator | TASK [Create directories in /opt/src] ****************************************** 2025-03-22 21:40:12.834529 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-commons) 2025-03-22 21:40:12.876285 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-services) 2025-03-22 21:40:12.876357 | orchestrator | 2025-03-22 21:40:12.876375 | orchestrator | TASK [Sync sources in /opt/src] ************************************************ 2025-03-22 21:40:12.876424 | orchestrator | [DEPRECATION WARNING]: The connection's stdin object is deprecated. Call 2025-03-22 21:40:16.031553 | orchestrator | display.prompt_until(msg) instead. This feature will be removed in version 2025-03-22 21:40:16.031603 | orchestrator | 2.19. Deprecation warnings can be disabled by setting 2025-03-22 21:40:16.031613 | orchestrator | deprecation_warnings=False in ansible.cfg. 
2025-03-22 21:40:16.031628 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-commons) 2025-03-22 21:40:16.601745 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-services) 2025-03-22 21:40:16.601839 | orchestrator | 2025-03-22 21:40:16.601859 | orchestrator | TASK [Create /usr/share/ansible directory] ************************************* 2025-03-22 21:40:16.601888 | orchestrator | changed: [testbed-manager] 2025-03-22 21:40:40.329784 | orchestrator | 2025-03-22 21:40:40.329839 | orchestrator | TASK [Install collections from Ansible galaxy] ********************************* 2025-03-22 21:40:40.329859 | orchestrator | changed: [testbed-manager] => (item=ansible.netcommon) 2025-03-22 21:40:42.950640 | orchestrator | changed: [testbed-manager] => (item=ansible.posix) 2025-03-22 21:40:42.950681 | orchestrator | changed: [testbed-manager] => (item=community.docker>=3.10.2) 2025-03-22 21:40:42.950688 | orchestrator | 2025-03-22 21:40:42.950695 | orchestrator | TASK [Install local collections] *********************************************** 2025-03-22 21:40:42.950707 | orchestrator | changed: [testbed-manager] => (item=ansible-collection-commons) 2025-03-22 21:40:44.405136 | orchestrator | changed: [testbed-manager] => (item=ansible-collection-services) 2025-03-22 21:40:44.405234 | orchestrator | 2025-03-22 21:40:44.405252 | orchestrator | PLAY [Create operator user] **************************************************** 2025-03-22 21:40:44.405267 | orchestrator | 2025-03-22 21:40:44.405282 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-03-22 21:40:44.405310 | orchestrator | ok: [testbed-manager] 2025-03-22 21:40:44.450339 | orchestrator | 2025-03-22 21:40:44.450392 | orchestrator | TASK [osism.commons.operator : Gather variables for each operating system] ***** 2025-03-22 21:40:44.450434 | orchestrator | ok: [testbed-manager] 2025-03-22 21:40:44.509615 | orchestrator | 2025-03-22 21:40:44.509666 | orchestrator | TASK [osism.commons.operator : Set operator_groups variable to default value] *** 2025-03-22 21:40:44.509681 | orchestrator | ok: [testbed-manager] 2025-03-22 21:40:45.303766 | orchestrator | 2025-03-22 21:40:45.440255 | orchestrator | TASK [osism.commons.operator : Create operator group] ************************** 2025-03-22 21:40:45.440336 | orchestrator | changed: [testbed-manager] 2025-03-22 21:40:46.081135 | orchestrator | 2025-03-22 21:40:46.350667 | orchestrator | TASK [osism.commons.operator : Create user] ************************************ 2025-03-22 21:40:46.350758 | orchestrator | changed: [testbed-manager] 2025-03-22 21:40:47.709566 | orchestrator | 2025-03-22 21:40:47.709671 | orchestrator | TASK [osism.commons.operator : Add user to additional groups] ****************** 2025-03-22 21:40:47.709706 | orchestrator | changed: [testbed-manager] => (item=adm) 2025-03-22 21:40:49.060048 | orchestrator | changed: [testbed-manager] => (item=sudo) 2025-03-22 21:40:49.060147 | orchestrator | 2025-03-22 21:40:49.060167 | orchestrator | TASK [osism.commons.operator : Copy user sudoers file] ************************* 2025-03-22 21:40:49.060198 | orchestrator | changed: [testbed-manager] 2025-03-22 21:40:50.883265 | orchestrator | 2025-03-22 21:40:50.883321 | orchestrator | TASK [osism.commons.operator : Set language variables in .bashrc configuration file] *** 2025-03-22 21:40:50.883343 | orchestrator | changed: [testbed-manager] => (item=export LANGUAGE=C.UTF-8) 2025-03-22 
21:40:51.471365 | orchestrator | changed: [testbed-manager] => (item=export LANG=C.UTF-8) 2025-03-22 21:40:51.471486 | orchestrator | changed: [testbed-manager] => (item=export LC_ALL=C.UTF-8) 2025-03-22 21:40:51.471507 | orchestrator | 2025-03-22 21:40:51.471523 | orchestrator | TASK [osism.commons.operator : Create .ssh directory] ************************** 2025-03-22 21:40:51.471553 | orchestrator | changed: [testbed-manager] 2025-03-22 21:40:51.536758 | orchestrator | 2025-03-22 21:40:51.536863 | orchestrator | TASK [osism.commons.operator : Check number of SSH authorized keys] ************ 2025-03-22 21:40:51.536897 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:40:52.414377 | orchestrator | 2025-03-22 21:40:52.414497 | orchestrator | TASK [osism.commons.operator : Set ssh authorized keys] ************************ 2025-03-22 21:40:52.414532 | orchestrator | changed: [testbed-manager] => (item=None) 2025-03-22 21:40:52.452904 | orchestrator | changed: [testbed-manager] 2025-03-22 21:40:52.452972 | orchestrator | 2025-03-22 21:40:52.452989 | orchestrator | TASK [osism.commons.operator : Delete ssh authorized keys] ********************* 2025-03-22 21:40:52.453014 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:40:52.490304 | orchestrator | 2025-03-22 21:40:52.490359 | orchestrator | TASK [osism.commons.operator : Set authorized GitHub accounts] ***************** 2025-03-22 21:40:52.490377 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:40:52.519660 | orchestrator | 2025-03-22 21:40:52.519725 | orchestrator | TASK [osism.commons.operator : Delete authorized GitHub accounts] ************** 2025-03-22 21:40:52.519746 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:40:52.564704 | orchestrator | 2025-03-22 21:40:52.564756 | orchestrator | TASK [osism.commons.operator : Set password] *********************************** 2025-03-22 21:40:52.564771 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:40:53.325072 | orchestrator | 2025-03-22 21:40:53.325157 | orchestrator | TASK [osism.commons.operator : Unset & lock password] ************************** 2025-03-22 21:40:53.325189 | orchestrator | ok: [testbed-manager] 2025-03-22 21:40:54.693146 | orchestrator | 2025-03-22 21:40:54.693248 | orchestrator | PLAY [Run manager part 0] ****************************************************** 2025-03-22 21:40:54.693266 | orchestrator | 2025-03-22 21:40:54.693281 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-03-22 21:40:54.693310 | orchestrator | ok: [testbed-manager] 2025-03-22 21:40:55.752567 | orchestrator | 2025-03-22 21:40:55.752608 | orchestrator | TASK [Recursively change ownership of /opt/venv] ******************************* 2025-03-22 21:40:55.752620 | orchestrator | changed: [testbed-manager] 2025-03-22 21:40:55.848676 | orchestrator | 2025-03-22 21:40:55.848756 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 21:40:55.848769 | orchestrator | testbed-manager : ok=33 changed=23 unreachable=0 failed=0 skipped=12 rescued=0 ignored=0 2025-03-22 21:40:55.848885 | orchestrator | 2025-03-22 21:40:55.949073 | orchestrator | changed 2025-03-22 21:40:55.967865 | 2025-03-22 21:40:55.967977 | TASK [Point out that the log in on the manager is now possible] 2025-03-22 21:40:56.015047 | orchestrator | ok: It is now already possible to log in to the manager with 'make login'. 
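Part 0 turns the freshly booted instance into an Ansible control host: dist-upgrade plus HWE kernel, a virtualenv under /opt/venv with ansible-core, netaddr, requests and docker, the source trees under /opt/src, the Galaxy and local osism collections, and the operator user. Condensed into plain shell it amounts to roughly the following; this is an illustrative sketch derived from the task names above, not the playbook itself, and the exact venv creation command is an assumption:

# Illustrative outline of "Run manager part 0" (assumptions noted inline).
sudo apt-get update && sudo apt-get dist-upgrade -y         # "Update APT cache and run dist-upgrade"
python3 -m venv /opt/venv                                   # venv path as used later in the job
/opt/venv/bin/pip install netaddr ansible-core 'requests>=2.32.2' 'docker>=7.1.0'
/opt/venv/bin/ansible-galaxy collection install \
    ansible.netcommon ansible.posix 'community.docker>=3.10.2'
# the osism collections synced to /opt/src are installed from their local checkouts
/opt/venv/bin/ansible-galaxy collection install \
    /opt/src/osism/ansible-collection-commons \
    /opt/src/osism/ansible-collection-services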
2025-03-22 21:40:56.025227 | 2025-03-22 21:40:56.025346 | TASK [Point out that the following task takes some time and does not give any output] 2025-03-22 21:40:56.075107 | orchestrator | ok: The task 'Run manager part 1 + 2' runs an Ansible playbook on the manager. There is no further output of this here. It takes a few minutes for this task to complete. 2025-03-22 21:40:56.086125 | 2025-03-22 21:40:56.086235 | TASK [Run manager part 1 + 2] 2025-03-22 21:40:56.917848 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.2 2025-03-22 21:40:56.972009 | orchestrator | 2025-03-22 21:40:59.489310 | orchestrator | PLAY [Run manager part 1] ****************************************************** 2025-03-22 21:40:59.489360 | orchestrator | 2025-03-22 21:40:59.489382 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-03-22 21:40:59.489398 | orchestrator | ok: [testbed-manager] 2025-03-22 21:40:59.527736 | orchestrator | 2025-03-22 21:40:59.527796 | orchestrator | TASK [Set venv_command fact (RedHat)] ****************************************** 2025-03-22 21:40:59.527820 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:40:59.572812 | orchestrator | 2025-03-22 21:40:59.572862 | orchestrator | TASK [Set venv_command fact (Debian)] ****************************************** 2025-03-22 21:40:59.572880 | orchestrator | ok: [testbed-manager] 2025-03-22 21:40:59.615626 | orchestrator | 2025-03-22 21:40:59.615678 | orchestrator | TASK [osism.commons.repository : Gather variables for each operating system] *** 2025-03-22 21:40:59.615697 | orchestrator | ok: [testbed-manager] 2025-03-22 21:40:59.673943 | orchestrator | 2025-03-22 21:40:59.673991 | orchestrator | TASK [osism.commons.repository : Set repository_default fact to default value] *** 2025-03-22 21:40:59.674006 | orchestrator | ok: [testbed-manager] 2025-03-22 21:40:59.737564 | orchestrator | 2025-03-22 21:40:59.737658 | orchestrator | TASK [osism.commons.repository : Set repositories to default] ****************** 2025-03-22 21:40:59.737698 | orchestrator | ok: [testbed-manager] 2025-03-22 21:40:59.785749 | orchestrator | 2025-03-22 21:40:59.785801 | orchestrator | TASK [osism.commons.repository : Include distribution specific repository tasks] *** 2025-03-22 21:40:59.785815 | orchestrator | included: /home/zuul-testbed06/.ansible/collections/ansible_collections/osism/commons/roles/repository/tasks/Ubuntu.yml for testbed-manager 2025-03-22 21:41:00.517429 | orchestrator | 2025-03-22 21:41:00.517499 | orchestrator | TASK [osism.commons.repository : Create /etc/apt/sources.list.d directory] ***** 2025-03-22 21:41:00.517542 | orchestrator | ok: [testbed-manager] 2025-03-22 21:41:00.565965 | orchestrator | 2025-03-22 21:41:00.566046 | orchestrator | TASK [osism.commons.repository : Include tasks for Ubuntu < 24.04] ************* 2025-03-22 21:41:00.566063 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:41:02.041779 | orchestrator | 2025-03-22 21:41:02.041904 | orchestrator | TASK [osism.commons.repository : Copy 99osism apt configuration] *************** 2025-03-22 21:41:02.041965 | orchestrator | changed: [testbed-manager] 2025-03-22 21:41:02.677641 | orchestrator | 2025-03-22 21:41:02.677718 | orchestrator | TASK [osism.commons.repository : Remove sources.list file] ********************* 2025-03-22 21:41:02.677749 | orchestrator | ok: [testbed-manager] 2025-03-22 21:41:03.906166 | orchestrator | 2025-03-22 21:41:03.906217 | orchestrator | TASK
[osism.commons.repository : Copy ubuntu.sources file] ********************* 2025-03-22 21:41:03.906237 | orchestrator | changed: [testbed-manager] 2025-03-22 21:41:17.860881 | orchestrator | 2025-03-22 21:41:17.860966 | orchestrator | TASK [osism.commons.repository : Update package cache] ************************* 2025-03-22 21:41:17.860997 | orchestrator | changed: [testbed-manager] 2025-03-22 21:41:18.527698 | orchestrator | 2025-03-22 21:41:18.527741 | orchestrator | TASK [Get home directory of ansible user] ************************************** 2025-03-22 21:41:18.527755 | orchestrator | ok: [testbed-manager] 2025-03-22 21:41:18.575030 | orchestrator | 2025-03-22 21:41:18.575069 | orchestrator | TASK [Set repo_path fact] ****************************************************** 2025-03-22 21:41:18.575083 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:41:19.541773 | orchestrator | 2025-03-22 21:41:19.541822 | orchestrator | TASK [Copy SSH public key] ***************************************************** 2025-03-22 21:41:19.541840 | orchestrator | changed: [testbed-manager] 2025-03-22 21:41:20.547861 | orchestrator | 2025-03-22 21:41:20.547904 | orchestrator | TASK [Copy SSH private key] **************************************************** 2025-03-22 21:41:20.547921 | orchestrator | changed: [testbed-manager] 2025-03-22 21:41:21.146473 | orchestrator | 2025-03-22 21:41:21.146586 | orchestrator | TASK [Create configuration directory] ****************************************** 2025-03-22 21:41:21.146638 | orchestrator | changed: [testbed-manager] 2025-03-22 21:41:21.186308 | orchestrator | 2025-03-22 21:41:21.186353 | orchestrator | TASK [Copy testbed repo] ******************************************************* 2025-03-22 21:41:21.186369 | orchestrator | [DEPRECATION WARNING]: The connection's stdin object is deprecated. Call 2025-03-22 21:41:23.382425 | orchestrator | display.prompt_until(msg) instead. This feature will be removed in version 2025-03-22 21:41:23.382490 | orchestrator | 2.19. Deprecation warnings can be disabled by setting 2025-03-22 21:41:23.382500 | orchestrator | deprecation_warnings=False in ansible.cfg. 
2025-03-22 21:41:23.382516 | orchestrator | changed: [testbed-manager] 2025-03-22 21:41:33.568924 | orchestrator | 2025-03-22 21:41:33.569028 | orchestrator | TASK [Install python requirements in venv] ************************************* 2025-03-22 21:41:33.569061 | orchestrator | ok: [testbed-manager] => (item=Jinja2) 2025-03-22 21:41:34.661857 | orchestrator | ok: [testbed-manager] => (item=PyYAML) 2025-03-22 21:41:34.661945 | orchestrator | ok: [testbed-manager] => (item=packaging) 2025-03-22 21:41:34.661966 | orchestrator | changed: [testbed-manager] => (item=python-gilt==1.2.3) 2025-03-22 21:41:34.661982 | orchestrator | ok: [testbed-manager] => (item=requests>=2.32.2) 2025-03-22 21:41:34.661996 | orchestrator | ok: [testbed-manager] => (item=docker>=7.1.0) 2025-03-22 21:41:34.662010 | orchestrator | 2025-03-22 21:41:34.662060 | orchestrator | TASK [Copy testbed custom CA certificate on Debian/Ubuntu] ********************* 2025-03-22 21:41:34.662103 | orchestrator | changed: [testbed-manager] 2025-03-22 21:41:34.696804 | orchestrator | 2025-03-22 21:41:34.696873 | orchestrator | TASK [Copy testbed custom CA certificate on CentOS] **************************** 2025-03-22 21:41:34.696895 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:41:37.306624 | orchestrator | 2025-03-22 21:41:37.306728 | orchestrator | TASK [Run update-ca-certificates on Debian/Ubuntu] ***************************** 2025-03-22 21:41:37.306764 | orchestrator | changed: [testbed-manager] 2025-03-22 21:41:37.348868 | orchestrator | 2025-03-22 21:41:37.348951 | orchestrator | TASK [Run update-ca-trust on RedHat] ******************************************* 2025-03-22 21:41:37.348981 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:43:26.066584 | orchestrator | 2025-03-22 21:43:26.066657 | orchestrator | TASK [Run manager part 2] ****************************************************** 2025-03-22 21:43:26.066675 | orchestrator | changed: [testbed-manager] 2025-03-22 21:43:27.343383 | orchestrator | 2025-03-22 21:43:27.343483 | orchestrator | RUNNING HANDLER [osism.commons.repository : Force update of package cache] ***** 2025-03-22 21:43:27.343516 | orchestrator | ok: [testbed-manager] 2025-03-22 21:43:27.434602 | orchestrator | 2025-03-22 21:43:27.434822 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 21:43:27.434846 | orchestrator | testbed-manager : ok=21 changed=11 unreachable=0 failed=0 skipped=5 rescued=0 ignored=0 2025-03-22 21:43:27.434859 | orchestrator | 2025-03-22 21:43:27.722747 | orchestrator | changed 2025-03-22 21:43:27.746669 | 2025-03-22 21:43:27.746804 | TASK [Reboot manager] 2025-03-22 21:43:29.314098 | orchestrator | changed 2025-03-22 21:43:29.323939 | 2025-03-22 21:43:29.324055 | TASK [Wait up to 300 seconds for port 22 to become open and contain "OpenSSH"] 2025-03-22 21:43:45.702672 | orchestrator | ok 2025-03-22 21:43:45.711297 | 2025-03-22 21:43:45.711431 | TASK [Wait a little longer for the manager so that everything is ready] 2025-03-22 21:44:45.759837 | orchestrator | ok 2025-03-22 21:44:45.770836 | 2025-03-22 21:44:45.770960 | TASK [Deploy manager + bootstrap nodes] 2025-03-22 21:44:48.368281 | orchestrator | 2025-03-22 21:44:48.371506 | orchestrator | # DEPLOY MANAGER 2025-03-22 21:44:48.371556 | orchestrator | 2025-03-22 21:44:48.371575 | orchestrator | + set -e 2025-03-22 21:44:48.371625 | orchestrator | + echo 2025-03-22 21:44:48.371657 | orchestrator | + echo '# DEPLOY MANAGER' 2025-03-22 21:44:48.371686 | 
orchestrator | + echo 2025-03-22 21:44:48.371714 | orchestrator | + cat /opt/manager-vars.sh 2025-03-22 21:44:48.371758 | orchestrator | export NUMBER_OF_NODES=6 2025-03-22 21:44:48.372988 | orchestrator | 2025-03-22 21:44:48.373018 | orchestrator | export CEPH_VERSION=quincy 2025-03-22 21:44:48.373044 | orchestrator | export CONFIGURATION_VERSION=main 2025-03-22 21:44:48.373070 | orchestrator | export MANAGER_VERSION=8.1.0 2025-03-22 21:44:48.373095 | orchestrator | export OPENSTACK_VERSION=2024.1 2025-03-22 21:44:48.373121 | orchestrator | 2025-03-22 21:44:48.373148 | orchestrator | export ARA=false 2025-03-22 21:44:48.373174 | orchestrator | export TEMPEST=false 2025-03-22 21:44:48.373194 | orchestrator | export IS_ZUUL=true 2025-03-22 21:44:48.373208 | orchestrator | 2025-03-22 21:44:48.373224 | orchestrator | export MANAGER_PUBLIC_IP_ADDRESS=81.163.192.83 2025-03-22 21:44:48.373246 | orchestrator | export EXTERNAL_API=false 2025-03-22 21:44:48.373261 | orchestrator | 2025-03-22 21:44:48.373275 | orchestrator | export IMAGE_USER=ubuntu 2025-03-22 21:44:48.373289 | orchestrator | export IMAGE_NODE_USER=ubuntu 2025-03-22 21:44:48.373304 | orchestrator | 2025-03-22 21:44:48.373318 | orchestrator | export CEPH_STACK=ceph-ansible 2025-03-22 21:44:48.373332 | orchestrator | 2025-03-22 21:44:48.373346 | orchestrator | + echo 2025-03-22 21:44:48.373360 | orchestrator | + source /opt/configuration/scripts/include.sh 2025-03-22 21:44:48.373381 | orchestrator | ++ export INTERACTIVE=false 2025-03-22 21:44:48.423504 | orchestrator | ++ INTERACTIVE=false 2025-03-22 21:44:48.423551 | orchestrator | ++ export OSISM_APPLY_RETRY=1 2025-03-22 21:44:48.423577 | orchestrator | ++ OSISM_APPLY_RETRY=1 2025-03-22 21:44:48.423593 | orchestrator | + source /opt/manager-vars.sh 2025-03-22 21:44:48.423607 | orchestrator | ++ export NUMBER_OF_NODES=6 2025-03-22 21:44:48.423621 | orchestrator | ++ NUMBER_OF_NODES=6 2025-03-22 21:44:48.423635 | orchestrator | ++ export CEPH_VERSION=quincy 2025-03-22 21:44:48.423649 | orchestrator | ++ CEPH_VERSION=quincy 2025-03-22 21:44:48.423664 | orchestrator | ++ export CONFIGURATION_VERSION=main 2025-03-22 21:44:48.423679 | orchestrator | ++ CONFIGURATION_VERSION=main 2025-03-22 21:44:48.423701 | orchestrator | ++ export MANAGER_VERSION=8.1.0 2025-03-22 21:44:48.423716 | orchestrator | ++ MANAGER_VERSION=8.1.0 2025-03-22 21:44:48.423730 | orchestrator | ++ export OPENSTACK_VERSION=2024.1 2025-03-22 21:44:48.423744 | orchestrator | ++ OPENSTACK_VERSION=2024.1 2025-03-22 21:44:48.423758 | orchestrator | ++ export ARA=false 2025-03-22 21:44:48.423773 | orchestrator | ++ ARA=false 2025-03-22 21:44:48.423787 | orchestrator | ++ export TEMPEST=false 2025-03-22 21:44:48.423801 | orchestrator | ++ TEMPEST=false 2025-03-22 21:44:48.423815 | orchestrator | ++ export IS_ZUUL=true 2025-03-22 21:44:48.423829 | orchestrator | ++ IS_ZUUL=true 2025-03-22 21:44:48.423843 | orchestrator | ++ export MANAGER_PUBLIC_IP_ADDRESS=81.163.192.83 2025-03-22 21:44:48.423857 | orchestrator | ++ MANAGER_PUBLIC_IP_ADDRESS=81.163.192.83 2025-03-22 21:44:48.423878 | orchestrator | ++ export EXTERNAL_API=false 2025-03-22 21:44:48.423893 | orchestrator | ++ EXTERNAL_API=false 2025-03-22 21:44:48.423907 | orchestrator | ++ export IMAGE_USER=ubuntu 2025-03-22 21:44:48.423921 | orchestrator | ++ IMAGE_USER=ubuntu 2025-03-22 21:44:48.423935 | orchestrator | ++ export IMAGE_NODE_USER=ubuntu 2025-03-22 21:44:48.423950 | orchestrator | ++ IMAGE_NODE_USER=ubuntu 2025-03-22 21:44:48.423967 | orchestrator | ++ export 
CEPH_STACK=ceph-ansible 2025-03-22 21:44:48.423981 | orchestrator | ++ CEPH_STACK=ceph-ansible 2025-03-22 21:44:48.423995 | orchestrator | + sudo ln -sf /opt/configuration/contrib/semver2.sh /usr/local/bin/semver 2025-03-22 21:44:48.424026 | orchestrator | + docker version 2025-03-22 21:44:48.711786 | orchestrator | Client: Docker Engine - Community 2025-03-22 21:44:48.713829 | orchestrator | Version: 26.1.4 2025-03-22 21:44:48.713870 | orchestrator | API version: 1.45 2025-03-22 21:44:48.713885 | orchestrator | Go version: go1.21.11 2025-03-22 21:44:48.713899 | orchestrator | Git commit: 5650f9b 2025-03-22 21:44:48.713913 | orchestrator | Built: Wed Jun 5 11:28:57 2024 2025-03-22 21:44:48.713928 | orchestrator | OS/Arch: linux/amd64 2025-03-22 21:44:48.713942 | orchestrator | Context: default 2025-03-22 21:44:48.713956 | orchestrator | 2025-03-22 21:44:48.713971 | orchestrator | Server: Docker Engine - Community 2025-03-22 21:44:48.713985 | orchestrator | Engine: 2025-03-22 21:44:48.713999 | orchestrator | Version: 26.1.4 2025-03-22 21:44:48.714092 | orchestrator | API version: 1.45 (minimum version 1.24) 2025-03-22 21:44:48.714111 | orchestrator | Go version: go1.21.11 2025-03-22 21:44:48.714129 | orchestrator | Git commit: de5c9cf 2025-03-22 21:44:48.714170 | orchestrator | Built: Wed Jun 5 11:28:57 2024 2025-03-22 21:44:48.714184 | orchestrator | OS/Arch: linux/amd64 2025-03-22 21:44:48.714198 | orchestrator | Experimental: false 2025-03-22 21:44:48.714212 | orchestrator | containerd: 2025-03-22 21:44:48.714226 | orchestrator | Version: 1.7.25 2025-03-22 21:44:48.714240 | orchestrator | GitCommit: bcc810d6b9066471b0b6fa75f557a15a1cbf31bb 2025-03-22 21:44:48.714254 | orchestrator | runc: 2025-03-22 21:44:48.714268 | orchestrator | Version: 1.2.4 2025-03-22 21:44:48.714282 | orchestrator | GitCommit: v1.2.4-0-g6c52b3f 2025-03-22 21:44:48.714296 | orchestrator | docker-init: 2025-03-22 21:44:48.714309 | orchestrator | Version: 0.19.0 2025-03-22 21:44:48.714324 | orchestrator | GitCommit: de40ad0 2025-03-22 21:44:48.714345 | orchestrator | + sh -c /opt/configuration/scripts/deploy/000-manager.sh 2025-03-22 21:44:48.722928 | orchestrator | + set -e 2025-03-22 21:44:48.727152 | orchestrator | + source /opt/manager-vars.sh 2025-03-22 21:44:48.727190 | orchestrator | ++ export NUMBER_OF_NODES=6 2025-03-22 21:44:48.727205 | orchestrator | ++ NUMBER_OF_NODES=6 2025-03-22 21:44:48.727220 | orchestrator | ++ export CEPH_VERSION=quincy 2025-03-22 21:44:48.727234 | orchestrator | ++ CEPH_VERSION=quincy 2025-03-22 21:44:48.727248 | orchestrator | ++ export CONFIGURATION_VERSION=main 2025-03-22 21:44:48.727263 | orchestrator | ++ CONFIGURATION_VERSION=main 2025-03-22 21:44:48.727278 | orchestrator | ++ export MANAGER_VERSION=8.1.0 2025-03-22 21:44:48.727292 | orchestrator | ++ MANAGER_VERSION=8.1.0 2025-03-22 21:44:48.727306 | orchestrator | ++ export OPENSTACK_VERSION=2024.1 2025-03-22 21:44:48.727321 | orchestrator | ++ OPENSTACK_VERSION=2024.1 2025-03-22 21:44:48.727335 | orchestrator | ++ export ARA=false 2025-03-22 21:44:48.727349 | orchestrator | ++ ARA=false 2025-03-22 21:44:48.727362 | orchestrator | ++ export TEMPEST=false 2025-03-22 21:44:48.727377 | orchestrator | ++ TEMPEST=false 2025-03-22 21:44:48.727390 | orchestrator | ++ export IS_ZUUL=true 2025-03-22 21:44:48.727546 | orchestrator | ++ IS_ZUUL=true 2025-03-22 21:44:48.727584 | orchestrator | ++ export MANAGER_PUBLIC_IP_ADDRESS=81.163.192.83 2025-03-22 21:44:48.727599 | orchestrator | ++ MANAGER_PUBLIC_IP_ADDRESS=81.163.192.83 
2025-03-22 21:44:48.727613 | orchestrator | ++ export EXTERNAL_API=false 2025-03-22 21:44:48.727643 | orchestrator | ++ EXTERNAL_API=false 2025-03-22 21:44:48.727657 | orchestrator | ++ export IMAGE_USER=ubuntu 2025-03-22 21:44:48.727671 | orchestrator | ++ IMAGE_USER=ubuntu 2025-03-22 21:44:48.727686 | orchestrator | ++ export IMAGE_NODE_USER=ubuntu 2025-03-22 21:44:48.727700 | orchestrator | ++ IMAGE_NODE_USER=ubuntu 2025-03-22 21:44:48.727714 | orchestrator | ++ export CEPH_STACK=ceph-ansible 2025-03-22 21:44:48.727728 | orchestrator | ++ CEPH_STACK=ceph-ansible 2025-03-22 21:44:48.727747 | orchestrator | + source /opt/configuration/scripts/include.sh 2025-03-22 21:44:48.727767 | orchestrator | ++ export INTERACTIVE=false 2025-03-22 21:44:48.727781 | orchestrator | ++ INTERACTIVE=false 2025-03-22 21:44:48.727795 | orchestrator | ++ export OSISM_APPLY_RETRY=1 2025-03-22 21:44:48.727809 | orchestrator | ++ OSISM_APPLY_RETRY=1 2025-03-22 21:44:48.727823 | orchestrator | + [[ 8.1.0 != \l\a\t\e\s\t ]] 2025-03-22 21:44:48.727839 | orchestrator | + /opt/configuration/scripts/set-manager-version.sh 8.1.0 2025-03-22 21:44:48.727865 | orchestrator | + set -e 2025-03-22 21:44:48.732772 | orchestrator | + VERSION=8.1.0 2025-03-22 21:44:48.732803 | orchestrator | + sed -i 's/manager_version: .*/manager_version: 8.1.0/g' /opt/configuration/environments/manager/configuration.yml 2025-03-22 21:44:48.732832 | orchestrator | + [[ 8.1.0 != \l\a\t\e\s\t ]] 2025-03-22 21:44:48.738491 | orchestrator | + sed -i /ceph_version:/d /opt/configuration/environments/manager/configuration.yml 2025-03-22 21:44:48.738531 | orchestrator | + sed -i /openstack_version:/d /opt/configuration/environments/manager/configuration.yml 2025-03-22 21:44:48.741398 | orchestrator | + sh -c /opt/configuration/scripts/sync-configuration-repository.sh 2025-03-22 21:44:48.752260 | orchestrator | /opt/configuration ~ 2025-03-22 21:44:48.755373 | orchestrator | + set -e 2025-03-22 21:44:48.755425 | orchestrator | + pushd /opt/configuration 2025-03-22 21:44:48.755442 | orchestrator | + [[ -e /opt/venv/bin/activate ]] 2025-03-22 21:44:48.755462 | orchestrator | + source /opt/venv/bin/activate 2025-03-22 21:44:48.756651 | orchestrator | ++ deactivate nondestructive 2025-03-22 21:44:48.756940 | orchestrator | ++ '[' -n '' ']' 2025-03-22 21:44:48.757029 | orchestrator | ++ '[' -n '' ']' 2025-03-22 21:44:48.757046 | orchestrator | ++ hash -r 2025-03-22 21:44:48.757060 | orchestrator | ++ '[' -n '' ']' 2025-03-22 21:44:48.757104 | orchestrator | ++ unset VIRTUAL_ENV 2025-03-22 21:44:48.757150 | orchestrator | ++ unset VIRTUAL_ENV_PROMPT 2025-03-22 21:44:48.757165 | orchestrator | ++ '[' '!' 
nondestructive = nondestructive ']' 2025-03-22 21:44:48.757266 | orchestrator | ++ '[' linux-gnu = cygwin ']' 2025-03-22 21:44:48.757366 | orchestrator | ++ '[' linux-gnu = msys ']' 2025-03-22 21:44:48.757395 | orchestrator | ++ export VIRTUAL_ENV=/opt/venv 2025-03-22 21:44:48.757520 | orchestrator | ++ VIRTUAL_ENV=/opt/venv 2025-03-22 21:44:48.757539 | orchestrator | ++ _OLD_VIRTUAL_PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2025-03-22 21:44:48.757554 | orchestrator | ++ PATH=/opt/venv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2025-03-22 21:44:48.757569 | orchestrator | ++ export PATH 2025-03-22 21:44:48.757583 | orchestrator | ++ '[' -n '' ']' 2025-03-22 21:44:48.757602 | orchestrator | ++ '[' -z '' ']' 2025-03-22 21:44:48.757733 | orchestrator | ++ _OLD_VIRTUAL_PS1= 2025-03-22 21:44:48.757754 | orchestrator | ++ PS1='(venv) ' 2025-03-22 21:44:48.757769 | orchestrator | ++ export PS1 2025-03-22 21:44:48.757787 | orchestrator | ++ VIRTUAL_ENV_PROMPT='(venv) ' 2025-03-22 21:44:48.757815 | orchestrator | ++ export VIRTUAL_ENV_PROMPT 2025-03-22 21:44:48.757830 | orchestrator | ++ hash -r 2025-03-22 21:44:48.757848 | orchestrator | + pip3 install --no-cache-dir python-gilt==1.2.3 requests Jinja2 PyYAML packaging 2025-03-22 21:44:50.011301 | orchestrator | Requirement already satisfied: python-gilt==1.2.3 in /opt/venv/lib/python3.12/site-packages (1.2.3) 2025-03-22 21:44:50.012140 | orchestrator | Requirement already satisfied: requests in /opt/venv/lib/python3.12/site-packages (2.32.3) 2025-03-22 21:44:50.013789 | orchestrator | Requirement already satisfied: Jinja2 in /opt/venv/lib/python3.12/site-packages (3.1.6) 2025-03-22 21:44:50.015319 | orchestrator | Requirement already satisfied: PyYAML in /opt/venv/lib/python3.12/site-packages (6.0.2) 2025-03-22 21:44:50.017068 | orchestrator | Requirement already satisfied: packaging in /opt/venv/lib/python3.12/site-packages (24.2) 2025-03-22 21:44:50.030279 | orchestrator | Requirement already satisfied: click in /opt/venv/lib/python3.12/site-packages (from python-gilt==1.2.3) (8.1.8) 2025-03-22 21:44:50.031656 | orchestrator | Requirement already satisfied: colorama in /opt/venv/lib/python3.12/site-packages (from python-gilt==1.2.3) (0.4.6) 2025-03-22 21:44:50.032787 | orchestrator | Requirement already satisfied: fasteners in /opt/venv/lib/python3.12/site-packages (from python-gilt==1.2.3) (0.19) 2025-03-22 21:44:50.034137 | orchestrator | Requirement already satisfied: sh in /opt/venv/lib/python3.12/site-packages (from python-gilt==1.2.3) (2.2.2) 2025-03-22 21:44:50.076753 | orchestrator | Requirement already satisfied: charset-normalizer<4,>=2 in /opt/venv/lib/python3.12/site-packages (from requests) (3.4.1) 2025-03-22 21:44:50.078426 | orchestrator | Requirement already satisfied: idna<4,>=2.5 in /opt/venv/lib/python3.12/site-packages (from requests) (3.10) 2025-03-22 21:44:50.079874 | orchestrator | Requirement already satisfied: urllib3<3,>=1.21.1 in /opt/venv/lib/python3.12/site-packages (from requests) (2.3.0) 2025-03-22 21:44:50.081552 | orchestrator | Requirement already satisfied: certifi>=2017.4.17 in /opt/venv/lib/python3.12/site-packages (from requests) (2025.1.31) 2025-03-22 21:44:50.086124 | orchestrator | Requirement already satisfied: MarkupSafe>=2.0 in /opt/venv/lib/python3.12/site-packages (from Jinja2) (3.0.2) 2025-03-22 21:44:50.317304 | orchestrator | ++ which gilt 2025-03-22 21:44:50.323227 | 
orchestrator | + GILT=/opt/venv/bin/gilt 2025-03-22 21:44:50.613240 | orchestrator | + /opt/venv/bin/gilt overlay 2025-03-22 21:44:50.613301 | orchestrator | osism.cfg-generics: 2025-03-22 21:44:52.117240 | orchestrator | - cloning osism.cfg-generics to /home/dragon/.gilt/clone/github.com/osism.cfg-generics 2025-03-22 21:44:52.117381 | orchestrator | - copied (main) /home/dragon/.gilt/clone/github.com/osism.cfg-generics/environments/manager/images.yml to /opt/configuration/environments/manager/ 2025-03-22 21:44:53.121126 | orchestrator | - copied (main) /home/dragon/.gilt/clone/github.com/osism.cfg-generics/src/render-images.py to /opt/configuration/environments/manager/ 2025-03-22 21:44:53.121229 | orchestrator | - copied (main) /home/dragon/.gilt/clone/github.com/osism.cfg-generics/src/set-versions.py to /opt/configuration/environments/ 2025-03-22 21:44:53.121247 | orchestrator | - running `/opt/configuration/scripts/wrapper-gilt.sh render-images` in /opt/configuration/environments/manager/ 2025-03-22 21:44:53.121281 | orchestrator | - running `rm render-images.py` in /opt/configuration/environments/manager/ 2025-03-22 21:44:53.134290 | orchestrator | - running `/opt/configuration/scripts/wrapper-gilt.sh set-versions` in /opt/configuration/environments/ 2025-03-22 21:44:53.533362 | orchestrator | - running `rm set-versions.py` in /opt/configuration/environments/ 2025-03-22 21:44:53.601378 | orchestrator | + [[ -e /opt/venv/bin/activate ]] 2025-03-22 21:44:53.603202 | orchestrator | + deactivate 2025-03-22 21:44:53.603323 | orchestrator | + '[' -n /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin ']' 2025-03-22 21:44:53.603341 | orchestrator | + PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2025-03-22 21:44:53.603351 | orchestrator | + export PATH 2025-03-22 21:44:53.603361 | orchestrator | + unset _OLD_VIRTUAL_PATH 2025-03-22 21:44:53.603371 | orchestrator | + '[' -n '' ']' 2025-03-22 21:44:53.603381 | orchestrator | + hash -r 2025-03-22 21:44:53.603390 | orchestrator | + '[' -n '' ']' 2025-03-22 21:44:53.603423 | orchestrator | + unset VIRTUAL_ENV 2025-03-22 21:44:53.603433 | orchestrator | + unset VIRTUAL_ENV_PROMPT 2025-03-22 21:44:53.603443 | orchestrator | + '[' '!' 
'' = nondestructive ']' 2025-03-22 21:44:53.603453 | orchestrator | + unset -f deactivate 2025-03-22 21:44:53.603467 | orchestrator | + popd 2025-03-22 21:44:53.603477 | orchestrator | ~ 2025-03-22 21:44:53.603502 | orchestrator | + [[ 8.1.0 == \l\a\t\e\s\t ]] 2025-03-22 21:44:53.603693 | orchestrator | + [[ ceph-ansible == \r\o\o\k ]] 2025-03-22 21:44:53.603712 | orchestrator | ++ semver 8.1.0 7.0.0 2025-03-22 21:44:53.657872 | orchestrator | + [[ 1 -ge 0 ]] 2025-03-22 21:44:53.696806 | orchestrator | + echo 'enable_osism_kubernetes: true' 2025-03-22 21:44:53.696852 | orchestrator | + /opt/configuration/scripts/enable-resource-nodes.sh 2025-03-22 21:44:53.696875 | orchestrator | + [[ -e /opt/venv/bin/activate ]] 2025-03-22 21:44:53.697191 | orchestrator | + source /opt/venv/bin/activate 2025-03-22 21:44:53.697210 | orchestrator | ++ deactivate nondestructive 2025-03-22 21:44:53.697362 | orchestrator | ++ '[' -n '' ']' 2025-03-22 21:44:53.697386 | orchestrator | ++ '[' -n '' ']' 2025-03-22 21:44:53.697430 | orchestrator | ++ hash -r 2025-03-22 21:44:53.697444 | orchestrator | ++ '[' -n '' ']' 2025-03-22 21:44:53.697456 | orchestrator | ++ unset VIRTUAL_ENV 2025-03-22 21:44:53.697471 | orchestrator | ++ unset VIRTUAL_ENV_PROMPT 2025-03-22 21:44:53.697567 | orchestrator | ++ '[' '!' nondestructive = nondestructive ']' 2025-03-22 21:44:53.697585 | orchestrator | ++ '[' linux-gnu = cygwin ']' 2025-03-22 21:44:53.697646 | orchestrator | ++ '[' linux-gnu = msys ']' 2025-03-22 21:44:53.697821 | orchestrator | ++ export VIRTUAL_ENV=/opt/venv 2025-03-22 21:44:53.697899 | orchestrator | ++ VIRTUAL_ENV=/opt/venv 2025-03-22 21:44:53.697913 | orchestrator | ++ _OLD_VIRTUAL_PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2025-03-22 21:44:53.697928 | orchestrator | ++ PATH=/opt/venv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2025-03-22 21:44:53.697998 | orchestrator | ++ export PATH 2025-03-22 21:44:53.698168 | orchestrator | ++ '[' -n '' ']' 2025-03-22 21:44:53.698264 | orchestrator | ++ '[' -z '' ']' 2025-03-22 21:44:53.698315 | orchestrator | ++ _OLD_VIRTUAL_PS1= 2025-03-22 21:44:53.698352 | orchestrator | ++ PS1='(venv) ' 2025-03-22 21:44:53.698513 | orchestrator | ++ export PS1 2025-03-22 21:44:53.698531 | orchestrator | ++ VIRTUAL_ENV_PROMPT='(venv) ' 2025-03-22 21:44:53.698777 | orchestrator | ++ export VIRTUAL_ENV_PROMPT 2025-03-22 21:44:53.698794 | orchestrator | ++ hash -r 2025-03-22 21:44:53.698810 | orchestrator | + ansible-playbook -i testbed-manager, --vault-password-file /opt/configuration/environments/.vault_pass /opt/configuration/ansible/manager-part-3.yml 2025-03-22 21:44:55.141215 | orchestrator | 2025-03-22 21:44:55.809563 | orchestrator | PLAY [Copy custom facts] ******************************************************* 2025-03-22 21:44:55.809689 | orchestrator | 2025-03-22 21:44:55.809710 | orchestrator | TASK [Create custom facts directory] ******************************************* 2025-03-22 21:44:55.809743 | orchestrator | ok: [testbed-manager] 2025-03-22 21:44:56.936719 | orchestrator | 2025-03-22 21:44:56.936778 | orchestrator | TASK [Copy fact files] ********************************************************* 2025-03-22 21:44:56.936791 | orchestrator | changed: [testbed-manager] 2025-03-22 21:44:59.583258 | orchestrator | 2025-03-22 21:44:59.583376 | orchestrator | PLAY [Before the deployment of the manager] ************************************ 2025-03-22 
21:44:59.583431 | orchestrator | 2025-03-22 21:44:59.583449 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-03-22 21:44:59.583480 | orchestrator | ok: [testbed-manager] 2025-03-22 21:45:06.089574 | orchestrator | 2025-03-22 21:45:06.089701 | orchestrator | TASK [Pull images] ************************************************************* 2025-03-22 21:45:06.089769 | orchestrator | changed: [testbed-manager] => (item=registry.osism.tech/osism/ara-server:1.7.2) 2025-03-22 21:46:02.564914 | orchestrator | changed: [testbed-manager] => (item=index.docker.io/library/mariadb:11.6.2) 2025-03-22 21:46:02.565087 | orchestrator | changed: [testbed-manager] => (item=registry.osism.tech/osism/ceph-ansible:8.1.0) 2025-03-22 21:46:02.565118 | orchestrator | changed: [testbed-manager] => (item=registry.osism.tech/osism/inventory-reconciler:8.1.0) 2025-03-22 21:46:02.565143 | orchestrator | changed: [testbed-manager] => (item=registry.osism.tech/osism/kolla-ansible:8.1.0) 2025-03-22 21:46:02.565168 | orchestrator | changed: [testbed-manager] => (item=index.docker.io/library/redis:7.4.1-alpine) 2025-03-22 21:46:02.565192 | orchestrator | changed: [testbed-manager] => (item=registry.osism.tech/osism/netbox:v4.1.7) 2025-03-22 21:46:02.565215 | orchestrator | changed: [testbed-manager] => (item=registry.osism.tech/osism/osism-ansible:8.1.0) 2025-03-22 21:46:02.565238 | orchestrator | changed: [testbed-manager] => (item=registry.osism.tech/osism/osism:0.20241219.2) 2025-03-22 21:46:02.565272 | orchestrator | changed: [testbed-manager] => (item=index.docker.io/library/postgres:16.6-alpine) 2025-03-22 21:46:02.565297 | orchestrator | changed: [testbed-manager] => (item=index.docker.io/library/traefik:v3.2.1) 2025-03-22 21:46:02.565320 | orchestrator | changed: [testbed-manager] => (item=index.docker.io/hashicorp/vault:1.18.2) 2025-03-22 21:46:02.565343 | orchestrator | 2025-03-22 21:46:02.565366 | orchestrator | TASK [Check status] ************************************************************ 2025-03-22 21:46:02.565477 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check status (120 retries left). 2025-03-22 21:46:02.627766 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check status (119 retries left). 2025-03-22 21:46:02.627826 | orchestrator | changed: [testbed-manager] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j203114268523.1563', 'results_file': '/home/dragon/.ansible_async/j203114268523.1563', 'changed': True, 'item': 'registry.osism.tech/osism/ara-server:1.7.2', 'ansible_loop_var': 'item'}) 2025-03-22 21:46:02.627856 | orchestrator | changed: [testbed-manager] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j785944598083.1588', 'results_file': '/home/dragon/.ansible_async/j785944598083.1588', 'changed': True, 'item': 'index.docker.io/library/mariadb:11.6.2', 'ansible_loop_var': 'item'}) 2025-03-22 21:46:02.627871 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check status (120 retries left). 2025-03-22 21:46:02.627889 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check status (119 retries left). 
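Editor's note: the "Pull images" task above starts one asynchronous pull per image and the "Check status" task then polls the async job results until every pull has finished, so the FAILED - RETRYING lines here are expected; the per-image results continue just below. In plain shell terms the pattern is roughly the following sketch (image names taken from the log, the loop itself is illustrative and not the playbook's implementation):

    for image in \
        registry.osism.tech/osism/ara-server:1.7.2 \
        index.docker.io/library/mariadb:11.6.2 \
        registry.osism.tech/osism/kolla-ansible:8.1.0; do    # remaining images as listed above
        docker pull "$image" &    # start pulls in parallel, like the async Ansible task
    done
    wait                          # block until all pulls finish, like the Check status polling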
2025-03-22 21:46:02.627904 | orchestrator | changed: [testbed-manager] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j145698449727.1613', 'results_file': '/home/dragon/.ansible_async/j145698449727.1613', 'changed': True, 'item': 'registry.osism.tech/osism/ceph-ansible:8.1.0', 'ansible_loop_var': 'item'}) 2025-03-22 21:46:02.627926 | orchestrator | changed: [testbed-manager] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j584996508291.1645', 'results_file': '/home/dragon/.ansible_async/j584996508291.1645', 'changed': True, 'item': 'registry.osism.tech/osism/inventory-reconciler:8.1.0', 'ansible_loop_var': 'item'}) 2025-03-22 21:46:02.627945 | orchestrator | changed: [testbed-manager] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j7706616672.1681', 'results_file': '/home/dragon/.ansible_async/j7706616672.1681', 'changed': True, 'item': 'registry.osism.tech/osism/kolla-ansible:8.1.0', 'ansible_loop_var': 'item'}) 2025-03-22 21:46:02.627960 | orchestrator | changed: [testbed-manager] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j245841269628.1713', 'results_file': '/home/dragon/.ansible_async/j245841269628.1713', 'changed': True, 'item': 'index.docker.io/library/redis:7.4.1-alpine', 'ansible_loop_var': 'item'}) 2025-03-22 21:46:02.627974 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check status (120 retries left). 2025-03-22 21:46:02.627988 | orchestrator | changed: [testbed-manager] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j399618828161.1754', 'results_file': '/home/dragon/.ansible_async/j399618828161.1754', 'changed': True, 'item': 'registry.osism.tech/osism/netbox:v4.1.7', 'ansible_loop_var': 'item'}) 2025-03-22 21:46:02.628030 | orchestrator | changed: [testbed-manager] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j443738109488.1779', 'results_file': '/home/dragon/.ansible_async/j443738109488.1779', 'changed': True, 'item': 'registry.osism.tech/osism/osism-ansible:8.1.0', 'ansible_loop_var': 'item'}) 2025-03-22 21:46:02.628045 | orchestrator | changed: [testbed-manager] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j947642648165.1811', 'results_file': '/home/dragon/.ansible_async/j947642648165.1811', 'changed': True, 'item': 'registry.osism.tech/osism/osism:0.20241219.2', 'ansible_loop_var': 'item'}) 2025-03-22 21:46:02.628060 | orchestrator | changed: [testbed-manager] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j934283326038.1843', 'results_file': '/home/dragon/.ansible_async/j934283326038.1843', 'changed': True, 'item': 'index.docker.io/library/postgres:16.6-alpine', 'ansible_loop_var': 'item'}) 2025-03-22 21:46:02.628074 | orchestrator | changed: [testbed-manager] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j597875110609.1870', 'results_file': '/home/dragon/.ansible_async/j597875110609.1870', 'changed': True, 'item': 'index.docker.io/library/traefik:v3.2.1', 'ansible_loop_var': 'item'}) 2025-03-22 21:46:02.628088 | orchestrator | changed: [testbed-manager] => (item={'failed': 0, 'started': 1, 'finished': 0, 'ansible_job_id': 'j149251130402.1909', 'results_file': '/home/dragon/.ansible_async/j149251130402.1909', 'changed': True, 'item': 'index.docker.io/hashicorp/vault:1.18.2', 'ansible_loop_var': 'item'}) 2025-03-22 21:46:02.628102 | orchestrator | 2025-03-22 21:46:02.628117 | orchestrator | TASK [Get /opt/manager-vars.sh] 
************************************************ 2025-03-22 21:46:02.628142 | orchestrator | ok: [testbed-manager] 2025-03-22 21:46:03.193121 | orchestrator | 2025-03-22 21:46:03.193243 | orchestrator | TASK [Add ara_server_mariadb_volume_type parameter] **************************** 2025-03-22 21:46:03.193280 | orchestrator | changed: [testbed-manager] 2025-03-22 21:46:03.565391 | orchestrator | 2025-03-22 21:46:03.565507 | orchestrator | TASK [Add netbox_postgres_volume_type parameter] ******************************* 2025-03-22 21:46:03.565536 | orchestrator | changed: [testbed-manager] 2025-03-22 21:46:03.917996 | orchestrator | 2025-03-22 21:46:03.918112 | orchestrator | TASK [Install HWE kernel package on Ubuntu] ************************************ 2025-03-22 21:46:03.918141 | orchestrator | changed: [testbed-manager] 2025-03-22 21:46:03.976207 | orchestrator | 2025-03-22 21:46:03.976247 | orchestrator | TASK [Use insecure glance configuration] *************************************** 2025-03-22 21:46:03.976271 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:46:04.362450 | orchestrator | 2025-03-22 21:46:04.362569 | orchestrator | TASK [Check if /etc/OTC_region exist] ****************************************** 2025-03-22 21:46:04.362605 | orchestrator | ok: [testbed-manager] 2025-03-22 21:46:04.530538 | orchestrator | 2025-03-22 21:46:04.530650 | orchestrator | TASK [Add nova_compute_virt_type parameter] ************************************ 2025-03-22 21:46:04.530696 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:46:06.693688 | orchestrator | 2025-03-22 21:46:06.693788 | orchestrator | PLAY [Apply role traefik & netbox] ********************************************* 2025-03-22 21:46:06.693802 | orchestrator | 2025-03-22 21:46:06.693814 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-03-22 21:46:06.693838 | orchestrator | ok: [testbed-manager] 2025-03-22 21:46:06.946378 | orchestrator | 2025-03-22 21:46:06.946484 | orchestrator | TASK [Apply traefik role] ****************************************************** 2025-03-22 21:46:06.946514 | orchestrator | 2025-03-22 21:46:07.062646 | orchestrator | TASK [osism.services.traefik : Include config tasks] *************************** 2025-03-22 21:46:07.062741 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/traefik/tasks/config.yml for testbed-manager 2025-03-22 21:46:08.276584 | orchestrator | 2025-03-22 21:46:08.276724 | orchestrator | TASK [osism.services.traefik : Create required directories] ******************** 2025-03-22 21:46:08.276758 | orchestrator | changed: [testbed-manager] => (item=/opt/traefik) 2025-03-22 21:46:10.385875 | orchestrator | changed: [testbed-manager] => (item=/opt/traefik/certificates) 2025-03-22 21:46:10.386068 | orchestrator | changed: [testbed-manager] => (item=/opt/traefik/configuration) 2025-03-22 21:46:10.386090 | orchestrator | 2025-03-22 21:46:10.386105 | orchestrator | TASK [osism.services.traefik : Copy configuration files] *********************** 2025-03-22 21:46:10.386137 | orchestrator | changed: [testbed-manager] => (item=traefik.yml) 2025-03-22 21:46:11.106924 | orchestrator | changed: [testbed-manager] => (item=traefik.env) 2025-03-22 21:46:11.107040 | orchestrator | changed: [testbed-manager] => (item=certificates.yml) 2025-03-22 21:46:11.107059 | orchestrator | 2025-03-22 21:46:11.107075 | orchestrator | TASK [osism.services.traefik : Copy certificate cert files] 
******************** 2025-03-22 21:46:11.107107 | orchestrator | changed: [testbed-manager] => (item=None) 2025-03-22 21:46:11.802713 | orchestrator | changed: [testbed-manager] 2025-03-22 21:46:11.802812 | orchestrator | 2025-03-22 21:46:11.802832 | orchestrator | TASK [osism.services.traefik : Copy certificate key files] ********************* 2025-03-22 21:46:11.802861 | orchestrator | changed: [testbed-manager] => (item=None) 2025-03-22 21:46:11.879861 | orchestrator | changed: [testbed-manager] 2025-03-22 21:46:11.879927 | orchestrator | 2025-03-22 21:46:11.879943 | orchestrator | TASK [osism.services.traefik : Copy dynamic configuration] ********************* 2025-03-22 21:46:11.879969 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:46:12.291087 | orchestrator | 2025-03-22 21:46:12.291186 | orchestrator | TASK [osism.services.traefik : Remove dynamic configuration] ******************* 2025-03-22 21:46:12.291220 | orchestrator | ok: [testbed-manager] 2025-03-22 21:46:12.396636 | orchestrator | 2025-03-22 21:46:12.396697 | orchestrator | TASK [osism.services.traefik : Include service tasks] ************************** 2025-03-22 21:46:12.396726 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/traefik/tasks/service.yml for testbed-manager 2025-03-22 21:46:13.514112 | orchestrator | 2025-03-22 21:46:13.514211 | orchestrator | TASK [osism.services.traefik : Create traefik external network] **************** 2025-03-22 21:46:13.514241 | orchestrator | changed: [testbed-manager] 2025-03-22 21:46:14.474557 | orchestrator | 2025-03-22 21:46:14.474653 | orchestrator | TASK [osism.services.traefik : Copy docker-compose.yml file] ******************* 2025-03-22 21:46:14.474685 | orchestrator | changed: [testbed-manager] 2025-03-22 21:46:17.709141 | orchestrator | 2025-03-22 21:46:17.709256 | orchestrator | TASK [osism.services.traefik : Manage traefik service] ************************* 2025-03-22 21:46:17.709289 | orchestrator | changed: [testbed-manager] 2025-03-22 21:46:18.039362 | orchestrator | 2025-03-22 21:46:18.039490 | orchestrator | TASK [Apply netbox role] ******************************************************* 2025-03-22 21:46:18.039521 | orchestrator | 2025-03-22 21:46:18.189064 | orchestrator | TASK [osism.services.netbox : Include install tasks] *************************** 2025-03-22 21:46:18.189125 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netbox/tasks/install-Debian-family.yml for testbed-manager 2025-03-22 21:46:21.229622 | orchestrator | 2025-03-22 21:46:21.229752 | orchestrator | TASK [osism.services.netbox : Install required packages] *********************** 2025-03-22 21:46:21.229789 | orchestrator | ok: [testbed-manager] 2025-03-22 21:46:21.390297 | orchestrator | 2025-03-22 21:46:21.390425 | orchestrator | TASK [osism.services.netbox : Include config tasks] **************************** 2025-03-22 21:46:21.390456 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netbox/tasks/config.yml for testbed-manager 2025-03-22 21:46:22.617731 | orchestrator | 2025-03-22 21:46:22.617839 | orchestrator | TASK [osism.services.netbox : Create required directories] ********************* 2025-03-22 21:46:22.617875 | orchestrator | changed: [testbed-manager] => (item=/opt/netbox) 2025-03-22 21:46:22.730558 | orchestrator | changed: [testbed-manager] => (item=/opt/netbox/configuration) 2025-03-22 21:46:22.730643 | orchestrator | 
changed: [testbed-manager] => (item=/opt/netbox/secrets) 2025-03-22 21:46:22.730658 | orchestrator | 2025-03-22 21:46:22.730674 | orchestrator | TASK [osism.services.netbox : Include postgres config tasks] ******************* 2025-03-22 21:46:22.730702 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netbox/tasks/config-postgres.yml for testbed-manager 2025-03-22 21:46:23.495036 | orchestrator | 2025-03-22 21:46:23.495153 | orchestrator | TASK [osism.services.netbox : Copy postgres environment files] ***************** 2025-03-22 21:46:23.495190 | orchestrator | changed: [testbed-manager] => (item=postgres) 2025-03-22 21:46:24.231847 | orchestrator | 2025-03-22 21:46:24.231981 | orchestrator | TASK [osism.services.netbox : Copy secret files] ******************************* 2025-03-22 21:46:24.232029 | orchestrator | changed: [testbed-manager] => (item=None) 2025-03-22 21:46:24.673733 | orchestrator | changed: [testbed-manager] 2025-03-22 21:46:24.673842 | orchestrator | 2025-03-22 21:46:24.673861 | orchestrator | TASK [osism.services.netbox : Create docker-entrypoint-initdb.d directory] ***** 2025-03-22 21:46:24.673892 | orchestrator | changed: [testbed-manager] 2025-03-22 21:46:25.043594 | orchestrator | 2025-03-22 21:46:25.043717 | orchestrator | TASK [osism.services.netbox : Check if init.sql file exists] ******************* 2025-03-22 21:46:25.043754 | orchestrator | ok: [testbed-manager] 2025-03-22 21:46:25.136393 | orchestrator | 2025-03-22 21:46:25.136492 | orchestrator | TASK [osism.services.netbox : Copy init.sql file] ****************************** 2025-03-22 21:46:25.136510 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:46:25.870983 | orchestrator | 2025-03-22 21:46:25.871099 | orchestrator | TASK [osism.services.netbox : Create init-netbox-database.sh script] *********** 2025-03-22 21:46:25.871135 | orchestrator | changed: [testbed-manager] 2025-03-22 21:46:25.990816 | orchestrator | 2025-03-22 21:46:25.990905 | orchestrator | TASK [osism.services.netbox : Include config tasks] **************************** 2025-03-22 21:46:25.990929 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netbox/tasks/config-netbox.yml for testbed-manager 2025-03-22 21:46:26.848121 | orchestrator | 2025-03-22 21:46:26.848246 | orchestrator | TASK [osism.services.netbox : Create directories required by netbox] *********** 2025-03-22 21:46:26.848282 | orchestrator | changed: [testbed-manager] => (item=/opt/netbox/configuration/initializers) 2025-03-22 21:46:27.575661 | orchestrator | changed: [testbed-manager] => (item=/opt/netbox/configuration/startup-scripts) 2025-03-22 21:46:27.575760 | orchestrator | 2025-03-22 21:46:27.575777 | orchestrator | TASK [osism.services.netbox : Copy netbox environment files] ******************* 2025-03-22 21:46:27.575806 | orchestrator | changed: [testbed-manager] => (item=netbox) 2025-03-22 21:46:28.366177 | orchestrator | 2025-03-22 21:46:28.366282 | orchestrator | TASK [osism.services.netbox : Copy netbox configuration file] ****************** 2025-03-22 21:46:28.366314 | orchestrator | changed: [testbed-manager] 2025-03-22 21:46:28.442673 | orchestrator | 2025-03-22 21:46:28.442724 | orchestrator | TASK [osism.services.netbox : Copy nginx unit configuration file (<= 1.26)] **** 2025-03-22 21:46:28.442753 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:46:29.133805 | orchestrator | 2025-03-22 21:46:29.133896 | orchestrator | TASK 
[osism.services.netbox : Copy nginx unit configuration file (> 1.26)] ***** 2025-03-22 21:46:29.133928 | orchestrator | changed: [testbed-manager] 2025-03-22 21:46:31.075035 | orchestrator | 2025-03-22 21:46:31.075136 | orchestrator | TASK [osism.services.netbox : Copy secret files] ******************************* 2025-03-22 21:46:31.075169 | orchestrator | changed: [testbed-manager] => (item=None) 2025-03-22 21:46:37.716133 | orchestrator | changed: [testbed-manager] => (item=None) 2025-03-22 21:46:37.716251 | orchestrator | changed: [testbed-manager] => (item=None) 2025-03-22 21:46:37.716269 | orchestrator | changed: [testbed-manager] 2025-03-22 21:46:37.716286 | orchestrator | 2025-03-22 21:46:37.716299 | orchestrator | TASK [osism.services.netbox : Deploy initializers for netbox] ****************** 2025-03-22 21:46:37.716329 | orchestrator | changed: [testbed-manager] => (item=custom_fields) 2025-03-22 21:46:38.472002 | orchestrator | changed: [testbed-manager] => (item=device_roles) 2025-03-22 21:46:38.472113 | orchestrator | changed: [testbed-manager] => (item=device_types) 2025-03-22 21:46:38.472129 | orchestrator | changed: [testbed-manager] => (item=groups) 2025-03-22 21:46:38.472144 | orchestrator | changed: [testbed-manager] => (item=manufacturers) 2025-03-22 21:46:38.472157 | orchestrator | changed: [testbed-manager] => (item=object_permissions) 2025-03-22 21:46:38.472171 | orchestrator | changed: [testbed-manager] => (item=prefix_vlan_roles) 2025-03-22 21:46:38.472183 | orchestrator | changed: [testbed-manager] => (item=sites) 2025-03-22 21:46:38.472196 | orchestrator | changed: [testbed-manager] => (item=tags) 2025-03-22 21:46:38.472209 | orchestrator | changed: [testbed-manager] => (item=users) 2025-03-22 21:46:38.472222 | orchestrator | 2025-03-22 21:46:38.472236 | orchestrator | TASK [osism.services.netbox : Deploy startup scripts for netbox] *************** 2025-03-22 21:46:38.472266 | orchestrator | changed: [testbed-manager] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/netbox/files/startup-scripts/270_tags.py) 2025-03-22 21:46:38.661689 | orchestrator | 2025-03-22 21:46:38.661763 | orchestrator | TASK [osism.services.netbox : Include service tasks] *************************** 2025-03-22 21:46:38.661790 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netbox/tasks/service.yml for testbed-manager 2025-03-22 21:46:39.504709 | orchestrator | 2025-03-22 21:46:39.504804 | orchestrator | TASK [osism.services.netbox : Copy netbox systemd unit file] ******************* 2025-03-22 21:46:39.504835 | orchestrator | changed: [testbed-manager] 2025-03-22 21:46:40.235846 | orchestrator | 2025-03-22 21:46:40.235981 | orchestrator | TASK [osism.services.netbox : Create traefik external network] ***************** 2025-03-22 21:46:40.236016 | orchestrator | ok: [testbed-manager] 2025-03-22 21:46:41.110868 | orchestrator | 2025-03-22 21:46:41.110988 | orchestrator | TASK [osism.services.netbox : Copy docker-compose.yml file] ******************** 2025-03-22 21:46:41.111025 | orchestrator | changed: [testbed-manager] 2025-03-22 21:46:45.644374 | orchestrator | 2025-03-22 21:46:45.644534 | orchestrator | TASK [osism.services.netbox : Pull container images] *************************** 2025-03-22 21:46:45.644573 | orchestrator | changed: [testbed-manager] 2025-03-22 21:46:46.674808 | orchestrator | 2025-03-22 21:46:46.674938 | orchestrator | TASK [osism.services.netbox : Stop and disable old service 
docker-compose@netbox] *** 2025-03-22 21:46:46.674987 | orchestrator | ok: [testbed-manager] 2025-03-22 21:47:09.054455 | orchestrator | 2025-03-22 21:47:09.054596 | orchestrator | TASK [osism.services.netbox : Manage netbox service] *************************** 2025-03-22 21:47:09.054631 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage netbox service (10 retries left). 2025-03-22 21:47:09.143433 | orchestrator | ok: [testbed-manager] 2025-03-22 21:47:09.143485 | orchestrator | 2025-03-22 21:47:09.143501 | orchestrator | TASK [osism.services.netbox : Register that netbox service was started] ******** 2025-03-22 21:47:09.143524 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:47:09.201599 | orchestrator | 2025-03-22 21:47:09.201655 | orchestrator | TASK [osism.services.netbox : Flush handlers] ********************************** 2025-03-22 21:47:09.201671 | orchestrator | 2025-03-22 21:47:09.201687 | orchestrator | RUNNING HANDLER [osism.services.traefik : Restart traefik service] ************* 2025-03-22 21:47:09.201710 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:47:09.312645 | orchestrator | 2025-03-22 21:47:09.312696 | orchestrator | RUNNING HANDLER [osism.services.netbox : Restart netbox service] *************** 2025-03-22 21:47:09.312721 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netbox/tasks/restart-service.yml for testbed-manager 2025-03-22 21:47:10.295123 | orchestrator | 2025-03-22 21:47:10.295237 | orchestrator | RUNNING HANDLER [osism.services.netbox : Get infos on postgres container] ****** 2025-03-22 21:47:10.295271 | orchestrator | ok: [testbed-manager] 2025-03-22 21:47:10.398194 | orchestrator | 2025-03-22 21:47:10.398265 | orchestrator | RUNNING HANDLER [osism.services.netbox : Set postgres container version fact] *** 2025-03-22 21:47:10.398294 | orchestrator | ok: [testbed-manager] 2025-03-22 21:47:10.465135 | orchestrator | 2025-03-22 21:47:10.465194 | orchestrator | RUNNING HANDLER [osism.services.netbox : Print major version of postgres container] *** 2025-03-22 21:47:10.465223 | orchestrator | ok: [testbed-manager] => { 2025-03-22 21:47:11.312608 | orchestrator | "msg": "The major version of the running postgres container is 16" 2025-03-22 21:47:11.312737 | orchestrator | } 2025-03-22 21:47:11.312756 | orchestrator | 2025-03-22 21:47:11.312774 | orchestrator | RUNNING HANDLER [osism.services.netbox : Pull postgres image] ****************** 2025-03-22 21:47:11.312806 | orchestrator | ok: [testbed-manager] 2025-03-22 21:47:12.424496 | orchestrator | 2025-03-22 21:47:12.424613 | orchestrator | RUNNING HANDLER [osism.services.netbox : Get infos on postgres image] ********** 2025-03-22 21:47:12.424650 | orchestrator | ok: [testbed-manager] 2025-03-22 21:47:12.527172 | orchestrator | 2025-03-22 21:47:12.527236 | orchestrator | RUNNING HANDLER [osism.services.netbox : Set postgres image version fact] ****** 2025-03-22 21:47:12.527264 | orchestrator | ok: [testbed-manager] 2025-03-22 21:47:12.601093 | orchestrator | 2025-03-22 21:47:12.601131 | orchestrator | RUNNING HANDLER [osism.services.netbox : Print major version of postgres image] *** 2025-03-22 21:47:12.601155 | orchestrator | ok: [testbed-manager] => { 2025-03-22 21:47:12.672654 | orchestrator | "msg": "The major version of the postgres image is 16" 2025-03-22 21:47:12.672687 | orchestrator | } 2025-03-22 21:47:12.672731 | orchestrator | 2025-03-22 21:47:12.672746 | orchestrator | RUNNING HANDLER [osism.services.netbox : Stop 
netbox service] ****************** 2025-03-22 21:47:12.672780 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:47:12.751312 | orchestrator | 2025-03-22 21:47:12.751367 | orchestrator | RUNNING HANDLER [osism.services.netbox : Wait for netbox service to stop] ****** 2025-03-22 21:47:12.751393 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:47:12.838582 | orchestrator | 2025-03-22 21:47:12.838627 | orchestrator | RUNNING HANDLER [osism.services.netbox : Get infos on postgres volume] ********* 2025-03-22 21:47:12.838651 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:47:12.916290 | orchestrator | 2025-03-22 21:47:12.916333 | orchestrator | RUNNING HANDLER [osism.services.netbox : Upgrade postgres database] ************ 2025-03-22 21:47:12.916356 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:47:12.999220 | orchestrator | 2025-03-22 21:47:12.999254 | orchestrator | RUNNING HANDLER [osism.services.netbox : Remove netbox-pgautoupgrade container] *** 2025-03-22 21:47:12.999276 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:47:13.084706 | orchestrator | 2025-03-22 21:47:13.084743 | orchestrator | RUNNING HANDLER [osism.services.netbox : Start netbox service] ***************** 2025-03-22 21:47:13.084765 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:47:14.592154 | orchestrator | 2025-03-22 21:47:14.592275 | orchestrator | RUNNING HANDLER [osism.services.netbox : Restart netbox service] *************** 2025-03-22 21:47:14.592315 | orchestrator | changed: [testbed-manager] 2025-03-22 21:47:14.720368 | orchestrator | 2025-03-22 21:47:14.720446 | orchestrator | RUNNING HANDLER [osism.services.netbox : Register that netbox service was started] *** 2025-03-22 21:47:14.720475 | orchestrator | ok: [testbed-manager] 2025-03-22 21:48:14.793670 | orchestrator | 2025-03-22 21:48:14.793804 | orchestrator | RUNNING HANDLER [osism.services.netbox : Wait for netbox service to start] ***** 2025-03-22 21:48:14.793842 | orchestrator | Pausing for 60 seconds 2025-03-22 21:48:14.898116 | orchestrator | changed: [testbed-manager] 2025-03-22 21:48:14.898208 | orchestrator | 2025-03-22 21:48:14.898228 | orchestrator | RUNNING HANDLER [osism.services.netbox : Wait for an healthy netbox service] *** 2025-03-22 21:48:14.898259 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netbox/tasks/wait-for-healthy-service.yml for testbed-manager 2025-03-22 21:53:32.588275 | orchestrator | 2025-03-22 21:53:32.588411 | orchestrator | RUNNING HANDLER [osism.services.netbox : Check that all containers are in a good state] *** 2025-03-22 21:53:32.588450 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (60 retries left). 2025-03-22 21:53:36.097694 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (59 retries left). 2025-03-22 21:53:36.097808 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (58 retries left). 2025-03-22 21:53:36.097825 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (57 retries left). 2025-03-22 21:53:36.097843 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (56 retries left). 2025-03-22 21:53:36.097858 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (55 retries left). 
2025-03-22 21:53:36.097928 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (54 retries left). 2025-03-22 21:53:36.097944 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (53 retries left). 2025-03-22 21:53:36.097959 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (52 retries left). 2025-03-22 21:53:36.097973 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (51 retries left). 2025-03-22 21:53:36.097987 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (50 retries left). 2025-03-22 21:53:36.098001 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (49 retries left). 2025-03-22 21:53:36.098069 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (48 retries left). 2025-03-22 21:53:36.098086 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (47 retries left). 2025-03-22 21:53:36.098131 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (46 retries left). 2025-03-22 21:53:36.098146 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (45 retries left). 2025-03-22 21:53:36.098160 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (44 retries left). 2025-03-22 21:53:36.098174 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (43 retries left). 2025-03-22 21:53:36.098188 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (42 retries left). 2025-03-22 21:53:36.098214 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (41 retries left). 2025-03-22 21:53:36.098229 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (40 retries left). 2025-03-22 21:53:36.098243 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (39 retries left). 2025-03-22 21:53:36.098259 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (38 retries left). 2025-03-22 21:53:36.098275 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (37 retries left). 2025-03-22 21:53:36.098290 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (36 retries left). 2025-03-22 21:53:36.098305 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (35 retries left). 2025-03-22 21:53:36.098321 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (34 retries left). 2025-03-22 21:53:36.098336 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (33 retries left). 2025-03-22 21:53:36.098352 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (32 retries left). 2025-03-22 21:53:36.098367 | orchestrator | FAILED - RETRYING: [testbed-manager]: Check that all containers are in a good state (31 retries left). 
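Editor's note: the retries above come from the "Check that all containers are in a good state" handler, which keeps polling until no container is unhealthy or stopped (it succeeds just below). Assuming the check is based on Docker's reported container state, which the log does not show verbatim, it is roughly equivalent to:

    # containers whose healthcheck currently reports "unhealthy"; empty output means all healthy
    docker ps --filter health=unhealthy --format '{{.Names}}'
    # containers that have stopped since the service was started
    docker ps --all --filter status=exited --format '{{.Names}}'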
2025-03-22 21:53:36.098383 | orchestrator | changed: [testbed-manager] 2025-03-22 21:53:36.098399 | orchestrator | 2025-03-22 21:53:36.098416 | orchestrator | PLAY [Deploy manager service] ************************************************** 2025-03-22 21:53:36.098432 | orchestrator | 2025-03-22 21:53:36.098448 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-03-22 21:53:36.098479 | orchestrator | ok: [testbed-manager] 2025-03-22 21:53:36.246386 | orchestrator | 2025-03-22 21:53:36.246487 | orchestrator | TASK [Apply manager role] ****************************************************** 2025-03-22 21:53:36.246521 | orchestrator | 2025-03-22 21:53:36.326399 | orchestrator | TASK [osism.services.manager : Include install tasks] ************************** 2025-03-22 21:53:36.326437 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/install-Debian-family.yml for testbed-manager 2025-03-22 21:53:38.454405 | orchestrator | 2025-03-22 21:53:38.454532 | orchestrator | TASK [osism.services.manager : Install required packages] ********************** 2025-03-22 21:53:38.454568 | orchestrator | ok: [testbed-manager] 2025-03-22 21:53:38.520740 | orchestrator | 2025-03-22 21:53:38.520819 | orchestrator | TASK [osism.services.manager : Gather variables for each operating system] ***** 2025-03-22 21:53:38.520904 | orchestrator | ok: [testbed-manager] 2025-03-22 21:53:38.671502 | orchestrator | 2025-03-22 21:53:38.671600 | orchestrator | TASK [osism.services.manager : Include config tasks] *************************** 2025-03-22 21:53:38.671635 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config.yml for testbed-manager 2025-03-22 21:53:41.801058 | orchestrator | 2025-03-22 21:53:41.801168 | orchestrator | TASK [osism.services.manager : Create required directories] ******************** 2025-03-22 21:53:41.801204 | orchestrator | changed: [testbed-manager] => (item=/opt/ansible) 2025-03-22 21:53:42.564977 | orchestrator | changed: [testbed-manager] => (item=/opt/archive) 2025-03-22 21:53:42.565075 | orchestrator | changed: [testbed-manager] => (item=/opt/manager/configuration) 2025-03-22 21:53:42.565093 | orchestrator | changed: [testbed-manager] => (item=/opt/manager/data) 2025-03-22 21:53:42.565137 | orchestrator | ok: [testbed-manager] => (item=/opt/manager) 2025-03-22 21:53:42.565153 | orchestrator | changed: [testbed-manager] => (item=/opt/manager/secrets) 2025-03-22 21:53:42.565168 | orchestrator | changed: [testbed-manager] => (item=/opt/ansible/secrets) 2025-03-22 21:53:42.565182 | orchestrator | changed: [testbed-manager] => (item=/opt/state) 2025-03-22 21:53:42.565196 | orchestrator | 2025-03-22 21:53:42.565211 | orchestrator | TASK [osism.services.manager : Copy client environment file] ******************* 2025-03-22 21:53:42.565240 | orchestrator | changed: [testbed-manager] 2025-03-22 21:53:42.657139 | orchestrator | 2025-03-22 21:53:42.657182 | orchestrator | TASK [osism.services.manager : Include ara config tasks] *********************** 2025-03-22 21:53:42.657207 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-ara.yml for testbed-manager 2025-03-22 21:53:44.178818 | orchestrator | 2025-03-22 21:53:44.178964 | orchestrator | TASK [osism.services.manager : Copy ARA environment files] ********************* 2025-03-22 21:53:44.178998 | orchestrator | 
changed: [testbed-manager] => (item=ara) 2025-03-22 21:53:44.933420 | orchestrator | changed: [testbed-manager] => (item=ara-server) 2025-03-22 21:53:44.933525 | orchestrator | 2025-03-22 21:53:44.933543 | orchestrator | TASK [osism.services.manager : Copy MariaDB environment file] ****************** 2025-03-22 21:53:44.933573 | orchestrator | changed: [testbed-manager] 2025-03-22 21:53:44.990232 | orchestrator | 2025-03-22 21:53:44.990343 | orchestrator | TASK [osism.services.manager : Include vault config tasks] ********************* 2025-03-22 21:53:44.990379 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:53:45.079972 | orchestrator | 2025-03-22 21:53:45.080071 | orchestrator | TASK [osism.services.manager : Include ansible config tasks] ******************* 2025-03-22 21:53:45.080104 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-ansible.yml for testbed-manager 2025-03-22 21:53:46.616794 | orchestrator | 2025-03-22 21:53:46.616942 | orchestrator | TASK [osism.services.manager : Copy private ssh keys] ************************** 2025-03-22 21:53:46.616978 | orchestrator | changed: [testbed-manager] => (item=None) 2025-03-22 21:53:47.328463 | orchestrator | changed: [testbed-manager] => (item=None) 2025-03-22 21:53:47.328559 | orchestrator | changed: [testbed-manager] 2025-03-22 21:53:47.328578 | orchestrator | 2025-03-22 21:53:47.328594 | orchestrator | TASK [osism.services.manager : Copy ansible environment file] ****************** 2025-03-22 21:53:47.328622 | orchestrator | changed: [testbed-manager] 2025-03-22 21:53:47.423576 | orchestrator | 2025-03-22 21:53:47.423618 | orchestrator | TASK [osism.services.manager : Include netbox config tasks] ******************** 2025-03-22 21:53:47.423643 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-netbox.yml for testbed-manager 2025-03-22 21:53:48.207161 | orchestrator | 2025-03-22 21:53:48.207258 | orchestrator | TASK [osism.services.manager : Copy secret files] ****************************** 2025-03-22 21:53:48.207288 | orchestrator | changed: [testbed-manager] => (item=None) 2025-03-22 21:53:48.926558 | orchestrator | changed: [testbed-manager] 2025-03-22 21:53:48.926700 | orchestrator | 2025-03-22 21:53:48.926722 | orchestrator | TASK [osism.services.manager : Copy netbox environment file] ******************* 2025-03-22 21:53:48.926755 | orchestrator | changed: [testbed-manager] 2025-03-22 21:53:49.052739 | orchestrator | 2025-03-22 21:53:49.052838 | orchestrator | TASK [osism.services.manager : Include celery config tasks] ******************** 2025-03-22 21:53:49.052934 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-celery.yml for testbed-manager 2025-03-22 21:53:49.805071 | orchestrator | 2025-03-22 21:53:49.805198 | orchestrator | TASK [osism.services.manager : Set fs.inotify.max_user_watches] **************** 2025-03-22 21:53:49.805235 | orchestrator | changed: [testbed-manager] 2025-03-22 21:53:50.256116 | orchestrator | 2025-03-22 21:53:50.256223 | orchestrator | TASK [osism.services.manager : Set fs.inotify.max_user_instances] ************** 2025-03-22 21:53:50.256269 | orchestrator | changed: [testbed-manager] 2025-03-22 21:53:51.653501 | orchestrator | 2025-03-22 21:53:51.653618 | orchestrator | TASK [osism.services.manager : Copy celery environment files] ****************** 2025-03-22 21:53:51.653656 | 
orchestrator | changed: [testbed-manager] => (item=conductor) 2025-03-22 21:53:52.395597 | orchestrator | changed: [testbed-manager] => (item=openstack) 2025-03-22 21:53:52.395734 | orchestrator | 2025-03-22 21:53:52.395753 | orchestrator | TASK [osism.services.manager : Copy listener environment file] ***************** 2025-03-22 21:53:52.395784 | orchestrator | changed: [testbed-manager] 2025-03-22 21:53:52.767481 | orchestrator | 2025-03-22 21:53:52.767569 | orchestrator | TASK [osism.services.manager : Check for conductor.yml] ************************ 2025-03-22 21:53:52.767599 | orchestrator | ok: [testbed-manager] 2025-03-22 21:53:52.817140 | orchestrator | 2025-03-22 21:53:52.817190 | orchestrator | TASK [osism.services.manager : Copy conductor configuration file] ************** 2025-03-22 21:53:52.817215 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:53:53.584649 | orchestrator | 2025-03-22 21:53:53.584745 | orchestrator | TASK [osism.services.manager : Copy empty conductor configuration file] ******** 2025-03-22 21:53:53.584775 | orchestrator | changed: [testbed-manager] 2025-03-22 21:53:53.671511 | orchestrator | 2025-03-22 21:53:53.671546 | orchestrator | TASK [osism.services.manager : Include wrapper config tasks] ******************* 2025-03-22 21:53:53.671568 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-wrapper.yml for testbed-manager 2025-03-22 21:53:53.731200 | orchestrator | 2025-03-22 21:53:53.731233 | orchestrator | TASK [osism.services.manager : Include wrapper vars file] ********************** 2025-03-22 21:53:53.731254 | orchestrator | ok: [testbed-manager] 2025-03-22 21:53:56.070444 | orchestrator | 2025-03-22 21:53:56.070564 | orchestrator | TASK [osism.services.manager : Copy wrapper scripts] *************************** 2025-03-22 21:53:56.070598 | orchestrator | changed: [testbed-manager] => (item=osism) 2025-03-22 21:53:56.908949 | orchestrator | changed: [testbed-manager] => (item=osism-update-docker) 2025-03-22 21:53:56.909066 | orchestrator | changed: [testbed-manager] => (item=osism-update-manager) 2025-03-22 21:53:56.909083 | orchestrator | 2025-03-22 21:53:56.909099 | orchestrator | TASK [osism.services.manager : Copy cilium wrapper script] ********************* 2025-03-22 21:53:56.909129 | orchestrator | changed: [testbed-manager] 2025-03-22 21:53:56.981311 | orchestrator | 2025-03-22 21:53:56.981371 | orchestrator | TASK [osism.services.manager : Include scripts config tasks] ******************* 2025-03-22 21:53:56.981398 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-scripts.yml for testbed-manager 2025-03-22 21:53:57.034963 | orchestrator | 2025-03-22 21:53:57.035005 | orchestrator | TASK [osism.services.manager : Include scripts vars file] ********************** 2025-03-22 21:53:57.035030 | orchestrator | ok: [testbed-manager] 2025-03-22 21:53:57.804316 | orchestrator | 2025-03-22 21:53:57.804421 | orchestrator | TASK [osism.services.manager : Copy scripts] *********************************** 2025-03-22 21:53:57.804454 | orchestrator | changed: [testbed-manager] => (item=osism-include) 2025-03-22 21:53:57.893788 | orchestrator | 2025-03-22 21:53:57.893826 | orchestrator | TASK [osism.services.manager : Include service tasks] ************************** 2025-03-22 21:53:57.893879 | orchestrator | included: 
/usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/service.yml for testbed-manager 2025-03-22 21:53:58.804157 | orchestrator | 2025-03-22 21:53:58.804271 | orchestrator | TASK [osism.services.manager : Copy manager systemd unit file] ***************** 2025-03-22 21:53:58.804304 | orchestrator | changed: [testbed-manager] 2025-03-22 21:53:59.589036 | orchestrator | 2025-03-22 21:53:59.589147 | orchestrator | TASK [osism.services.manager : Create traefik external network] **************** 2025-03-22 21:53:59.589180 | orchestrator | ok: [testbed-manager] 2025-03-22 21:53:59.636868 | orchestrator | 2025-03-22 21:53:59.636913 | orchestrator | TASK [osism.services.manager : Set mariadb healthcheck for mariadb < 11.0.0] *** 2025-03-22 21:53:59.636941 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:53:59.697508 | orchestrator | 2025-03-22 21:53:59.697550 | orchestrator | TASK [osism.services.manager : Set mariadb healthcheck for mariadb >= 11.0.0] *** 2025-03-22 21:53:59.697571 | orchestrator | ok: [testbed-manager] 2025-03-22 21:54:00.669642 | orchestrator | 2025-03-22 21:54:00.669745 | orchestrator | TASK [osism.services.manager : Copy docker-compose.yml file] ******************* 2025-03-22 21:54:00.669773 | orchestrator | changed: [testbed-manager] 2025-03-22 21:54:24.211988 | orchestrator | 2025-03-22 21:54:24.212124 | orchestrator | TASK [osism.services.manager : Pull container images] ************************** 2025-03-22 21:54:24.212161 | orchestrator | changed: [testbed-manager] 2025-03-22 21:54:24.930748 | orchestrator | 2025-03-22 21:54:24.930891 | orchestrator | TASK [osism.services.manager : Stop and disable old service docker-compose@manager] *** 2025-03-22 21:54:24.930926 | orchestrator | ok: [testbed-manager] 2025-03-22 21:54:27.505714 | orchestrator | 2025-03-22 21:54:27.505886 | orchestrator | TASK [osism.services.manager : Manage manager service] ************************* 2025-03-22 21:54:27.505924 | orchestrator | changed: [testbed-manager] 2025-03-22 21:54:27.588034 | orchestrator | 2025-03-22 21:54:27.588095 | orchestrator | TASK [osism.services.manager : Register that manager service was started] ****** 2025-03-22 21:54:27.588123 | orchestrator | ok: [testbed-manager] 2025-03-22 21:54:27.651867 | orchestrator | 2025-03-22 21:54:27.651902 | orchestrator | TASK [osism.services.manager : Flush handlers] ********************************* 2025-03-22 21:54:27.651917 | orchestrator | 2025-03-22 21:54:27.651932 | orchestrator | RUNNING HANDLER [osism.services.manager : Restart manager service] ************* 2025-03-22 21:54:27.651953 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:55:27.707834 | orchestrator | 2025-03-22 21:55:27.707955 | orchestrator | RUNNING HANDLER [osism.services.manager : Wait for manager service to start] *** 2025-03-22 21:55:27.707991 | orchestrator | Pausing for 60 seconds 2025-03-22 21:55:35.456018 | orchestrator | changed: [testbed-manager] 2025-03-22 21:55:35.456161 | orchestrator | 2025-03-22 21:55:35.456185 | orchestrator | RUNNING HANDLER [osism.services.manager : Ensure that all containers are up] *** 2025-03-22 21:55:35.456218 | orchestrator | changed: [testbed-manager] 2025-03-22 21:56:17.529445 | orchestrator | 2025-03-22 21:56:17.529620 | orchestrator | RUNNING HANDLER [osism.services.manager : Wait for an healthy manager service] *** 2025-03-22 21:56:17.529660 | orchestrator | FAILED - RETRYING: [testbed-manager]: Wait for an healthy manager service (50 retries left). 
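Editor's note: earlier in this play the manager role selected the healthcheck variant for MariaDB >= 11.0.0 ("Set mariadb healthcheck for mariadb >= 11.0.0"). The official mariadb image ships a healthcheck.sh helper for this; as a hedged illustration (the exact command templated by the role is not visible in the log), the running database container can be probed manually with:

    # manager-mariadb-1 is the container name shown in the compose ps output further below
    docker exec manager-mariadb-1 healthcheck.sh --connect --innodb_initialized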
2025-03-22 21:56:24.275769 | orchestrator | FAILED - RETRYING: [testbed-manager]: Wait for an healthy manager service (49 retries left). 2025-03-22 21:56:24.275935 | orchestrator | changed: [testbed-manager] 2025-03-22 21:56:24.275956 | orchestrator | 2025-03-22 21:56:24.275972 | orchestrator | RUNNING HANDLER [osism.services.manager : Copy osismclient bash completion script] *** 2025-03-22 21:56:24.276007 | orchestrator | changed: [testbed-manager] 2025-03-22 21:56:24.375291 | orchestrator | 2025-03-22 21:56:24.375396 | orchestrator | TASK [osism.services.manager : Include initialize tasks] *********************** 2025-03-22 21:56:24.375433 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/initialize.yml for testbed-manager 2025-03-22 21:56:24.432982 | orchestrator | 2025-03-22 21:56:24.433101 | orchestrator | TASK [osism.services.manager : Flush handlers] ********************************* 2025-03-22 21:56:24.433132 | orchestrator | 2025-03-22 21:56:24.433165 | orchestrator | TASK [osism.services.manager : Include vault initialize tasks] ***************** 2025-03-22 21:56:24.433198 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:56:24.545357 | orchestrator | 2025-03-22 21:56:24.545410 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 21:56:24.545428 | orchestrator | testbed-manager : ok=103 changed=55 unreachable=0 failed=0 skipped=18 rescued=0 ignored=0 2025-03-22 21:56:24.545443 | orchestrator | 2025-03-22 21:56:24.545470 | orchestrator | + [[ -e /opt/venv/bin/activate ]] 2025-03-22 21:56:24.550577 | orchestrator | + deactivate 2025-03-22 21:56:24.550610 | orchestrator | + '[' -n /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin ']' 2025-03-22 21:56:24.550627 | orchestrator | + PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2025-03-22 21:56:24.550642 | orchestrator | + export PATH 2025-03-22 21:56:24.550657 | orchestrator | + unset _OLD_VIRTUAL_PATH 2025-03-22 21:56:24.550671 | orchestrator | + '[' -n '' ']' 2025-03-22 21:56:24.550728 | orchestrator | + hash -r 2025-03-22 21:56:24.550744 | orchestrator | + '[' -n '' ']' 2025-03-22 21:56:24.550759 | orchestrator | + unset VIRTUAL_ENV 2025-03-22 21:56:24.550773 | orchestrator | + unset VIRTUAL_ENV_PROMPT 2025-03-22 21:56:24.550788 | orchestrator | + '[' '!' 
'' = nondestructive ']' 2025-03-22 21:56:24.550885 | orchestrator | + unset -f deactivate 2025-03-22 21:56:24.550909 | orchestrator | + cp /home/dragon/.ssh/id_rsa.pub /opt/ansible/secrets/id_rsa.operator.pub 2025-03-22 21:56:24.550932 | orchestrator | + [[ ceph-ansible == \c\e\p\h\-\a\n\s\i\b\l\e ]] 2025-03-22 21:56:24.571949 | orchestrator | + wait_for_container_healthy 60 ceph-ansible 2025-03-22 21:56:24.571986 | orchestrator | + local max_attempts=60 2025-03-22 21:56:24.572041 | orchestrator | + local name=ceph-ansible 2025-03-22 21:56:24.572056 | orchestrator | + local attempt_num=1 2025-03-22 21:56:24.572079 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-03-22 21:56:24.572102 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2025-03-22 21:56:24.572782 | orchestrator | + wait_for_container_healthy 60 kolla-ansible 2025-03-22 21:56:24.572809 | orchestrator | + local max_attempts=60 2025-03-22 21:56:24.572827 | orchestrator | + local name=kolla-ansible 2025-03-22 21:56:24.572844 | orchestrator | + local attempt_num=1 2025-03-22 21:56:24.572867 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' kolla-ansible 2025-03-22 21:56:24.598934 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2025-03-22 21:56:24.598966 | orchestrator | + wait_for_container_healthy 60 osism-ansible 2025-03-22 21:56:24.598987 | orchestrator | + local max_attempts=60 2025-03-22 21:56:24.600102 | orchestrator | + local name=osism-ansible 2025-03-22 21:56:24.600128 | orchestrator | + local attempt_num=1 2025-03-22 21:56:24.600149 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' osism-ansible 2025-03-22 21:56:24.627749 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2025-03-22 21:56:25.769435 | orchestrator | + [[ true == \t\r\u\e ]] 2025-03-22 21:56:25.769564 | orchestrator | + sh -c /opt/configuration/scripts/disable-ara.sh 2025-03-22 21:56:25.769602 | orchestrator | ++ semver 8.1.0 9.0.0 2025-03-22 21:56:25.814231 | orchestrator | + [[ -1 -ge 0 ]] 2025-03-22 21:56:26.090214 | orchestrator | + [[ 8.1.0 == \l\a\t\e\s\t ]] 2025-03-22 21:56:26.090326 | orchestrator | + docker compose --project-directory /opt/manager ps 2025-03-22 21:56:26.090362 | orchestrator | NAME IMAGE COMMAND SERVICE CREATED STATUS PORTS 2025-03-22 21:56:26.096144 | orchestrator | ceph-ansible registry.osism.tech/osism/ceph-ansible:8.1.0 "/entrypoint.sh osis…" ceph-ansible About a minute ago Up About a minute (healthy) 2025-03-22 21:56:26.096176 | orchestrator | kolla-ansible registry.osism.tech/osism/kolla-ansible:8.1.0 "/entrypoint.sh osis…" kolla-ansible About a minute ago Up About a minute (healthy) 2025-03-22 21:56:26.096191 | orchestrator | manager-api-1 registry.osism.tech/osism/osism:0.20241219.2 "/usr/bin/tini -- os…" api About a minute ago Up About a minute (healthy) 192.168.16.5:8000->8000/tcp 2025-03-22 21:56:26.096228 | orchestrator | manager-ara-server-1 registry.osism.tech/osism/ara-server:1.7.2 "sh -c '/wait && /ru…" ara-server About a minute ago Up About a minute (healthy) 8000/tcp 2025-03-22 21:56:26.096244 | orchestrator | manager-beat-1 registry.osism.tech/osism/osism:0.20241219.2 "/usr/bin/tini -- os…" beat About a minute ago Up About a minute (healthy) 2025-03-22 21:56:26.096263 | orchestrator | manager-conductor-1 registry.osism.tech/osism/osism:0.20241219.2 "/usr/bin/tini -- os…" conductor About a minute ago Up About a minute (healthy) 2025-03-22 21:56:26.096277 | orchestrator | manager-flower-1 
registry.osism.tech/osism/osism:0.20241219.2 "/usr/bin/tini -- os…" flower About a minute ago Up About a minute (healthy) 2025-03-22 21:56:26.096291 | orchestrator | manager-inventory_reconciler-1 registry.osism.tech/osism/inventory-reconciler:8.1.0 "/sbin/tini -- /entr…" inventory_reconciler About a minute ago Up 50 seconds (healthy) 2025-03-22 21:56:26.096305 | orchestrator | manager-listener-1 registry.osism.tech/osism/osism:0.20241219.2 "/usr/bin/tini -- os…" listener About a minute ago Up About a minute (healthy) 2025-03-22 21:56:26.096319 | orchestrator | manager-mariadb-1 index.docker.io/library/mariadb:11.6.2 "docker-entrypoint.s…" mariadb About a minute ago Up About a minute (healthy) 3306/tcp 2025-03-22 21:56:26.096333 | orchestrator | manager-netbox-1 registry.osism.tech/osism/osism:0.20241219.2 "/usr/bin/tini -- os…" netbox About a minute ago Up About a minute (healthy) 2025-03-22 21:56:26.096383 | orchestrator | manager-openstack-1 registry.osism.tech/osism/osism:0.20241219.2 "/usr/bin/tini -- os…" openstack About a minute ago Up About a minute (healthy) 2025-03-22 21:56:26.096398 | orchestrator | manager-redis-1 index.docker.io/library/redis:7.4.1-alpine "docker-entrypoint.s…" redis About a minute ago Up About a minute (healthy) 6379/tcp 2025-03-22 21:56:26.096412 | orchestrator | manager-watchdog-1 registry.osism.tech/osism/osism:0.20241219.2 "/usr/bin/tini -- os…" watchdog About a minute ago Up About a minute (healthy) 2025-03-22 21:56:26.096426 | orchestrator | osism-ansible registry.osism.tech/osism/osism-ansible:8.1.0 "/entrypoint.sh osis…" osism-ansible About a minute ago Up About a minute (healthy) 2025-03-22 21:56:26.096440 | orchestrator | osism-kubernetes registry.osism.tech/osism/osism-kubernetes:8.1.0 "/entrypoint.sh osis…" osism-kubernetes About a minute ago Up About a minute (healthy) 2025-03-22 21:56:26.096454 | orchestrator | osismclient registry.osism.tech/osism/osism:0.20241219.2 "/usr/bin/tini -- sl…" osismclient About a minute ago Up About a minute (healthy) 2025-03-22 21:56:26.096476 | orchestrator | + docker compose --project-directory /opt/netbox ps 2025-03-22 21:56:26.275056 | orchestrator | NAME IMAGE COMMAND SERVICE CREATED STATUS PORTS 2025-03-22 21:56:26.281905 | orchestrator | netbox-netbox-1 registry.osism.tech/osism/netbox:v4.1.7 "/usr/bin/tini -- /o…" netbox 9 minutes ago Up 8 minutes (healthy) 2025-03-22 21:56:26.281942 | orchestrator | netbox-netbox-worker-1 registry.osism.tech/osism/netbox:v4.1.7 "/opt/netbox/venv/bi…" netbox-worker 9 minutes ago Up 3 minutes (healthy) 2025-03-22 21:56:26.281958 | orchestrator | netbox-postgres-1 index.docker.io/library/postgres:16.6-alpine "docker-entrypoint.s…" postgres 9 minutes ago Up 9 minutes (healthy) 5432/tcp 2025-03-22 21:56:26.281973 | orchestrator | netbox-redis-1 index.docker.io/library/redis:7.4.2-alpine "docker-entrypoint.s…" redis 9 minutes ago Up 9 minutes (healthy) 6379/tcp 2025-03-22 21:56:26.281994 | orchestrator | ++ semver 8.1.0 7.0.0 2025-03-22 21:56:26.337525 | orchestrator | + [[ 1 -ge 0 ]] 2025-03-22 21:56:26.343258 | orchestrator | + sed -i s/community.general.yaml/osism.commons.still_alive/ /opt/configuration/environments/ansible.cfg 2025-03-22 21:56:26.343296 | orchestrator | + osism apply resolvconf -l testbed-manager 2025-03-22 21:56:28.087012 | orchestrator | 2025-03-22 21:56:28 | INFO  | Task c0573743-1fd7-4d44-9163-dabd57367fe7 (resolvconf) was prepared for execution. 
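Editor's note: the wait_for_container_healthy calls traced a few lines above poll a container's health status via docker inspect. A minimal bash sketch consistent with that trace (the retry/sleep details are assumptions, since the containers were already healthy and the loop body never ran; the real helper lives in the testbed scripts) looks like:

    wait_for_container_healthy() {
        local max_attempts=$1
        local name=$2
        local attempt_num=1
        # keep polling the container's health status until it reports "healthy"
        until [[ "$(docker inspect -f '{{.State.Health.Status}}' "$name")" == "healthy" ]]; do
            if (( attempt_num++ >= max_attempts )); then
                echo "$name did not become healthy in time" >&2
                return 1
            fi
            sleep 5   # assumed interval, not visible in the trace
        done
    }

    wait_for_container_healthy 60 ceph-ansible   # usage as in the trace above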
2025-03-22 21:56:31.691976 | orchestrator | 2025-03-22 21:56:28 | INFO  | It takes a moment until task c0573743-1fd7-4d44-9163-dabd57367fe7 (resolvconf) has been started and output is visible here. 2025-03-22 21:56:31.692137 | orchestrator | 2025-03-22 21:56:31.695191 | orchestrator | PLAY [Apply role resolvconf] *************************************************** 2025-03-22 21:56:31.695227 | orchestrator | 2025-03-22 21:56:31.695474 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-03-22 21:56:31.695501 | orchestrator | Saturday 22 March 2025 21:56:31 +0000 (0:00:00.109) 0:00:00.110 ******** 2025-03-22 21:56:36.428997 | orchestrator | ok: [testbed-manager] 2025-03-22 21:56:36.430226 | orchestrator | 2025-03-22 21:56:36.430281 | orchestrator | TASK [osism.commons.resolvconf : Check minimum and maximum number of name servers] *** 2025-03-22 21:56:36.430510 | orchestrator | Saturday 22 March 2025 21:56:36 +0000 (0:00:04.740) 0:00:04.850 ******** 2025-03-22 21:56:36.493393 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:56:36.494235 | orchestrator | 2025-03-22 21:56:36.494399 | orchestrator | TASK [osism.commons.resolvconf : Include resolvconf tasks] ********************* 2025-03-22 21:56:36.494877 | orchestrator | Saturday 22 March 2025 21:56:36 +0000 (0:00:00.066) 0:00:04.917 ******** 2025-03-22 21:56:36.591404 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-resolv.yml for testbed-manager 2025-03-22 21:56:36.591928 | orchestrator | 2025-03-22 21:56:36.593507 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific installation tasks] *** 2025-03-22 21:56:36.593968 | orchestrator | Saturday 22 March 2025 21:56:36 +0000 (0:00:00.097) 0:00:05.014 ******** 2025-03-22 21:56:36.685787 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/install-Debian-family.yml for testbed-manager 2025-03-22 21:56:36.689812 | orchestrator | 2025-03-22 21:56:36.690476 | orchestrator | TASK [osism.commons.resolvconf : Remove packages configuring /etc/resolv.conf] *** 2025-03-22 21:56:36.690507 | orchestrator | Saturday 22 March 2025 21:56:36 +0000 (0:00:00.090) 0:00:05.105 ******** 2025-03-22 21:56:37.953613 | orchestrator | ok: [testbed-manager] 2025-03-22 21:56:37.953868 | orchestrator | 2025-03-22 21:56:37.954301 | orchestrator | TASK [osism.commons.resolvconf : Install package systemd-resolved] ************* 2025-03-22 21:56:37.954623 | orchestrator | Saturday 22 March 2025 21:56:37 +0000 (0:00:01.270) 0:00:06.376 ******** 2025-03-22 21:56:38.017843 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:56:38.018191 | orchestrator | 2025-03-22 21:56:38.018501 | orchestrator | TASK [osism.commons.resolvconf : Retrieve file status of /etc/resolv.conf] ***** 2025-03-22 21:56:38.019034 | orchestrator | Saturday 22 March 2025 21:56:38 +0000 (0:00:00.065) 0:00:06.441 ******** 2025-03-22 21:56:38.552646 | orchestrator | ok: [testbed-manager] 2025-03-22 21:56:38.553218 | orchestrator | 2025-03-22 21:56:38.553342 | orchestrator | TASK [osism.commons.resolvconf : Archive existing file /etc/resolv.conf] ******* 2025-03-22 21:56:38.619824 | orchestrator | Saturday 22 March 2025 21:56:38 +0000 (0:00:00.533) 0:00:06.975 ******** 2025-03-22 21:56:38.619857 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:56:38.620428 | orchestrator | 2025-03-22 21:56:38.621389 | orchestrator | TASK 
[osism.commons.resolvconf : Link /run/systemd/resolve/stub-resolv.conf to /etc/resolv.conf] *** 2025-03-22 21:56:38.622264 | orchestrator | Saturday 22 March 2025 21:56:38 +0000 (0:00:00.068) 0:00:07.043 ******** 2025-03-22 21:56:39.286427 | orchestrator | changed: [testbed-manager] 2025-03-22 21:56:39.286995 | orchestrator | 2025-03-22 21:56:39.287035 | orchestrator | TASK [osism.commons.resolvconf : Copy configuration files] ********************* 2025-03-22 21:56:39.287800 | orchestrator | Saturday 22 March 2025 21:56:39 +0000 (0:00:00.666) 0:00:07.709 ******** 2025-03-22 21:56:40.494343 | orchestrator | changed: [testbed-manager] 2025-03-22 21:56:40.494584 | orchestrator | 2025-03-22 21:56:40.495164 | orchestrator | TASK [osism.commons.resolvconf : Start/enable systemd-resolved service] ******** 2025-03-22 21:56:40.496941 | orchestrator | Saturday 22 March 2025 21:56:40 +0000 (0:00:01.206) 0:00:08.915 ******** 2025-03-22 21:56:41.558972 | orchestrator | ok: [testbed-manager] 2025-03-22 21:56:41.559931 | orchestrator | 2025-03-22 21:56:41.559979 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific configuration tasks] *** 2025-03-22 21:56:41.560738 | orchestrator | Saturday 22 March 2025 21:56:41 +0000 (0:00:01.062) 0:00:09.978 ******** 2025-03-22 21:56:41.640160 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-Debian-family.yml for testbed-manager 2025-03-22 21:56:41.641159 | orchestrator | 2025-03-22 21:56:41.642475 | orchestrator | TASK [osism.commons.resolvconf : Restart systemd-resolved service] ************* 2025-03-22 21:56:41.643256 | orchestrator | Saturday 22 March 2025 21:56:41 +0000 (0:00:00.085) 0:00:10.063 ******** 2025-03-22 21:56:42.961240 | orchestrator | changed: [testbed-manager] 2025-03-22 21:56:42.963231 | orchestrator | 2025-03-22 21:56:42.963281 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 21:56:42.963844 | orchestrator | 2025-03-22 21:56:42 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-22 21:56:42.963876 | orchestrator | 2025-03-22 21:56:42 | INFO  | Please wait and do not abort execution. 
2025-03-22 21:56:42.965039 | orchestrator | testbed-manager : ok=10  changed=3  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-03-22 21:56:42.965749 | orchestrator | 2025-03-22 21:56:42.966751 | orchestrator | Saturday 22 March 2025 21:56:42 +0000 (0:00:01.318) 0:00:11.382 ******** 2025-03-22 21:56:42.968039 | orchestrator | =============================================================================== 2025-03-22 21:56:42.968492 | orchestrator | Gathering Facts --------------------------------------------------------- 4.74s 2025-03-22 21:56:42.969421 | orchestrator | osism.commons.resolvconf : Restart systemd-resolved service ------------- 1.32s 2025-03-22 21:56:42.970418 | orchestrator | osism.commons.resolvconf : Remove packages configuring /etc/resolv.conf --- 1.27s 2025-03-22 21:56:42.971381 | orchestrator | osism.commons.resolvconf : Copy configuration files --------------------- 1.21s 2025-03-22 21:56:42.971795 | orchestrator | osism.commons.resolvconf : Start/enable systemd-resolved service -------- 1.06s 2025-03-22 21:56:42.972496 | orchestrator | osism.commons.resolvconf : Link /run/systemd/resolve/stub-resolv.conf to /etc/resolv.conf --- 0.67s 2025-03-22 21:56:42.973497 | orchestrator | osism.commons.resolvconf : Retrieve file status of /etc/resolv.conf ----- 0.53s 2025-03-22 21:56:42.973780 | orchestrator | osism.commons.resolvconf : Include resolvconf tasks --------------------- 0.10s 2025-03-22 21:56:42.974687 | orchestrator | osism.commons.resolvconf : Include distribution specific installation tasks --- 0.09s 2025-03-22 21:56:42.975133 | orchestrator | osism.commons.resolvconf : Include distribution specific configuration tasks --- 0.09s 2025-03-22 21:56:42.977439 | orchestrator | osism.commons.resolvconf : Archive existing file /etc/resolv.conf ------- 0.07s 2025-03-22 21:56:42.977602 | orchestrator | osism.commons.resolvconf : Check minimum and maximum number of name servers --- 0.07s 2025-03-22 21:56:42.978810 | orchestrator | osism.commons.resolvconf : Install package systemd-resolved ------------- 0.07s 2025-03-22 21:56:43.497324 | orchestrator | + osism apply sshconfig 2025-03-22 21:56:45.274163 | orchestrator | 2025-03-22 21:56:45 | INFO  | Task 27769a56-2b43-4640-8df8-db05f3e3073c (sshconfig) was prepared for execution. 2025-03-22 21:56:48.995913 | orchestrator | 2025-03-22 21:56:45 | INFO  | It takes a moment until task 27769a56-2b43-4640-8df8-db05f3e3073c (sshconfig) has been started and output is visible here. 
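Before the sshconfig output continues below, a short summary of what the resolvconf run above actually changed on the manager: it linked /etc/resolv.conf to the systemd-resolved stub file, copied configuration, and enabled and restarted systemd-resolved. A rough manual equivalent in bash follows; the resolver address is purely an assumption, since the role's real values are not printed in this log.

#!/usr/bin/env bash
# Manual equivalent of the changed tasks reported by the resolvconf play above.
set -euo pipefail

# "Link /run/systemd/resolve/stub-resolv.conf to /etc/resolv.conf"
ln -sf /run/systemd/resolve/stub-resolv.conf /etc/resolv.conf

# "Copy configuration files" (illustrative content only; DNS value is an assumption)
install -d /etc/systemd/resolved.conf.d
cat > /etc/systemd/resolved.conf.d/testbed.conf <<'EOF'
[Resolve]
DNS=9.9.9.9
EOF

# "Start/enable systemd-resolved service" and "Restart systemd-resolved service"
systemctl enable --now systemd-resolved
systemctl restart systemd-resolved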
2025-03-22 21:56:48.996074 | orchestrator | 2025-03-22 21:56:48.996294 | orchestrator | PLAY [Apply role sshconfig] **************************************************** 2025-03-22 21:56:48.997078 | orchestrator | 2025-03-22 21:56:48.998679 | orchestrator | TASK [osism.commons.sshconfig : Get home directory of operator user] *********** 2025-03-22 21:56:49.696783 | orchestrator | Saturday 22 March 2025 21:56:48 +0000 (0:00:00.135) 0:00:00.135 ******** 2025-03-22 21:56:49.696903 | orchestrator | ok: [testbed-manager] 2025-03-22 21:56:49.697065 | orchestrator | 2025-03-22 21:56:49.697869 | orchestrator | TASK [osism.commons.sshconfig : Ensure .ssh/config.d exist] ******************** 2025-03-22 21:56:49.698348 | orchestrator | Saturday 22 March 2025 21:56:49 +0000 (0:00:00.703) 0:00:00.839 ******** 2025-03-22 21:56:50.245089 | orchestrator | changed: [testbed-manager] 2025-03-22 21:56:50.246066 | orchestrator | 2025-03-22 21:56:50.246109 | orchestrator | TASK [osism.commons.sshconfig : Ensure config for each host exist] ************* 2025-03-22 21:56:50.246252 | orchestrator | Saturday 22 March 2025 21:56:50 +0000 (0:00:00.546) 0:00:01.386 ******** 2025-03-22 21:56:56.685822 | orchestrator | changed: [testbed-manager] => (item=testbed-manager) 2025-03-22 21:56:56.685978 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3) 2025-03-22 21:56:56.686462 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4) 2025-03-22 21:56:56.687680 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5) 2025-03-22 21:56:56.689494 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0) 2025-03-22 21:56:56.690246 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1) 2025-03-22 21:56:56.690369 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2) 2025-03-22 21:56:56.690557 | orchestrator | 2025-03-22 21:56:56.690850 | orchestrator | TASK [osism.commons.sshconfig : Add extra config] ****************************** 2025-03-22 21:56:56.691309 | orchestrator | Saturday 22 March 2025 21:56:56 +0000 (0:00:06.439) 0:00:07.825 ******** 2025-03-22 21:56:56.767107 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:56:56.767552 | orchestrator | 2025-03-22 21:56:56.768975 | orchestrator | TASK [osism.commons.sshconfig : Assemble ssh config] *************************** 2025-03-22 21:56:56.769677 | orchestrator | Saturday 22 March 2025 21:56:56 +0000 (0:00:00.084) 0:00:07.909 ******** 2025-03-22 21:56:57.384174 | orchestrator | changed: [testbed-manager] 2025-03-22 21:56:57.384538 | orchestrator | 2025-03-22 21:56:57.384578 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 21:56:57.385013 | orchestrator | 2025-03-22 21:56:57 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-22 21:56:57.385540 | orchestrator | 2025-03-22 21:56:57 | INFO  | Please wait and do not abort execution. 
2025-03-22 21:56:57.385571 | orchestrator | testbed-manager : ok=4  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-22 21:56:57.386001 | orchestrator | 2025-03-22 21:56:57.387589 | orchestrator | Saturday 22 March 2025 21:56:57 +0000 (0:00:00.615) 0:00:08.525 ******** 2025-03-22 21:56:57.387725 | orchestrator | =============================================================================== 2025-03-22 21:56:57.388060 | orchestrator | osism.commons.sshconfig : Ensure config for each host exist ------------- 6.44s 2025-03-22 21:56:57.389478 | orchestrator | osism.commons.sshconfig : Get home directory of operator user ----------- 0.70s 2025-03-22 21:56:57.390518 | orchestrator | osism.commons.sshconfig : Assemble ssh config --------------------------- 0.62s 2025-03-22 21:56:57.390854 | orchestrator | osism.commons.sshconfig : Ensure .ssh/config.d exist -------------------- 0.55s 2025-03-22 21:56:57.390889 | orchestrator | osism.commons.sshconfig : Add extra config ------------------------------ 0.08s 2025-03-22 21:56:57.893964 | orchestrator | + osism apply known-hosts 2025-03-22 21:56:59.548513 | orchestrator | 2025-03-22 21:56:59 | INFO  | Task e50678c5-7435-4b62-9e5b-1ff541f686b9 (known-hosts) was prepared for execution. 2025-03-22 21:57:03.154778 | orchestrator | 2025-03-22 21:56:59 | INFO  | It takes a moment until task e50678c5-7435-4b62-9e5b-1ff541f686b9 (known-hosts) has been started and output is visible here. 2025-03-22 21:57:03.154881 | orchestrator | 2025-03-22 21:57:03.156097 | orchestrator | PLAY [Apply role known_hosts] ************************************************** 2025-03-22 21:57:03.156141 | orchestrator | 2025-03-22 21:57:03.157920 | orchestrator | TASK [osism.commons.known_hosts : Run ssh-keyscan for all hosts with hostname] *** 2025-03-22 21:57:03.159284 | orchestrator | Saturday 22 March 2025 21:57:03 +0000 (0:00:00.145) 0:00:00.145 ******** 2025-03-22 21:57:09.309632 | orchestrator | ok: [testbed-manager] => (item=testbed-manager) 2025-03-22 21:57:09.309871 | orchestrator | ok: [testbed-manager] => (item=testbed-node-3) 2025-03-22 21:57:09.310196 | orchestrator | ok: [testbed-manager] => (item=testbed-node-4) 2025-03-22 21:57:09.310767 | orchestrator | ok: [testbed-manager] => (item=testbed-node-5) 2025-03-22 21:57:09.310901 | orchestrator | ok: [testbed-manager] => (item=testbed-node-0) 2025-03-22 21:57:09.311732 | orchestrator | ok: [testbed-manager] => (item=testbed-node-1) 2025-03-22 21:57:09.312083 | orchestrator | ok: [testbed-manager] => (item=testbed-node-2) 2025-03-22 21:57:09.313704 | orchestrator | 2025-03-22 21:57:09.313951 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with hostname] *** 2025-03-22 21:57:09.314730 | orchestrator | Saturday 22 March 2025 21:57:09 +0000 (0:00:06.155) 0:00:06.301 ******** 2025-03-22 21:57:09.490064 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-manager) 2025-03-22 21:57:09.491175 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-3) 2025-03-22 21:57:09.491429 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-4) 2025-03-22 
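The sshconfig play above follows a fragment-and-assemble pattern: one snippet per host under ~/.ssh/config.d, then an assembled ~/.ssh/config. The sketch below mirrors that flow for the hosts named in the log; the fragment contents and the operator user name are assumptions, since the role's template is not visible here.

#!/usr/bin/env bash
# Fragment-and-assemble sketch of the sshconfig play above.
set -euo pipefail

OPERATOR_USER="dragon"              # assumption: not visible in this log
CONFIG_D="${HOME}/.ssh/config.d"

install -d -m 0700 "${HOME}/.ssh" "${CONFIG_D}"

# "Ensure config for each host exist": one fragment per host.
for host in testbed-manager testbed-node-{0..5}; do
    cat > "${CONFIG_D}/${host}" <<EOF
Host ${host}
    User ${OPERATOR_USER}
    StrictHostKeyChecking yes
EOF
done

# "Assemble ssh config": concatenate all fragments into the real config file.
cat "${CONFIG_D}"/* > "${HOME}/.ssh/config"
chmod 0600 "${HOME}/.ssh/config"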
21:57:09.492020 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-5) 2025-03-22 21:57:09.492787 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-0) 2025-03-22 21:57:09.493465 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-1) 2025-03-22 21:57:09.494362 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-2) 2025-03-22 21:57:09.495415 | orchestrator | 2025-03-22 21:57:09.495735 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-22 21:57:09.495769 | orchestrator | Saturday 22 March 2025 21:57:09 +0000 (0:00:00.182) 0:00:06.483 ******** 2025-03-22 21:57:10.870916 | orchestrator | changed: [testbed-manager] => (item=testbed-manager ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDFB+7AELg1igpIxa/UwIO7YGtJWrttHpLLvd4oc4nBcSde2r8SUkj6cPI6GE4xoTF3Kh8GF+B19KxNyOItfsWWCinlRDmc1hjc2sRiKSVSpAoUMD7l0UDxpZp+ZdfagUuWZMobzTaKfR6KcqMZRCQWrVLQ2NyBKA0ZRdNNLQKrIuadDNhYzGTorVb40CHryERGtNKc+i/WBk1v4d3R8FalxkDBwtTm1NIp5KBOakPPKd2/6EEi3W3njze1m+Agk9EUga11lmyuuG3FIfMyevhgUVAHr6b68tTuLaXIm33Jvb06YtvRjaoX2LnzPI9tbEwFQEshhLg9xSs0D855LWllcHR9d4Zbz5UAJKQOSrXZOg3upvnA0dTrpSgtYvXoynD5NRzSoH7cej9vrDJ08vyti2yiGkFjQiHkuz7Et/3chKnJryEb9J74JSu5Yy0WkCgx/C+c0fMNTO1bu23nwGZGXtqenJjbJUBEBUI0WZOGamcTGVYBe6i9u/hkByk6OA0=) 2025-03-22 21:57:10.871326 | orchestrator | changed: [testbed-manager] => (item=testbed-manager ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJZbyOWUjp3wBGocvWPWQwOqCzyrvDsTNDWCmaDNTPV6wZBsfgM2dkYmtu8pM4AntmoFjtdLsUhWkObDSZjQcyo=) 2025-03-22 21:57:10.871367 | orchestrator | changed: [testbed-manager] => (item=testbed-manager ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIJoGqkvYXBywD0u/zTWWR5pX2GyRLkJhV5ZvIXBdFCj+) 2025-03-22 21:57:10.871749 | orchestrator | 2025-03-22 21:57:10.872150 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-22 21:57:10.872551 | orchestrator | Saturday 22 March 2025 21:57:10 +0000 (0:00:01.377) 0:00:07.861 ******** 2025-03-22 21:57:12.046073 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMByWFocOSsCVjDmlcmUVq61HBuGmM92QWPjNVbemCNP36kZXhCX8+okdIP19PKCaTe5l2CqhIUarrLx6Ec7Z4w=) 2025-03-22 21:57:12.047118 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIChLSQ2qUz6lrlTmK2S9QG0sulzulxC6XscndnrgAV3J) 2025-03-22 21:57:12.047189 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3 ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQC0NUzqmd47r4N9GlQ1cXexx5LvtPF1s+XpUxR42mKrfxICw1e55VhkW3y8TWNYmZ/afDB2e2Mz8prGMJUKF2Jpa0SfnQvgM6GpZMj659Xnz3sjT716j/UpyCNOwL8h+1BAy9Ja2bJ7kii/5TFV8jrBkn3Y/QmI1K9VLdBLYMIJlrOe3AVfaDxCERO3jXQczT23ZmDD38M+cUH+wEEiBhMgsCFbfh6amue8RsutVu7nAEBl9GTNlkmQ/cYmWLbNy9tB4haDlD1//xeyvWe14iGFbbnxQS/HmEyYRNAhf4liRdXYTW09mnFFB+asiFj5p2tD07nxwXQ02sLffBrtt63+6lqD2GnaqgCZvPB1WwCb+3og0F7GzMbjdH7nc5YTXrKSkI+drE6ZSPLyawA3QhHaXRH9muqBdXxsDNFTdPm0Y8uJt7+z0lDpgLaWkKsdfSmvx+HwRknTA+VN+bahDkcZc8XxlS2VZibO7Ben/ox/NCP9v90SLDHORJrrpuZJvG0=) 2025-03-22 21:57:12.047583 | orchestrator | 2025-03-22 21:57:12.048043 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-22 21:57:12.048529 | orchestrator | Saturday 22 March 2025 21:57:12 +0000 (0:00:01.176) 0:00:09.037 ******** 2025-03-22 21:57:13.235768 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDiyPyRC1U5ljsmdxXefuuu7bQhjVzs1MXWuz5vE49KWFZcksyD5ludtrYnDZ8EG5Wxes2KyyUleFEHyzpuz+LI=) 2025-03-22 21:57:13.237415 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCnQF5YKEXACzMwbykrVsVyOpahsQNWyL8Gg/k+37q8GA2AxbVuRT/UlvJZJEbMpCGQ+QbYRvV3AsMShutwyumkGKpl37HaqptReAW3duSoXWIu5fQ6wMa47RiOCduB8HkbskKPX++2kn9nZ0uIUZFhbyDTYGPNutbvFe9inNt29oiElzaK1i7uGbXghOTkbgPjBfCcnAbu/tElT2DhfkySZtd9BnVcfVNV7PGD/HpMXOL5NdPVeeBQlqx0LTGwHV4lNvX6eh4uBzMxUmIRpKTq/tmHnkpsyYBGmLdKldhvy2fs/AUVCc7GAPTU1fG6A+pE8gJhNj3nxQaMmepsblu+fTg9uGaCezirv4jP5KT2+bk5bUlIobeTRLdpKcBackmkyOUwnVDeJMRW4jrqh+MpgVxESbbsQdUP9Kd3vGWwM+hTwOK9g+YUFjIoMKkXTPvPXINerogV20S+n5jYZxsMcv/LPoA/a2kkAqXNZYctXnb9J0/BE9JKRKSJ6Hnb6cc=) 2025-03-22 21:57:13.237462 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINZcbdY1JcXqPLTD/Cz+LIIFu5XctBQuOfdfJeSLTPyH) 2025-03-22 21:57:13.237842 | orchestrator | 2025-03-22 21:57:13.238203 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-22 21:57:13.239098 | orchestrator | Saturday 22 March 2025 21:57:13 +0000 (0:00:01.190) 0:00:10.228 ******** 2025-03-22 21:57:14.414116 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLvCZetzAdTc3qfMPd483Du/FpP0Mk54bojgXmd2IxsxJUohZUsufVFpaMIpLX7VZWwK8A11apU/u/4jahmz6Ys=) 2025-03-22 21:57:14.414527 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCn4IxYt0JkrXvtWKLh6MO7o/9xhYd0nfjMOdrP6lzzE7EUUmoHTSDWrxOlDyrSgMjP8ohg6VvlIPCNaZG2kTvm1CZhGrX3c90aa+C5N0qBN6aUlecT6kM3yPmIoBrlVdew7/BcaYaQkptbb+Zf0uOZVkRZ49akIzDdKlthifJPTFOCIwrK6mGSgDGfcMWi9zYwuQ2G3J/dbjFYgi1gpjd16MLiMprd2nPBHn0X/RY6KAVI50s02GhxWFAu4Pp9sfR5xDuskg4n2/W9dFMoqgzbBXAstREA1Fxsg8EgSroNb1qCoSFfRniK8TN7QG5F4cH4pw+PMGRSyojrluavwR1ucdd0+kTH3TNlTXUelHuyh8yM7+eCECs+pfzyvJWPMRO+Y3zuaGhHyKnT+/RF90NoqeI+u4SGC3ukKSOsDSX0W9KTPM1HPZJNhWl9DrGJErPMiKT1H0py4avaPSouZIG7IxJdRhP8QeJGiyRT7L3d75EAkiNt1kcNTx6lKMTdDB8=) 2025-03-22 21:57:14.415689 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIO9qCPAkntlvaOQUcK+nSZ25TGg0zOkA8/ZMrlUYL3P0) 2025-03-22 21:57:14.417017 | orchestrator | 2025-03-22 21:57:14.417746 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-22 21:57:14.418578 | orchestrator | Saturday 22 March 2025 21:57:14 +0000 (0:00:01.176) 
0:00:11.405 ******** 2025-03-22 21:57:15.712100 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKfU2I9H9C0FUnjs1ZDVpmLpEDb74Lg+qoTxNMJr7YQ5nQZqb1qa8GpQ2LxIR6WN7PEEg6zFmxG723hk2cbj2K8=) 2025-03-22 21:57:15.712437 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIJG6MCAwUMy9sQYKsR98sEzrThrCDFtDJQYk91pVARXe) 2025-03-22 21:57:15.712801 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDBpRsOJfeqdp3D/db3hLhHhhHcQHHziE2/saj2Ypdi/DWkHpjFSeRrdZ9kcMb986cG9oMIH1PO5WwQIyA5VvkUjAvEn15oWSaBcOzf8Twq1uUcAehhmk7nzngY3Kmw04b27PQm4NBQgtJ9BxQXZpHRo5KEl966FYYLEUCDnofxAkPsq5yrRNcLXS/JwsIJtDPba1rkIp+cMDXpIRvUMIiMVPjc2Ik2mk9fZvpz+jlopZFhP3k/7sBNDQ69Axmylts/62GVeA18WANY0zpGwC0PkCARNU4GiEEmTPSNyKcULfEaNuQrom9quo8hejXBFVRyAXxeTfX1MZ1h9DArdcQiZmadHN6WpxIz48aP+itwOwXhLI8xVqykfpKWwERmgqmk5NU5bd1aP8eKk9r2xHUBWYNVQa27CvoCR5795T8ZmLD0uFHZRE+H8/2Nlh5O7Cbaqr1jgoiEehqeZZUVjMeEx9Dn70K5G9qlmCrHH2SDQfxjZERKx/er3IsELSSRAW0=) 2025-03-22 21:57:15.712838 | orchestrator | 2025-03-22 21:57:16.913877 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-22 21:57:16.914000 | orchestrator | Saturday 22 March 2025 21:57:15 +0000 (0:00:01.299) 0:00:12.704 ******** 2025-03-22 21:57:16.914132 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDlxNPt1EQXMana1x54VeNXL7e1etw06Fvr2Xb1FRmSh3Ba89HmBiY/Cw5jt98/+wjO++1M5Xf9FYzvKApq43GGsTe5zCG6kq1iXYy2t1vjiaZaQumsncMAHdQnpU99J44zGC1+EZqSJaaGjhRXS+FmGxx0HtSGMiwNP1Su37OfdhM3gR4eQOgyyds6j7SUqvhcicqoSXCywLhi9xbqKm2ViqVxVARF+Un59H4JWFCWvRhKbZpHZyopXGEyw2XV+s013x4Ku66R72kogTuF1ShYpBuZcJGRTnPJxPfQUG+xbocQ4YX3/pd6eY7wyHkckkpaB5r8sw+MQHGKid63eCr+rcNZ5YMtXwUjA/w35DEY2bA+kMr+9XaPqek//luLmQoy2qFI2MV42xaEovKA30ue4qfadAlfgr71wtJlK3hq4IHIs6nzh+ERXjiX1+dDLeo9QZWpZGVuZs8mzjh+ke0okwOzATptxzX2PfLnoDgUrbwwa1z/AfZ+VwEoLIlpmoU=) 2025-03-22 21:57:16.914530 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHW0U4f4NthkAH+f/atcUZRD69lOpowl23k62KCSI5SR/DxlwW0ObySSlHyMD5HQ/WB1oXfr3NdQijRD9MCpVSw=) 2025-03-22 21:57:16.914566 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIALUnc/cWLIukNkyKW5BbpLKiiaEpjPQeVs8NvMpFDsg) 2025-03-22 21:57:16.916830 | orchestrator | 2025-03-22 21:57:16.918581 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-22 21:57:16.919381 | orchestrator | Saturday 22 March 2025 21:57:16 +0000 (0:00:01.199) 0:00:13.903 ******** 2025-03-22 21:57:18.095267 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDzgMH19MX86Qqa2pge16ALTft6aM1/a2FU6sTZqctB/Jg+DamJ9kHP51tqLdxPNDBNLEsqef3wPjGDsPHRc0XI=) 2025-03-22 21:57:18.095454 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIof3OezWDRbF1osMZu27QsPpId4mifoVH5PSMiS5Bf+) 2025-03-22 21:57:18.095492 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2 ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCkF77hsWAqGpt1dDaloULK6nz+lbezeWv5tLniWVWECR4zRsTAGXFTx/laKRnEQ35qhFTEuKsTnajNOklcUloZTJXB2xwNTiNOheISg3tFsiiuFqDYetNCp3f1Jci7r3436pOusFKUY3Up0kBUs+cQl+iOBp1NiO+iXZ5KMj+/eFS1AEbqKyxrDgweh96qz7+/tqb3jUEczDXx7qXMmI+TFg19gR3uM6sMBuwMGxob+kmTnhFf6zrPGRfRqn7vOk2M0WcplsBmKRMazpS+k6384/rJTrRNlx1OeIe4XkZcEdb1uftwD9kww4p9DWvXdiPkzkcj/EfZ0CFpmDMCXkLChI44NiyadV9ivHDfW9BBK/cpAxOv4bGrsUQNDWXkSPlDed/aYZzyX6VI2rhn2bO1m0Sv/sRqFt69XFmMX5pdQTc3oEFMDgx+vB9FqSYfiVjVoNN3wVTRWFMEjbYjsqdm9NSivzYav9bZsbJOJ4ptf8slXixxt5IqPzNQsDTv5Ls=) 2025-03-22 21:57:18.095813 | orchestrator | 2025-03-22 21:57:18.096939 | orchestrator | TASK [osism.commons.known_hosts : Run ssh-keyscan for all hosts with ansible_host] *** 2025-03-22 21:57:18.097741 | orchestrator | Saturday 22 March 2025 21:57:18 +0000 (0:00:01.184) 0:00:15.088 ******** 2025-03-22 21:57:23.680701 | orchestrator | ok: [testbed-manager] => (item=testbed-manager) 2025-03-22 21:57:23.680944 | orchestrator | ok: [testbed-manager] => (item=testbed-node-3) 2025-03-22 21:57:23.680973 | orchestrator | ok: [testbed-manager] => (item=testbed-node-4) 2025-03-22 21:57:23.681880 | orchestrator | ok: [testbed-manager] => (item=testbed-node-5) 2025-03-22 21:57:23.682274 | orchestrator | ok: [testbed-manager] => (item=testbed-node-0) 2025-03-22 21:57:23.683007 | orchestrator | ok: [testbed-manager] => (item=testbed-node-1) 2025-03-22 21:57:23.685007 | orchestrator | ok: [testbed-manager] => (item=testbed-node-2) 2025-03-22 21:57:23.685645 | orchestrator | 2025-03-22 21:57:23.686161 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with ansible_host] *** 2025-03-22 21:57:23.686190 | orchestrator | Saturday 22 March 2025 21:57:23 +0000 (0:00:05.584) 0:00:20.672 ******** 2025-03-22 21:57:23.870112 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-manager) 2025-03-22 21:57:23.872023 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-3) 2025-03-22 21:57:23.872054 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-4) 2025-03-22 21:57:23.872069 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-5) 2025-03-22 21:57:23.872134 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-0) 2025-03-22 21:57:23.872156 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-1) 2025-03-22 21:57:23.872882 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-2) 2025-03-22 21:57:23.872909 | orchestrator | 2025-03-22 21:57:23.872930 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-22 
21:57:25.143725 | orchestrator | Saturday 22 March 2025 21:57:23 +0000 (0:00:00.189) 0:00:20.861 ******** 2025-03-22 21:57:25.143864 | orchestrator | changed: [testbed-manager] => (item=192.168.16.5 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDFB+7AELg1igpIxa/UwIO7YGtJWrttHpLLvd4oc4nBcSde2r8SUkj6cPI6GE4xoTF3Kh8GF+B19KxNyOItfsWWCinlRDmc1hjc2sRiKSVSpAoUMD7l0UDxpZp+ZdfagUuWZMobzTaKfR6KcqMZRCQWrVLQ2NyBKA0ZRdNNLQKrIuadDNhYzGTorVb40CHryERGtNKc+i/WBk1v4d3R8FalxkDBwtTm1NIp5KBOakPPKd2/6EEi3W3njze1m+Agk9EUga11lmyuuG3FIfMyevhgUVAHr6b68tTuLaXIm33Jvb06YtvRjaoX2LnzPI9tbEwFQEshhLg9xSs0D855LWllcHR9d4Zbz5UAJKQOSrXZOg3upvnA0dTrpSgtYvXoynD5NRzSoH7cej9vrDJ08vyti2yiGkFjQiHkuz7Et/3chKnJryEb9J74JSu5Yy0WkCgx/C+c0fMNTO1bu23nwGZGXtqenJjbJUBEBUI0WZOGamcTGVYBe6i9u/hkByk6OA0=) 2025-03-22 21:57:25.143956 | orchestrator | changed: [testbed-manager] => (item=192.168.16.5 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJZbyOWUjp3wBGocvWPWQwOqCzyrvDsTNDWCmaDNTPV6wZBsfgM2dkYmtu8pM4AntmoFjtdLsUhWkObDSZjQcyo=) 2025-03-22 21:57:25.144416 | orchestrator | changed: [testbed-manager] => (item=192.168.16.5 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIJoGqkvYXBywD0u/zTWWR5pX2GyRLkJhV5ZvIXBdFCj+) 2025-03-22 21:57:25.144783 | orchestrator | 2025-03-22 21:57:25.145585 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-22 21:57:25.145717 | orchestrator | Saturday 22 March 2025 21:57:25 +0000 (0:00:01.273) 0:00:22.135 ******** 2025-03-22 21:57:26.361366 | orchestrator | changed: [testbed-manager] => (item=192.168.16.13 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMByWFocOSsCVjDmlcmUVq61HBuGmM92QWPjNVbemCNP36kZXhCX8+okdIP19PKCaTe5l2CqhIUarrLx6Ec7Z4w=) 2025-03-22 21:57:26.362014 | orchestrator | changed: [testbed-manager] => (item=192.168.16.13 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC0NUzqmd47r4N9GlQ1cXexx5LvtPF1s+XpUxR42mKrfxICw1e55VhkW3y8TWNYmZ/afDB2e2Mz8prGMJUKF2Jpa0SfnQvgM6GpZMj659Xnz3sjT716j/UpyCNOwL8h+1BAy9Ja2bJ7kii/5TFV8jrBkn3Y/QmI1K9VLdBLYMIJlrOe3AVfaDxCERO3jXQczT23ZmDD38M+cUH+wEEiBhMgsCFbfh6amue8RsutVu7nAEBl9GTNlkmQ/cYmWLbNy9tB4haDlD1//xeyvWe14iGFbbnxQS/HmEyYRNAhf4liRdXYTW09mnFFB+asiFj5p2tD07nxwXQ02sLffBrtt63+6lqD2GnaqgCZvPB1WwCb+3og0F7GzMbjdH7nc5YTXrKSkI+drE6ZSPLyawA3QhHaXRH9muqBdXxsDNFTdPm0Y8uJt7+z0lDpgLaWkKsdfSmvx+HwRknTA+VN+bahDkcZc8XxlS2VZibO7Ben/ox/NCP9v90SLDHORJrrpuZJvG0=) 2025-03-22 21:57:26.362103 | orchestrator | changed: [testbed-manager] => (item=192.168.16.13 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIChLSQ2qUz6lrlTmK2S9QG0sulzulxC6XscndnrgAV3J) 2025-03-22 21:57:26.362137 | orchestrator | 2025-03-22 21:57:26.362268 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-22 21:57:26.362514 | orchestrator | Saturday 22 March 2025 21:57:26 +0000 (0:00:01.217) 0:00:23.353 ******** 2025-03-22 21:57:27.657065 | orchestrator | changed: [testbed-manager] => (item=192.168.16.14 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDiyPyRC1U5ljsmdxXefuuu7bQhjVzs1MXWuz5vE49KWFZcksyD5ludtrYnDZ8EG5Wxes2KyyUleFEHyzpuz+LI=) 2025-03-22 21:57:27.657930 | orchestrator | changed: [testbed-manager] => (item=192.168.16.14 ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCnQF5YKEXACzMwbykrVsVyOpahsQNWyL8Gg/k+37q8GA2AxbVuRT/UlvJZJEbMpCGQ+QbYRvV3AsMShutwyumkGKpl37HaqptReAW3duSoXWIu5fQ6wMa47RiOCduB8HkbskKPX++2kn9nZ0uIUZFhbyDTYGPNutbvFe9inNt29oiElzaK1i7uGbXghOTkbgPjBfCcnAbu/tElT2DhfkySZtd9BnVcfVNV7PGD/HpMXOL5NdPVeeBQlqx0LTGwHV4lNvX6eh4uBzMxUmIRpKTq/tmHnkpsyYBGmLdKldhvy2fs/AUVCc7GAPTU1fG6A+pE8gJhNj3nxQaMmepsblu+fTg9uGaCezirv4jP5KT2+bk5bUlIobeTRLdpKcBackmkyOUwnVDeJMRW4jrqh+MpgVxESbbsQdUP9Kd3vGWwM+hTwOK9g+YUFjIoMKkXTPvPXINerogV20S+n5jYZxsMcv/LPoA/a2kkAqXNZYctXnb9J0/BE9JKRKSJ6Hnb6cc=) 2025-03-22 21:57:27.658134 | orchestrator | changed: [testbed-manager] => (item=192.168.16.14 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINZcbdY1JcXqPLTD/Cz+LIIFu5XctBQuOfdfJeSLTPyH) 2025-03-22 21:57:27.659463 | orchestrator | 2025-03-22 21:57:27.660371 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-22 21:57:27.660756 | orchestrator | Saturday 22 March 2025 21:57:27 +0000 (0:00:01.294) 0:00:24.648 ******** 2025-03-22 21:57:28.776459 | orchestrator | changed: [testbed-manager] => (item=192.168.16.15 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIO9qCPAkntlvaOQUcK+nSZ25TGg0zOkA8/ZMrlUYL3P0) 2025-03-22 21:57:28.778352 | orchestrator | changed: [testbed-manager] => (item=192.168.16.15 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCn4IxYt0JkrXvtWKLh6MO7o/9xhYd0nfjMOdrP6lzzE7EUUmoHTSDWrxOlDyrSgMjP8ohg6VvlIPCNaZG2kTvm1CZhGrX3c90aa+C5N0qBN6aUlecT6kM3yPmIoBrlVdew7/BcaYaQkptbb+Zf0uOZVkRZ49akIzDdKlthifJPTFOCIwrK6mGSgDGfcMWi9zYwuQ2G3J/dbjFYgi1gpjd16MLiMprd2nPBHn0X/RY6KAVI50s02GhxWFAu4Pp9sfR5xDuskg4n2/W9dFMoqgzbBXAstREA1Fxsg8EgSroNb1qCoSFfRniK8TN7QG5F4cH4pw+PMGRSyojrluavwR1ucdd0+kTH3TNlTXUelHuyh8yM7+eCECs+pfzyvJWPMRO+Y3zuaGhHyKnT+/RF90NoqeI+u4SGC3ukKSOsDSX0W9KTPM1HPZJNhWl9DrGJErPMiKT1H0py4avaPSouZIG7IxJdRhP8QeJGiyRT7L3d75EAkiNt1kcNTx6lKMTdDB8=) 2025-03-22 21:57:28.779356 | orchestrator | changed: [testbed-manager] => (item=192.168.16.15 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLvCZetzAdTc3qfMPd483Du/FpP0Mk54bojgXmd2IxsxJUohZUsufVFpaMIpLX7VZWwK8A11apU/u/4jahmz6Ys=) 2025-03-22 21:57:28.779391 | orchestrator | 2025-03-22 21:57:28.780232 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-22 21:57:28.781242 | orchestrator | Saturday 22 March 2025 21:57:28 +0000 (0:00:01.120) 0:00:25.769 ******** 2025-03-22 21:57:29.970260 | orchestrator | changed: [testbed-manager] => (item=192.168.16.10 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIJG6MCAwUMy9sQYKsR98sEzrThrCDFtDJQYk91pVARXe) 2025-03-22 21:57:29.970928 | orchestrator | changed: [testbed-manager] => (item=192.168.16.10 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDBpRsOJfeqdp3D/db3hLhHhhHcQHHziE2/saj2Ypdi/DWkHpjFSeRrdZ9kcMb986cG9oMIH1PO5WwQIyA5VvkUjAvEn15oWSaBcOzf8Twq1uUcAehhmk7nzngY3Kmw04b27PQm4NBQgtJ9BxQXZpHRo5KEl966FYYLEUCDnofxAkPsq5yrRNcLXS/JwsIJtDPba1rkIp+cMDXpIRvUMIiMVPjc2Ik2mk9fZvpz+jlopZFhP3k/7sBNDQ69Axmylts/62GVeA18WANY0zpGwC0PkCARNU4GiEEmTPSNyKcULfEaNuQrom9quo8hejXBFVRyAXxeTfX1MZ1h9DArdcQiZmadHN6WpxIz48aP+itwOwXhLI8xVqykfpKWwERmgqmk5NU5bd1aP8eKk9r2xHUBWYNVQa27CvoCR5795T8ZmLD0uFHZRE+H8/2Nlh5O7Cbaqr1jgoiEehqeZZUVjMeEx9Dn70K5G9qlmCrHH2SDQfxjZERKx/er3IsELSSRAW0=) 2025-03-22 21:57:29.971896 | orchestrator | changed: [testbed-manager] => (item=192.168.16.10 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKfU2I9H9C0FUnjs1ZDVpmLpEDb74Lg+qoTxNMJr7YQ5nQZqb1qa8GpQ2LxIR6WN7PEEg6zFmxG723hk2cbj2K8=) 2025-03-22 21:57:29.972834 | orchestrator | 2025-03-22 21:57:29.974357 | orchestrator | TASK 
[osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-22 21:57:29.975075 | orchestrator | Saturday 22 March 2025 21:57:29 +0000 (0:00:01.193) 0:00:26.962 ******** 2025-03-22 21:57:31.177896 | orchestrator | changed: [testbed-manager] => (item=192.168.16.11 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIALUnc/cWLIukNkyKW5BbpLKiiaEpjPQeVs8NvMpFDsg) 2025-03-22 21:57:31.178119 | orchestrator | changed: [testbed-manager] => (item=192.168.16.11 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDlxNPt1EQXMana1x54VeNXL7e1etw06Fvr2Xb1FRmSh3Ba89HmBiY/Cw5jt98/+wjO++1M5Xf9FYzvKApq43GGsTe5zCG6kq1iXYy2t1vjiaZaQumsncMAHdQnpU99J44zGC1+EZqSJaaGjhRXS+FmGxx0HtSGMiwNP1Su37OfdhM3gR4eQOgyyds6j7SUqvhcicqoSXCywLhi9xbqKm2ViqVxVARF+Un59H4JWFCWvRhKbZpHZyopXGEyw2XV+s013x4Ku66R72kogTuF1ShYpBuZcJGRTnPJxPfQUG+xbocQ4YX3/pd6eY7wyHkckkpaB5r8sw+MQHGKid63eCr+rcNZ5YMtXwUjA/w35DEY2bA+kMr+9XaPqek//luLmQoy2qFI2MV42xaEovKA30ue4qfadAlfgr71wtJlK3hq4IHIs6nzh+ERXjiX1+dDLeo9QZWpZGVuZs8mzjh+ke0okwOzATptxzX2PfLnoDgUrbwwa1z/AfZ+VwEoLIlpmoU=) 2025-03-22 21:57:31.178180 | orchestrator | changed: [testbed-manager] => (item=192.168.16.11 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHW0U4f4NthkAH+f/atcUZRD69lOpowl23k62KCSI5SR/DxlwW0ObySSlHyMD5HQ/WB1oXfr3NdQijRD9MCpVSw=) 2025-03-22 21:57:31.178206 | orchestrator | 2025-03-22 21:57:31.179084 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-03-22 21:57:31.180078 | orchestrator | Saturday 22 March 2025 21:57:31 +0000 (0:00:01.206) 0:00:28.168 ******** 2025-03-22 21:57:32.359111 | orchestrator | changed: [testbed-manager] => (item=192.168.16.12 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIof3OezWDRbF1osMZu27QsPpId4mifoVH5PSMiS5Bf+) 2025-03-22 21:57:32.361195 | orchestrator | changed: [testbed-manager] => (item=192.168.16.12 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCkF77hsWAqGpt1dDaloULK6nz+lbezeWv5tLniWVWECR4zRsTAGXFTx/laKRnEQ35qhFTEuKsTnajNOklcUloZTJXB2xwNTiNOheISg3tFsiiuFqDYetNCp3f1Jci7r3436pOusFKUY3Up0kBUs+cQl+iOBp1NiO+iXZ5KMj+/eFS1AEbqKyxrDgweh96qz7+/tqb3jUEczDXx7qXMmI+TFg19gR3uM6sMBuwMGxob+kmTnhFf6zrPGRfRqn7vOk2M0WcplsBmKRMazpS+k6384/rJTrRNlx1OeIe4XkZcEdb1uftwD9kww4p9DWvXdiPkzkcj/EfZ0CFpmDMCXkLChI44NiyadV9ivHDfW9BBK/cpAxOv4bGrsUQNDWXkSPlDed/aYZzyX6VI2rhn2bO1m0Sv/sRqFt69XFmMX5pdQTc3oEFMDgx+vB9FqSYfiVjVoNN3wVTRWFMEjbYjsqdm9NSivzYav9bZsbJOJ4ptf8slXixxt5IqPzNQsDTv5Ls=) 2025-03-22 21:57:32.361250 | orchestrator | changed: [testbed-manager] => (item=192.168.16.12 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDzgMH19MX86Qqa2pge16ALTft6aM1/a2FU6sTZqctB/Jg+DamJ9kHP51tqLdxPNDBNLEsqef3wPjGDsPHRc0XI=) 2025-03-22 21:57:32.362112 | orchestrator | 2025-03-22 21:57:32.362142 | orchestrator | TASK [osism.commons.known_hosts : Write static known_hosts entries] ************ 2025-03-22 21:57:32.362167 | orchestrator | Saturday 22 March 2025 21:57:32 +0000 (0:00:01.178) 0:00:29.347 ******** 2025-03-22 21:57:32.551058 | orchestrator | skipping: [testbed-manager] => (item=testbed-manager)  2025-03-22 21:57:32.552544 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-3)  2025-03-22 21:57:32.553138 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-4)  2025-03-22 21:57:32.554939 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-5)  2025-03-22 21:57:32.556321 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-0)  2025-03-22 21:57:32.556943 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-1)  2025-03-22 21:57:32.558241 | 
orchestrator | skipping: [testbed-manager] => (item=testbed-node-2)  2025-03-22 21:57:32.558956 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:57:32.559801 | orchestrator | 2025-03-22 21:57:32.560515 | orchestrator | TASK [osism.commons.known_hosts : Write extra known_hosts entries] ************* 2025-03-22 21:57:32.561604 | orchestrator | Saturday 22 March 2025 21:57:32 +0000 (0:00:00.195) 0:00:29.543 ******** 2025-03-22 21:57:32.639344 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:57:32.698728 | orchestrator | 2025-03-22 21:57:32.698771 | orchestrator | TASK [osism.commons.known_hosts : Delete known_hosts entries] ****************** 2025-03-22 21:57:32.698786 | orchestrator | Saturday 22 March 2025 21:57:32 +0000 (0:00:00.085) 0:00:29.628 ******** 2025-03-22 21:57:32.698808 | orchestrator | skipping: [testbed-manager] 2025-03-22 21:57:32.699505 | orchestrator | 2025-03-22 21:57:32.699601 | orchestrator | TASK [osism.commons.known_hosts : Set file permissions] ************************ 2025-03-22 21:57:32.700406 | orchestrator | Saturday 22 March 2025 21:57:32 +0000 (0:00:00.064) 0:00:29.693 ******** 2025-03-22 21:57:33.543097 | orchestrator | changed: [testbed-manager] 2025-03-22 21:57:33.543439 | orchestrator | 2025-03-22 21:57:33.544000 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 21:57:33.544706 | orchestrator | 2025-03-22 21:57:33 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-22 21:57:33.545197 | orchestrator | 2025-03-22 21:57:33 | INFO  | Please wait and do not abort execution. 2025-03-22 21:57:33.546315 | orchestrator | testbed-manager : ok=31  changed=15  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-03-22 21:57:33.547289 | orchestrator | 2025-03-22 21:57:33.547323 | orchestrator | Saturday 22 March 2025 21:57:33 +0000 (0:00:00.843) 0:00:30.536 ******** 2025-03-22 21:57:33.547649 | orchestrator | =============================================================================== 2025-03-22 21:57:33.548432 | orchestrator | osism.commons.known_hosts : Run ssh-keyscan for all hosts with hostname --- 6.16s 2025-03-22 21:57:33.549265 | orchestrator | osism.commons.known_hosts : Run ssh-keyscan for all hosts with ansible_host --- 5.58s 2025-03-22 21:57:33.550143 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.38s 2025-03-22 21:57:33.550610 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.30s 2025-03-22 21:57:33.550937 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.29s 2025-03-22 21:57:33.551643 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.27s 2025-03-22 21:57:33.551888 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.22s 2025-03-22 21:57:33.552215 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.21s 2025-03-22 21:57:33.552872 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.20s 2025-03-22 21:57:33.553592 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.19s 2025-03-22 21:57:33.554391 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.19s 2025-03-22 21:57:33.554966 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries 
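The known_hosts play above boils down to ssh-keyscan runs against every host, once by hostname and once by its 192.168.16.x management address, followed by tightening the file permissions. A condensed shell equivalent is sketched below; the target file path is assumed to be the operator's default known_hosts, which the log does not state.

#!/usr/bin/env bash
# Condensed equivalent of the known_hosts play above: scan by hostname and
# by management IP, append the keys, fix permissions. Addresses match the log.
set -euo pipefail

KNOWN_HOSTS="${HOME}/.ssh/known_hosts"
install -d -m 0700 "${HOME}/.ssh"

for target in testbed-manager testbed-node-{0..5} \
              192.168.16.5 192.168.16.1{0..5}; do
    # Collect RSA, ECDSA and Ed25519 keys, the three types seen in the log.
    ssh-keyscan -t rsa,ecdsa,ed25519 "${target}" >> "${KNOWN_HOSTS}" 2>/dev/null
done

chmod 0600 "${KNOWN_HOSTS}"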
----------- 1.18s 2025-03-22 21:57:33.555543 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.18s 2025-03-22 21:57:33.555938 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.18s 2025-03-22 21:57:33.556381 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.18s 2025-03-22 21:57:33.556835 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.12s 2025-03-22 21:57:33.557531 | orchestrator | osism.commons.known_hosts : Set file permissions ------------------------ 0.84s 2025-03-22 21:57:33.558101 | orchestrator | osism.commons.known_hosts : Write static known_hosts entries ------------ 0.20s 2025-03-22 21:57:33.559482 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with ansible_host --- 0.19s 2025-03-22 21:57:33.559644 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with hostname --- 0.18s 2025-03-22 21:57:34.096041 | orchestrator | + osism apply squid 2025-03-22 21:57:35.906191 | orchestrator | 2025-03-22 21:57:35 | INFO  | Task f77161f3-19a0-4a8d-99f6-6ab0967f26fe (squid) was prepared for execution. 2025-03-22 21:57:39.579189 | orchestrator | 2025-03-22 21:57:35 | INFO  | It takes a moment until task f77161f3-19a0-4a8d-99f6-6ab0967f26fe (squid) has been started and output is visible here. 2025-03-22 21:57:39.579353 | orchestrator | 2025-03-22 21:57:39.579429 | orchestrator | PLAY [Apply role squid] ******************************************************** 2025-03-22 21:57:39.580940 | orchestrator | 2025-03-22 21:57:39.580985 | orchestrator | TASK [osism.services.squid : Include install tasks] **************************** 2025-03-22 21:57:39.581006 | orchestrator | Saturday 22 March 2025 21:57:39 +0000 (0:00:00.133) 0:00:00.133 ******** 2025-03-22 21:57:39.715986 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/squid/tasks/install-Debian-family.yml for testbed-manager 2025-03-22 21:57:39.716271 | orchestrator | 2025-03-22 21:57:39.717661 | orchestrator | TASK [osism.services.squid : Install required packages] ************************ 2025-03-22 21:57:39.719389 | orchestrator | Saturday 22 March 2025 21:57:39 +0000 (0:00:00.138) 0:00:00.272 ******** 2025-03-22 21:57:41.420022 | orchestrator | ok: [testbed-manager] 2025-03-22 21:57:41.420653 | orchestrator | 2025-03-22 21:57:41.420794 | orchestrator | TASK [osism.services.squid : Create required directories] ********************** 2025-03-22 21:57:42.750265 | orchestrator | Saturday 22 March 2025 21:57:41 +0000 (0:00:01.704) 0:00:01.976 ******** 2025-03-22 21:57:42.750331 | orchestrator | changed: [testbed-manager] => (item=/opt/squid/configuration) 2025-03-22 21:57:42.750573 | orchestrator | changed: [testbed-manager] => (item=/opt/squid/configuration/conf.d) 2025-03-22 21:57:42.751190 | orchestrator | ok: [testbed-manager] => (item=/opt/squid) 2025-03-22 21:57:42.752002 | orchestrator | 2025-03-22 21:57:42.752335 | orchestrator | TASK [osism.services.squid : Copy squid configuration files] ******************* 2025-03-22 21:57:42.753951 | orchestrator | Saturday 22 March 2025 21:57:42 +0000 (0:00:01.330) 0:00:03.306 ******** 2025-03-22 21:57:43.960659 | orchestrator | changed: [testbed-manager] => (item=osism.conf) 2025-03-22 21:57:43.961365 | orchestrator | 2025-03-22 21:57:43.961871 | orchestrator | TASK [osism.services.squid : Remove 
osism_allow_list.conf configuration file] *** 2025-03-22 21:57:43.962102 | orchestrator | Saturday 22 March 2025 21:57:43 +0000 (0:00:01.211) 0:00:04.518 ******** 2025-03-22 21:57:44.415833 | orchestrator | ok: [testbed-manager] 2025-03-22 21:57:45.450848 | orchestrator | 2025-03-22 21:57:45.450976 | orchestrator | TASK [osism.services.squid : Copy docker-compose.yml file] ********************* 2025-03-22 21:57:45.450988 | orchestrator | Saturday 22 March 2025 21:57:44 +0000 (0:00:00.454) 0:00:04.973 ******** 2025-03-22 21:57:45.451009 | orchestrator | changed: [testbed-manager] 2025-03-22 21:57:45.451051 | orchestrator | 2025-03-22 21:57:45.451065 | orchestrator | TASK [osism.services.squid : Manage squid service] ***************************** 2025-03-22 21:57:45.452303 | orchestrator | Saturday 22 March 2025 21:57:45 +0000 (0:00:01.035) 0:00:06.008 ******** 2025-03-22 21:58:13.541535 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage squid service (10 retries left). 2025-03-22 21:58:25.869541 | orchestrator | ok: [testbed-manager] 2025-03-22 21:58:25.869695 | orchestrator | 2025-03-22 21:58:25.869712 | orchestrator | RUNNING HANDLER [osism.services.squid : Restart squid service] ***************** 2025-03-22 21:58:25.869726 | orchestrator | Saturday 22 March 2025 21:58:13 +0000 (0:00:28.088) 0:00:34.097 ******** 2025-03-22 21:58:25.869750 | orchestrator | changed: [testbed-manager] 2025-03-22 21:59:25.946494 | orchestrator | 2025-03-22 21:59:25.946653 | orchestrator | RUNNING HANDLER [osism.services.squid : Wait for squid service to start] ******* 2025-03-22 21:59:25.946675 | orchestrator | Saturday 22 March 2025 21:58:25 +0000 (0:00:12.328) 0:00:46.426 ******** 2025-03-22 21:59:25.946707 | orchestrator | Pausing for 60 seconds 2025-03-22 21:59:25.946793 | orchestrator | changed: [testbed-manager] 2025-03-22 21:59:25.946818 | orchestrator | 2025-03-22 21:59:26.009899 | orchestrator | RUNNING HANDLER [osism.services.squid : Register that squid service was restarted] *** 2025-03-22 21:59:26.009985 | orchestrator | Saturday 22 March 2025 21:59:25 +0000 (0:01:00.073) 0:01:46.499 ******** 2025-03-22 21:59:26.010064 | orchestrator | ok: [testbed-manager] 2025-03-22 21:59:26.010431 | orchestrator | 2025-03-22 21:59:26.011179 | orchestrator | RUNNING HANDLER [osism.services.squid : Wait for an healthy squid service] ***** 2025-03-22 21:59:26.011922 | orchestrator | Saturday 22 March 2025 21:59:26 +0000 (0:00:00.069) 0:01:46.569 ******** 2025-03-22 21:59:26.707111 | orchestrator | changed: [testbed-manager] 2025-03-22 21:59:26.707463 | orchestrator | 2025-03-22 21:59:26.707511 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 21:59:26.707667 | orchestrator | 2025-03-22 21:59:26 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-22 21:59:26.707773 | orchestrator | 2025-03-22 21:59:26 | INFO  | Please wait and do not abort execution. 
2025-03-22 21:59:26.707799 | orchestrator | testbed-manager : ok=11  changed=6  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 21:59:26.708767 | orchestrator | 2025-03-22 21:59:26.709260 | orchestrator | Saturday 22 March 2025 21:59:26 +0000 (0:00:00.697) 0:01:47.266 ******** 2025-03-22 21:59:26.709328 | orchestrator | =============================================================================== 2025-03-22 21:59:26.710409 | orchestrator | osism.services.squid : Wait for squid service to start ----------------- 60.07s 2025-03-22 21:59:26.710633 | orchestrator | osism.services.squid : Manage squid service ---------------------------- 28.09s 2025-03-22 21:59:26.711122 | orchestrator | osism.services.squid : Restart squid service --------------------------- 12.33s 2025-03-22 21:59:26.711182 | orchestrator | osism.services.squid : Install required packages ------------------------ 1.70s 2025-03-22 21:59:26.711402 | orchestrator | osism.services.squid : Create required directories ---------------------- 1.33s 2025-03-22 21:59:26.711450 | orchestrator | osism.services.squid : Copy squid configuration files ------------------- 1.21s 2025-03-22 21:59:26.711633 | orchestrator | osism.services.squid : Copy docker-compose.yml file --------------------- 1.04s 2025-03-22 21:59:26.711970 | orchestrator | osism.services.squid : Wait for an healthy squid service ---------------- 0.70s 2025-03-22 21:59:26.712488 | orchestrator | osism.services.squid : Remove osism_allow_list.conf configuration file --- 0.45s 2025-03-22 21:59:26.712813 | orchestrator | osism.services.squid : Include install tasks ---------------------------- 0.14s 2025-03-22 21:59:26.712921 | orchestrator | osism.services.squid : Register that squid service was restarted -------- 0.07s 2025-03-22 21:59:27.335724 | orchestrator | + [[ 8.1.0 != \l\a\t\e\s\t ]] 2025-03-22 21:59:27.340828 | orchestrator | + sed -i 's#docker_namespace: kolla#docker_namespace: kolla/release#' /opt/configuration/inventory/group_vars/all/kolla.yml 2025-03-22 21:59:27.340873 | orchestrator | ++ semver 8.1.0 9.0.0 2025-03-22 21:59:27.380787 | orchestrator | + [[ -1 -lt 0 ]] 2025-03-22 21:59:27.385016 | orchestrator | + [[ 8.1.0 != \l\a\t\e\s\t ]] 2025-03-22 21:59:27.385100 | orchestrator | + sed -i 's|^# \(network_dispatcher_scripts:\)$|\1|g' /opt/configuration/inventory/group_vars/testbed-nodes.yml 2025-03-22 21:59:27.385135 | orchestrator | + sed -i 's|^# \( - src: /opt/configuration/network/vxlan.sh\)$|\1|g' /opt/configuration/inventory/group_vars/testbed-nodes.yml /opt/configuration/inventory/group_vars/testbed-managers.yml 2025-03-22 21:59:27.388623 | orchestrator | + sed -i 's|^# \( dest: routable.d/vxlan.sh\)$|\1|g' /opt/configuration/inventory/group_vars/testbed-nodes.yml /opt/configuration/inventory/group_vars/testbed-managers.yml 2025-03-22 21:59:27.392738 | orchestrator | + osism apply operator -u ubuntu -l testbed-nodes 2025-03-22 21:59:29.476254 | orchestrator | 2025-03-22 21:59:29 | INFO  | Task d7fe4bfa-9a0a-415f-a396-51a9c8894872 (operator) was prepared for execution. 2025-03-22 21:59:33.567887 | orchestrator | 2025-03-22 21:59:29 | INFO  | It takes a moment until task d7fe4bfa-9a0a-415f-a396-51a9c8894872 (operator) has been started and output is visible here. 
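Before the operator play output below, two details of the squid run above are worth calling out: the "Manage squid service" task retried once (likely while images were still being pulled), and the handlers then paused 60 seconds and waited until Docker reported the service healthy. A bash sketch of that wait-for-healthy pattern follows; /opt/squid matches the directories created earlier in the play, while the container name and the polling interval are assumptions.

#!/usr/bin/env bash
# Wait-for-healthy sketch, modeled on the squid handlers above.
set -euo pipefail

# Bring the compose project up (the role copies docker-compose.yml to /opt/squid).
docker compose --project-directory /opt/squid up -d

# Poll the container health status; "squid" as the container name is an assumption.
for _ in $(seq 1 60); do
    state="$(docker inspect --format '{{.State.Health.Status}}' squid 2>/dev/null || true)"
    if [[ "${state}" == "healthy" ]]; then
        exit 0
    fi
    sleep 5
done

echo "squid did not become healthy in time" >&2
exit 1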
2025-03-22 21:59:33.567985 | orchestrator | 2025-03-22 21:59:33.568751 | orchestrator | PLAY [Make ssh pipelining working] ********************************************* 2025-03-22 21:59:33.568784 | orchestrator | 2025-03-22 21:59:33.570394 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-03-22 21:59:33.571128 | orchestrator | Saturday 22 March 2025 21:59:33 +0000 (0:00:00.114) 0:00:00.114 ******** 2025-03-22 21:59:37.234443 | orchestrator | ok: [testbed-node-0] 2025-03-22 21:59:37.235161 | orchestrator | ok: [testbed-node-1] 2025-03-22 21:59:37.236751 | orchestrator | ok: [testbed-node-5] 2025-03-22 21:59:37.237765 | orchestrator | ok: [testbed-node-3] 2025-03-22 21:59:37.238692 | orchestrator | ok: [testbed-node-2] 2025-03-22 21:59:37.239279 | orchestrator | ok: [testbed-node-4] 2025-03-22 21:59:37.242299 | orchestrator | 2025-03-22 21:59:37.243378 | orchestrator | TASK [Do not require tty for all users] **************************************** 2025-03-22 21:59:37.243813 | orchestrator | Saturday 22 March 2025 21:59:37 +0000 (0:00:03.668) 0:00:03.783 ******** 2025-03-22 21:59:38.140500 | orchestrator | ok: [testbed-node-3] 2025-03-22 21:59:38.145851 | orchestrator | ok: [testbed-node-4] 2025-03-22 21:59:38.145891 | orchestrator | ok: [testbed-node-5] 2025-03-22 21:59:38.146219 | orchestrator | ok: [testbed-node-1] 2025-03-22 21:59:38.147210 | orchestrator | ok: [testbed-node-2] 2025-03-22 21:59:38.149278 | orchestrator | ok: [testbed-node-0] 2025-03-22 21:59:38.149307 | orchestrator | 2025-03-22 21:59:38.149821 | orchestrator | PLAY [Apply role operator] ***************************************************** 2025-03-22 21:59:38.151409 | orchestrator | 2025-03-22 21:59:38.152673 | orchestrator | TASK [osism.commons.operator : Gather variables for each operating system] ***** 2025-03-22 21:59:38.153369 | orchestrator | Saturday 22 March 2025 21:59:38 +0000 (0:00:00.900) 0:00:04.684 ******** 2025-03-22 21:59:38.239841 | orchestrator | ok: [testbed-node-0] 2025-03-22 21:59:38.270180 | orchestrator | ok: [testbed-node-1] 2025-03-22 21:59:38.304807 | orchestrator | ok: [testbed-node-2] 2025-03-22 21:59:38.356986 | orchestrator | ok: [testbed-node-3] 2025-03-22 21:59:38.360422 | orchestrator | ok: [testbed-node-4] 2025-03-22 21:59:38.361091 | orchestrator | ok: [testbed-node-5] 2025-03-22 21:59:38.361123 | orchestrator | 2025-03-22 21:59:38.361847 | orchestrator | TASK [osism.commons.operator : Set operator_groups variable to default value] *** 2025-03-22 21:59:38.364689 | orchestrator | Saturday 22 March 2025 21:59:38 +0000 (0:00:00.222) 0:00:04.906 ******** 2025-03-22 21:59:38.436549 | orchestrator | ok: [testbed-node-0] 2025-03-22 21:59:38.501743 | orchestrator | ok: [testbed-node-2] 2025-03-22 21:59:38.568024 | orchestrator | ok: [testbed-node-1] 2025-03-22 21:59:38.568763 | orchestrator | ok: [testbed-node-3] 2025-03-22 21:59:38.569176 | orchestrator | ok: [testbed-node-4] 2025-03-22 21:59:38.570630 | orchestrator | ok: [testbed-node-5] 2025-03-22 21:59:38.570736 | orchestrator | 2025-03-22 21:59:38.573599 | orchestrator | TASK [osism.commons.operator : Create operator group] ************************** 2025-03-22 21:59:38.574469 | orchestrator | Saturday 22 March 2025 21:59:38 +0000 (0:00:00.210) 0:00:05.117 ******** 2025-03-22 21:59:39.353586 | orchestrator | changed: [testbed-node-1] 2025-03-22 21:59:39.354195 | orchestrator | changed: [testbed-node-4] 2025-03-22 21:59:39.354788 | orchestrator | changed: [testbed-node-2] 2025-03-22 
21:59:39.356586 | orchestrator | changed: [testbed-node-3] 2025-03-22 21:59:39.357108 | orchestrator | changed: [testbed-node-0] 2025-03-22 21:59:39.357722 | orchestrator | changed: [testbed-node-5] 2025-03-22 21:59:39.358134 | orchestrator | 2025-03-22 21:59:39.358871 | orchestrator | TASK [osism.commons.operator : Create user] ************************************ 2025-03-22 21:59:39.359262 | orchestrator | Saturday 22 March 2025 21:59:39 +0000 (0:00:00.784) 0:00:05.901 ******** 2025-03-22 21:59:40.277802 | orchestrator | changed: [testbed-node-5] 2025-03-22 21:59:40.278209 | orchestrator | changed: [testbed-node-0] 2025-03-22 21:59:40.278952 | orchestrator | changed: [testbed-node-2] 2025-03-22 21:59:40.282683 | orchestrator | changed: [testbed-node-1] 2025-03-22 21:59:41.571163 | orchestrator | changed: [testbed-node-3] 2025-03-22 21:59:41.571284 | orchestrator | changed: [testbed-node-4] 2025-03-22 21:59:41.571304 | orchestrator | 2025-03-22 21:59:41.571321 | orchestrator | TASK [osism.commons.operator : Add user to additional groups] ****************** 2025-03-22 21:59:41.571337 | orchestrator | Saturday 22 March 2025 21:59:40 +0000 (0:00:00.921) 0:00:06.823 ******** 2025-03-22 21:59:41.571368 | orchestrator | changed: [testbed-node-0] => (item=adm) 2025-03-22 21:59:41.572849 | orchestrator | changed: [testbed-node-1] => (item=adm) 2025-03-22 21:59:41.573175 | orchestrator | changed: [testbed-node-2] => (item=adm) 2025-03-22 21:59:41.573891 | orchestrator | changed: [testbed-node-3] => (item=adm) 2025-03-22 21:59:41.574098 | orchestrator | changed: [testbed-node-4] => (item=adm) 2025-03-22 21:59:41.574438 | orchestrator | changed: [testbed-node-5] => (item=adm) 2025-03-22 21:59:41.574786 | orchestrator | changed: [testbed-node-0] => (item=sudo) 2025-03-22 21:59:41.575255 | orchestrator | changed: [testbed-node-1] => (item=sudo) 2025-03-22 21:59:41.575743 | orchestrator | changed: [testbed-node-4] => (item=sudo) 2025-03-22 21:59:41.578638 | orchestrator | changed: [testbed-node-3] => (item=sudo) 2025-03-22 21:59:41.579297 | orchestrator | changed: [testbed-node-5] => (item=sudo) 2025-03-22 21:59:41.579771 | orchestrator | changed: [testbed-node-2] => (item=sudo) 2025-03-22 21:59:41.580332 | orchestrator | 2025-03-22 21:59:41.580846 | orchestrator | TASK [osism.commons.operator : Copy user sudoers file] ************************* 2025-03-22 21:59:41.581429 | orchestrator | Saturday 22 March 2025 21:59:41 +0000 (0:00:01.292) 0:00:08.116 ******** 2025-03-22 21:59:43.114606 | orchestrator | changed: [testbed-node-3] 2025-03-22 21:59:43.115023 | orchestrator | changed: [testbed-node-5] 2025-03-22 21:59:43.115419 | orchestrator | changed: [testbed-node-1] 2025-03-22 21:59:43.116124 | orchestrator | changed: [testbed-node-0] 2025-03-22 21:59:43.116842 | orchestrator | changed: [testbed-node-4] 2025-03-22 21:59:43.117502 | orchestrator | changed: [testbed-node-2] 2025-03-22 21:59:43.117657 | orchestrator | 2025-03-22 21:59:43.118555 | orchestrator | TASK [osism.commons.operator : Set language variables in .bashrc configuration file] *** 2025-03-22 21:59:44.467334 | orchestrator | Saturday 22 March 2025 21:59:43 +0000 (0:00:01.547) 0:00:09.663 ******** 2025-03-22 21:59:44.467470 | orchestrator | [WARNING]: Module remote_tmp /root/.ansible/tmp did not exist and was created 2025-03-22 21:59:44.467900 | orchestrator | with a mode of 0700, this may cause issues when running as another user. 
To 2025-03-22 21:59:44.467935 | orchestrator | avoid this, create the remote_tmp dir with the correct permissions manually 2025-03-22 21:59:44.564071 | orchestrator | changed: [testbed-node-3] => (item=export LANGUAGE=C.UTF-8) 2025-03-22 21:59:44.564579 | orchestrator | changed: [testbed-node-0] => (item=export LANGUAGE=C.UTF-8) 2025-03-22 21:59:44.564679 | orchestrator | changed: [testbed-node-2] => (item=export LANGUAGE=C.UTF-8) 2025-03-22 21:59:44.565180 | orchestrator | changed: [testbed-node-1] => (item=export LANGUAGE=C.UTF-8) 2025-03-22 21:59:44.565687 | orchestrator | changed: [testbed-node-4] => (item=export LANGUAGE=C.UTF-8) 2025-03-22 21:59:44.566138 | orchestrator | changed: [testbed-node-5] => (item=export LANGUAGE=C.UTF-8) 2025-03-22 21:59:44.566628 | orchestrator | changed: [testbed-node-0] => (item=export LANG=C.UTF-8) 2025-03-22 21:59:44.566701 | orchestrator | changed: [testbed-node-4] => (item=export LANG=C.UTF-8) 2025-03-22 21:59:44.567194 | orchestrator | changed: [testbed-node-3] => (item=export LANG=C.UTF-8) 2025-03-22 21:59:44.569542 | orchestrator | changed: [testbed-node-2] => (item=export LANG=C.UTF-8) 2025-03-22 21:59:44.569789 | orchestrator | changed: [testbed-node-5] => (item=export LANG=C.UTF-8) 2025-03-22 21:59:44.569814 | orchestrator | changed: [testbed-node-1] => (item=export LANG=C.UTF-8) 2025-03-22 21:59:44.569829 | orchestrator | changed: [testbed-node-4] => (item=export LC_ALL=C.UTF-8) 2025-03-22 21:59:44.569844 | orchestrator | changed: [testbed-node-0] => (item=export LC_ALL=C.UTF-8) 2025-03-22 21:59:44.569863 | orchestrator | changed: [testbed-node-3] => (item=export LC_ALL=C.UTF-8) 2025-03-22 21:59:44.570256 | orchestrator | changed: [testbed-node-2] => (item=export LC_ALL=C.UTF-8) 2025-03-22 21:59:44.570788 | orchestrator | changed: [testbed-node-1] => (item=export LC_ALL=C.UTF-8) 2025-03-22 21:59:44.571922 | orchestrator | changed: [testbed-node-5] => (item=export LC_ALL=C.UTF-8) 2025-03-22 21:59:44.572773 | orchestrator | 2025-03-22 21:59:44.572804 | orchestrator | TASK [osism.commons.operator : Create .ssh directory] ************************** 2025-03-22 21:59:44.573693 | orchestrator | Saturday 22 March 2025 21:59:44 +0000 (0:00:01.449) 0:00:11.113 ******** 2025-03-22 21:59:45.212002 | orchestrator | changed: [testbed-node-4] 2025-03-22 21:59:45.213437 | orchestrator | changed: [testbed-node-3] 2025-03-22 21:59:45.213476 | orchestrator | changed: [testbed-node-2] 2025-03-22 21:59:45.213491 | orchestrator | changed: [testbed-node-5] 2025-03-22 21:59:45.213537 | orchestrator | changed: [testbed-node-0] 2025-03-22 21:59:45.213555 | orchestrator | changed: [testbed-node-1] 2025-03-22 21:59:45.213578 | orchestrator | 2025-03-22 21:59:45.283766 | orchestrator | TASK [osism.commons.operator : Check number of SSH authorized keys] ************ 2025-03-22 21:59:45.283806 | orchestrator | Saturday 22 March 2025 21:59:45 +0000 (0:00:00.643) 0:00:11.757 ******** 2025-03-22 21:59:45.283829 | orchestrator | skipping: [testbed-node-0] 2025-03-22 21:59:45.332824 | orchestrator | skipping: [testbed-node-1] 2025-03-22 21:59:45.362135 | orchestrator | skipping: [testbed-node-2] 2025-03-22 21:59:45.419806 | orchestrator | skipping: [testbed-node-3] 2025-03-22 21:59:45.420750 | orchestrator | skipping: [testbed-node-4] 2025-03-22 21:59:45.420788 | orchestrator | skipping: [testbed-node-5] 2025-03-22 21:59:45.420810 | orchestrator | 2025-03-22 21:59:45.421376 | orchestrator | TASK [osism.commons.operator : Set ssh authorized keys] ************************ 
2025-03-22 21:59:45.421676 | orchestrator | Saturday 22 March 2025 21:59:45 +0000 (0:00:00.211) 0:00:11.968 ******** 2025-03-22 21:59:46.255690 | orchestrator | changed: [testbed-node-5] => (item=None) 2025-03-22 21:59:46.256166 | orchestrator | changed: [testbed-node-5] 2025-03-22 21:59:46.256197 | orchestrator | changed: [testbed-node-3] => (item=None) 2025-03-22 21:59:46.256853 | orchestrator | changed: [testbed-node-3] 2025-03-22 21:59:46.257616 | orchestrator | changed: [testbed-node-0] => (item=None) 2025-03-22 21:59:46.259051 | orchestrator | changed: [testbed-node-0] 2025-03-22 21:59:46.260701 | orchestrator | changed: [testbed-node-2] => (item=None) 2025-03-22 21:59:46.261287 | orchestrator | changed: [testbed-node-2] 2025-03-22 21:59:46.261648 | orchestrator | changed: [testbed-node-1] => (item=None) 2025-03-22 21:59:46.262363 | orchestrator | changed: [testbed-node-4] => (item=None) 2025-03-22 21:59:46.262623 | orchestrator | changed: [testbed-node-1] 2025-03-22 21:59:46.264243 | orchestrator | changed: [testbed-node-4] 2025-03-22 21:59:46.312365 | orchestrator | 2025-03-22 21:59:46.312390 | orchestrator | TASK [osism.commons.operator : Delete ssh authorized keys] ********************* 2025-03-22 21:59:46.312401 | orchestrator | Saturday 22 March 2025 21:59:46 +0000 (0:00:00.835) 0:00:12.804 ******** 2025-03-22 21:59:46.312416 | orchestrator | skipping: [testbed-node-0] 2025-03-22 21:59:46.336378 | orchestrator | skipping: [testbed-node-1] 2025-03-22 21:59:46.357837 | orchestrator | skipping: [testbed-node-2] 2025-03-22 21:59:46.386177 | orchestrator | skipping: [testbed-node-3] 2025-03-22 21:59:46.421831 | orchestrator | skipping: [testbed-node-4] 2025-03-22 21:59:46.422492 | orchestrator | skipping: [testbed-node-5] 2025-03-22 21:59:46.422944 | orchestrator | 2025-03-22 21:59:46.423706 | orchestrator | TASK [osism.commons.operator : Set authorized GitHub accounts] ***************** 2025-03-22 21:59:46.423848 | orchestrator | Saturday 22 March 2025 21:59:46 +0000 (0:00:00.167) 0:00:12.971 ******** 2025-03-22 21:59:46.480080 | orchestrator | skipping: [testbed-node-0] 2025-03-22 21:59:46.507659 | orchestrator | skipping: [testbed-node-1] 2025-03-22 21:59:46.538838 | orchestrator | skipping: [testbed-node-2] 2025-03-22 21:59:46.579788 | orchestrator | skipping: [testbed-node-3] 2025-03-22 21:59:46.620217 | orchestrator | skipping: [testbed-node-4] 2025-03-22 21:59:46.621160 | orchestrator | skipping: [testbed-node-5] 2025-03-22 21:59:46.621885 | orchestrator | 2025-03-22 21:59:46.623635 | orchestrator | TASK [osism.commons.operator : Delete authorized GitHub accounts] ************** 2025-03-22 21:59:46.679376 | orchestrator | Saturday 22 March 2025 21:59:46 +0000 (0:00:00.198) 0:00:13.169 ******** 2025-03-22 21:59:46.679499 | orchestrator | skipping: [testbed-node-0] 2025-03-22 21:59:46.705161 | orchestrator | skipping: [testbed-node-1] 2025-03-22 21:59:46.727027 | orchestrator | skipping: [testbed-node-2] 2025-03-22 21:59:46.757727 | orchestrator | skipping: [testbed-node-3] 2025-03-22 21:59:46.793329 | orchestrator | skipping: [testbed-node-4] 2025-03-22 21:59:46.793588 | orchestrator | skipping: [testbed-node-5] 2025-03-22 21:59:46.794138 | orchestrator | 2025-03-22 21:59:46.794743 | orchestrator | TASK [osism.commons.operator : Set password] *********************************** 2025-03-22 21:59:46.795207 | orchestrator | Saturday 22 March 2025 21:59:46 +0000 (0:00:00.173) 0:00:13.343 ******** 2025-03-22 21:59:47.538169 | orchestrator | changed: [testbed-node-0] 2025-03-22 
21:59:47.538602 | orchestrator | changed: [testbed-node-1] 2025-03-22 21:59:47.538643 | orchestrator | changed: [testbed-node-3] 2025-03-22 21:59:47.538669 | orchestrator | changed: [testbed-node-2] 2025-03-22 21:59:47.538992 | orchestrator | changed: [testbed-node-4] 2025-03-22 21:59:47.539375 | orchestrator | changed: [testbed-node-5] 2025-03-22 21:59:47.539592 | orchestrator | 2025-03-22 21:59:47.540194 | orchestrator | TASK [osism.commons.operator : Unset & lock password] ************************** 2025-03-22 21:59:47.540927 | orchestrator | Saturday 22 March 2025 21:59:47 +0000 (0:00:00.742) 0:00:14.086 ******** 2025-03-22 21:59:47.643265 | orchestrator | skipping: [testbed-node-0] 2025-03-22 21:59:47.667246 | orchestrator | skipping: [testbed-node-1] 2025-03-22 21:59:47.687883 | orchestrator | skipping: [testbed-node-2] 2025-03-22 21:59:47.784029 | orchestrator | skipping: [testbed-node-3] 2025-03-22 21:59:47.784759 | orchestrator | skipping: [testbed-node-4] 2025-03-22 21:59:47.784832 | orchestrator | skipping: [testbed-node-5] 2025-03-22 21:59:47.785291 | orchestrator | 2025-03-22 21:59:47.792167 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 21:59:47.793062 | orchestrator | 2025-03-22 21:59:47 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-22 21:59:47.793106 | orchestrator | 2025-03-22 21:59:47 | INFO  | Please wait and do not abort execution. 2025-03-22 21:59:47.793126 | orchestrator | testbed-node-0 : ok=12  changed=8  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 2025-03-22 21:59:47.793149 | orchestrator | testbed-node-1 : ok=12  changed=8  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 2025-03-22 21:59:47.795805 | orchestrator | testbed-node-2 : ok=12  changed=8  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 2025-03-22 21:59:47.795968 | orchestrator | testbed-node-3 : ok=12  changed=8  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 2025-03-22 21:59:47.796630 | orchestrator | testbed-node-4 : ok=12  changed=8  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 2025-03-22 21:59:47.796996 | orchestrator | testbed-node-5 : ok=12  changed=8  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 2025-03-22 21:59:47.797472 | orchestrator | 2025-03-22 21:59:47.798150 | orchestrator | Saturday 22 March 2025 21:59:47 +0000 (0:00:00.248) 0:00:14.334 ******** 2025-03-22 21:59:47.798676 | orchestrator | =============================================================================== 2025-03-22 21:59:47.799011 | orchestrator | Gathering Facts --------------------------------------------------------- 3.67s 2025-03-22 21:59:47.801818 | orchestrator | osism.commons.operator : Copy user sudoers file ------------------------- 1.55s 2025-03-22 21:59:47.801992 | orchestrator | osism.commons.operator : Set language variables in .bashrc configuration file --- 1.45s 2025-03-22 21:59:47.802069 | orchestrator | osism.commons.operator : Add user to additional groups ------------------ 1.29s 2025-03-22 21:59:47.802299 | orchestrator | osism.commons.operator : Create user ------------------------------------ 0.92s 2025-03-22 21:59:47.802635 | orchestrator | Do not require tty for all users ---------------------------------------- 0.90s 2025-03-22 21:59:47.802868 | orchestrator | osism.commons.operator : Set ssh authorized keys ------------------------ 0.84s 2025-03-22 21:59:47.804300 | orchestrator | osism.commons.operator : Create operator group 
-------------------------- 0.78s 2025-03-22 21:59:47.804623 | orchestrator | osism.commons.operator : Set password ----------------------------------- 0.74s 2025-03-22 21:59:47.804667 | orchestrator | osism.commons.operator : Create .ssh directory -------------------------- 0.64s 2025-03-22 21:59:47.804684 | orchestrator | osism.commons.operator : Unset & lock password -------------------------- 0.25s 2025-03-22 21:59:47.804698 | orchestrator | osism.commons.operator : Gather variables for each operating system ----- 0.22s 2025-03-22 21:59:47.804718 | orchestrator | osism.commons.operator : Check number of SSH authorized keys ------------ 0.21s 2025-03-22 21:59:47.805258 | orchestrator | osism.commons.operator : Set operator_groups variable to default value --- 0.21s 2025-03-22 21:59:47.805835 | orchestrator | osism.commons.operator : Set authorized GitHub accounts ----------------- 0.20s 2025-03-22 21:59:47.806221 | orchestrator | osism.commons.operator : Delete authorized GitHub accounts -------------- 0.17s 2025-03-22 21:59:47.806557 | orchestrator | osism.commons.operator : Delete ssh authorized keys --------------------- 0.17s 2025-03-22 21:59:48.201211 | orchestrator | + osism apply --environment custom facts 2025-03-22 21:59:49.763362 | orchestrator | 2025-03-22 21:59:49 | INFO  | Trying to run play facts in environment custom 2025-03-22 21:59:49.824407 | orchestrator | 2025-03-22 21:59:49 | INFO  | Task 524dd073-abc4-4aa2-880f-e9d1798c09fd (facts) was prepared for execution. 2025-03-22 21:59:53.063320 | orchestrator | 2025-03-22 21:59:49 | INFO  | It takes a moment until task 524dd073-abc4-4aa2-880f-e9d1798c09fd (facts) has been started and output is visible here. 2025-03-22 21:59:53.063463 | orchestrator | 2025-03-22 21:59:53.064601 | orchestrator | PLAY [Copy custom network devices fact] **************************************** 2025-03-22 21:59:53.064640 | orchestrator | 2025-03-22 21:59:53.065072 | orchestrator | TASK [Create custom facts directory] ******************************************* 2025-03-22 21:59:53.065572 | orchestrator | Saturday 22 March 2025 21:59:53 +0000 (0:00:00.088) 0:00:00.088 ******** 2025-03-22 21:59:54.509365 | orchestrator | ok: [testbed-manager] 2025-03-22 21:59:54.509556 | orchestrator | changed: [testbed-node-0] 2025-03-22 21:59:54.509585 | orchestrator | changed: [testbed-node-4] 2025-03-22 21:59:54.509891 | orchestrator | changed: [testbed-node-2] 2025-03-22 21:59:54.510243 | orchestrator | changed: [testbed-node-3] 2025-03-22 21:59:54.510639 | orchestrator | changed: [testbed-node-1] 2025-03-22 21:59:54.511194 | orchestrator | changed: [testbed-node-5] 2025-03-22 21:59:54.511452 | orchestrator | 2025-03-22 21:59:54.516144 | orchestrator | TASK [Copy fact file] ********************************************************** 2025-03-22 21:59:55.798790 | orchestrator | Saturday 22 March 2025 21:59:54 +0000 (0:00:01.446) 0:00:01.535 ******** 2025-03-22 21:59:55.798918 | orchestrator | ok: [testbed-manager] 2025-03-22 21:59:55.881591 | orchestrator | changed: [testbed-node-0] 2025-03-22 21:59:55.881668 | orchestrator | changed: [testbed-node-4] 2025-03-22 21:59:55.881683 | orchestrator | changed: [testbed-node-2] 2025-03-22 21:59:55.881696 | orchestrator | changed: [testbed-node-1] 2025-03-22 21:59:55.881709 | orchestrator | changed: [testbed-node-5] 2025-03-22 21:59:55.881721 | orchestrator | changed: [testbed-node-3] 2025-03-22 21:59:55.881735 | orchestrator | 2025-03-22 21:59:55.881749 | orchestrator | PLAY [Copy custom ceph devices facts] 
****************************************** 2025-03-22 21:59:55.881762 | orchestrator | 2025-03-22 21:59:55.881775 | orchestrator | TASK [osism.commons.repository : Gather variables for each operating system] *** 2025-03-22 21:59:55.881787 | orchestrator | Saturday 22 March 2025 21:59:55 +0000 (0:00:01.282) 0:00:02.817 ******** 2025-03-22 21:59:55.881840 | orchestrator | ok: [testbed-node-3] 2025-03-22 21:59:55.881910 | orchestrator | ok: [testbed-node-4] 2025-03-22 21:59:55.882074 | orchestrator | ok: [testbed-node-5] 2025-03-22 21:59:55.882100 | orchestrator | 2025-03-22 21:59:55.882450 | orchestrator | TASK [osism.commons.repository : Set repository_default fact to default value] *** 2025-03-22 21:59:55.882646 | orchestrator | Saturday 22 March 2025 21:59:55 +0000 (0:00:00.090) 0:00:02.908 ******** 2025-03-22 21:59:56.012937 | orchestrator | ok: [testbed-node-3] 2025-03-22 21:59:56.014063 | orchestrator | ok: [testbed-node-4] 2025-03-22 21:59:56.014918 | orchestrator | ok: [testbed-node-5] 2025-03-22 21:59:56.016070 | orchestrator | 2025-03-22 21:59:56.017617 | orchestrator | TASK [osism.commons.repository : Set repositories to default] ****************** 2025-03-22 21:59:56.018074 | orchestrator | Saturday 22 March 2025 21:59:56 +0000 (0:00:00.131) 0:00:03.040 ******** 2025-03-22 21:59:56.133222 | orchestrator | ok: [testbed-node-3] 2025-03-22 21:59:56.133943 | orchestrator | ok: [testbed-node-4] 2025-03-22 21:59:56.135448 | orchestrator | ok: [testbed-node-5] 2025-03-22 21:59:56.136361 | orchestrator | 2025-03-22 21:59:56.136766 | orchestrator | TASK [osism.commons.repository : Include distribution specific repository tasks] *** 2025-03-22 21:59:56.137725 | orchestrator | Saturday 22 March 2025 21:59:56 +0000 (0:00:00.119) 0:00:03.159 ******** 2025-03-22 21:59:56.268805 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/repository/tasks/Ubuntu.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-03-22 21:59:56.269475 | orchestrator | 2025-03-22 21:59:56.269744 | orchestrator | TASK [osism.commons.repository : Create /etc/apt/sources.list.d directory] ***** 2025-03-22 21:59:56.270390 | orchestrator | Saturday 22 March 2025 21:59:56 +0000 (0:00:00.135) 0:00:03.295 ******** 2025-03-22 21:59:56.729158 | orchestrator | ok: [testbed-node-3] 2025-03-22 21:59:56.729774 | orchestrator | ok: [testbed-node-4] 2025-03-22 21:59:56.733785 | orchestrator | ok: [testbed-node-5] 2025-03-22 21:59:56.848067 | orchestrator | 2025-03-22 21:59:56.848128 | orchestrator | TASK [osism.commons.repository : Include tasks for Ubuntu < 24.04] ************* 2025-03-22 21:59:56.848145 | orchestrator | Saturday 22 March 2025 21:59:56 +0000 (0:00:00.460) 0:00:03.755 ******** 2025-03-22 21:59:56.848170 | orchestrator | skipping: [testbed-node-3] 2025-03-22 21:59:56.849057 | orchestrator | skipping: [testbed-node-4] 2025-03-22 21:59:56.849154 | orchestrator | skipping: [testbed-node-5] 2025-03-22 21:59:56.849653 | orchestrator | 2025-03-22 21:59:56.850015 | orchestrator | TASK [osism.commons.repository : Copy 99osism apt configuration] *************** 2025-03-22 21:59:56.850411 | orchestrator | Saturday 22 March 2025 21:59:56 +0000 (0:00:00.119) 0:00:03.875 ******** 2025-03-22 21:59:57.949314 | orchestrator | changed: [testbed-node-3] 2025-03-22 21:59:57.949582 | orchestrator | changed: [testbed-node-4] 2025-03-22 21:59:57.949687 | orchestrator | changed: [testbed-node-5] 2025-03-22 21:59:57.949707 | orchestrator | 2025-03-22 21:59:57.949728 | orchestrator | TASK 
[osism.commons.repository : Remove sources.list file] ********************* 2025-03-22 21:59:57.949961 | orchestrator | Saturday 22 March 2025 21:59:57 +0000 (0:00:01.100) 0:00:04.975 ******** 2025-03-22 21:59:58.480770 | orchestrator | ok: [testbed-node-3] 2025-03-22 21:59:58.480918 | orchestrator | ok: [testbed-node-4] 2025-03-22 21:59:58.481016 | orchestrator | ok: [testbed-node-5] 2025-03-22 21:59:58.481646 | orchestrator | 2025-03-22 21:59:58.481878 | orchestrator | TASK [osism.commons.repository : Copy ubuntu.sources file] ********************* 2025-03-22 21:59:58.483210 | orchestrator | Saturday 22 March 2025 21:59:58 +0000 (0:00:00.528) 0:00:05.504 ******** 2025-03-22 21:59:59.617117 | orchestrator | changed: [testbed-node-3] 2025-03-22 21:59:59.617655 | orchestrator | changed: [testbed-node-4] 2025-03-22 21:59:59.617682 | orchestrator | changed: [testbed-node-5] 2025-03-22 21:59:59.617694 | orchestrator | 2025-03-22 21:59:59.617705 | orchestrator | TASK [osism.commons.repository : Update package cache] ************************* 2025-03-22 21:59:59.617722 | orchestrator | Saturday 22 March 2025 21:59:59 +0000 (0:00:01.132) 0:00:06.636 ******** 2025-03-22 22:00:14.446831 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:00:14.447074 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:00:14.447107 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:00:14.447122 | orchestrator | 2025-03-22 22:00:14.447144 | orchestrator | TASK [Install required packages (RedHat)] ************************************** 2025-03-22 22:00:14.447308 | orchestrator | Saturday 22 March 2025 22:00:14 +0000 (0:00:14.815) 0:00:21.451 ******** 2025-03-22 22:00:14.482083 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:00:14.517153 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:00:14.517289 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:00:14.517848 | orchestrator | 2025-03-22 22:00:14.518165 | orchestrator | TASK [Install required packages (Debian)] ************************************** 2025-03-22 22:00:14.518753 | orchestrator | Saturday 22 March 2025 22:00:14 +0000 (0:00:00.093) 0:00:21.545 ******** 2025-03-22 22:00:22.444792 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:00:22.444967 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:00:22.444993 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:00:22.445520 | orchestrator | 2025-03-22 22:00:22.445753 | orchestrator | TASK [Create custom facts directory] ******************************************* 2025-03-22 22:00:22.445781 | orchestrator | Saturday 22 March 2025 22:00:22 +0000 (0:00:07.925) 0:00:29.470 ******** 2025-03-22 22:00:22.893342 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:00:22.893558 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:00:22.893844 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:00:22.893875 | orchestrator | 2025-03-22 22:00:22.894088 | orchestrator | TASK [Copy fact files] ********************************************************* 2025-03-22 22:00:22.895180 | orchestrator | Saturday 22 March 2025 22:00:22 +0000 (0:00:00.450) 0:00:29.920 ******** 2025-03-22 22:00:26.486282 | orchestrator | changed: [testbed-node-3] => (item=testbed_ceph_devices) 2025-03-22 22:00:26.486747 | orchestrator | changed: [testbed-node-4] => (item=testbed_ceph_devices) 2025-03-22 22:00:26.486785 | orchestrator | changed: [testbed-node-5] => (item=testbed_ceph_devices) 2025-03-22 22:00:26.488317 | orchestrator | changed: [testbed-node-3] => (item=testbed_ceph_devices_all) 
2025-03-22 22:00:26.488537 | orchestrator | changed: [testbed-node-4] => (item=testbed_ceph_devices_all) 2025-03-22 22:00:26.488845 | orchestrator | changed: [testbed-node-5] => (item=testbed_ceph_devices_all) 2025-03-22 22:00:26.489864 | orchestrator | changed: [testbed-node-4] => (item=testbed_ceph_osd_devices) 2025-03-22 22:00:26.490520 | orchestrator | changed: [testbed-node-3] => (item=testbed_ceph_osd_devices) 2025-03-22 22:00:26.491165 | orchestrator | changed: [testbed-node-5] => (item=testbed_ceph_osd_devices) 2025-03-22 22:00:26.491647 | orchestrator | changed: [testbed-node-4] => (item=testbed_ceph_osd_devices_all) 2025-03-22 22:00:26.492262 | orchestrator | changed: [testbed-node-3] => (item=testbed_ceph_osd_devices_all) 2025-03-22 22:00:26.492612 | orchestrator | changed: [testbed-node-5] => (item=testbed_ceph_osd_devices_all) 2025-03-22 22:00:26.493440 | orchestrator | 2025-03-22 22:00:26.494103 | orchestrator | RUNNING HANDLER [osism.commons.repository : Force update of package cache] ***** 2025-03-22 22:00:26.495057 | orchestrator | Saturday 22 March 2025 22:00:26 +0000 (0:00:03.590) 0:00:33.510 ******** 2025-03-22 22:00:27.640798 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:00:27.641302 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:00:27.641725 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:00:27.642468 | orchestrator | 2025-03-22 22:00:27.643213 | orchestrator | PLAY [Gather facts for all hosts] ********************************************** 2025-03-22 22:00:27.643888 | orchestrator | 2025-03-22 22:00:27.644623 | orchestrator | TASK [Gathers facts about hosts] *********************************************** 2025-03-22 22:00:27.644858 | orchestrator | Saturday 22 March 2025 22:00:27 +0000 (0:00:01.154) 0:00:34.665 ******** 2025-03-22 22:00:31.527948 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:00:31.528123 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:00:31.528142 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:00:31.528161 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:00:31.528352 | orchestrator | ok: [testbed-manager] 2025-03-22 22:00:31.528742 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:00:31.529011 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:00:31.529440 | orchestrator | 2025-03-22 22:00:31.529795 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 22:00:31.530097 | orchestrator | 2025-03-22 22:00:31 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-22 22:00:31.530152 | orchestrator | 2025-03-22 22:00:31 | INFO  | Please wait and do not abort execution. 
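Note: the "Copy fact files" loop above drops testbed_ceph_devices* fact files onto the nodes, which later plays can read through the ansible_local variable once facts are re-gathered. A minimal sketch follows, assuming the standard /etc/ansible/facts.d location and a hypothetical JSON payload; the real fact content is generated by the testbed tooling.

---
# Hypothetical illustration of a custom local fact. The fact name is taken
# from the loop items above; the payload and location are assumptions.
- hosts: testbed-node-3
  become: true
  tasks:
    - name: Create custom facts directory
      ansible.builtin.file:
        path: /etc/ansible/facts.d
        state: directory
        mode: "0755"

    - name: Copy fact file
      ansible.builtin.copy:
        dest: /etc/ansible/facts.d/testbed_ceph_osd_devices.fact
        content: '{"devices": ["/dev/sdb", "/dev/sdc"]}'   # example payload only
        mode: "0644"

    - name: Re-read local facts
      ansible.builtin.setup:
        filter: ansible_local

    - name: Show the custom fact
      ansible.builtin.debug:
        var: ansible_local.testbed_ceph_osd_devices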
2025-03-22 22:00:31.530537 | orchestrator | testbed-manager : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 22:00:31.530788 | orchestrator | testbed-node-0 : ok=3  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 22:00:31.531183 | orchestrator | testbed-node-1 : ok=3  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 22:00:31.531439 | orchestrator | testbed-node-2 : ok=3  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 22:00:31.531737 | orchestrator | testbed-node-3 : ok=16  changed=7  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:00:31.532153 | orchestrator | testbed-node-4 : ok=16  changed=7  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:00:31.532494 | orchestrator | testbed-node-5 : ok=16  changed=7  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:00:31.532707 | orchestrator | 2025-03-22 22:00:31.533435 | orchestrator | Saturday 22 March 2025 22:00:31 +0000 (0:00:03.890) 0:00:38.555 ******** 2025-03-22 22:00:31.533825 | orchestrator | =============================================================================== 2025-03-22 22:00:31.534401 | orchestrator | osism.commons.repository : Update package cache ------------------------ 14.82s 2025-03-22 22:00:31.534438 | orchestrator | Install required packages (Debian) -------------------------------------- 7.93s 2025-03-22 22:00:31.534976 | orchestrator | Gathers facts about hosts ----------------------------------------------- 3.89s 2025-03-22 22:00:31.535437 | orchestrator | Copy fact files --------------------------------------------------------- 3.59s 2025-03-22 22:00:31.535646 | orchestrator | Create custom facts directory ------------------------------------------- 1.45s 2025-03-22 22:00:31.535932 | orchestrator | Copy fact file ---------------------------------------------------------- 1.28s 2025-03-22 22:00:31.536395 | orchestrator | osism.commons.repository : Force update of package cache ---------------- 1.15s 2025-03-22 22:00:31.536652 | orchestrator | osism.commons.repository : Copy ubuntu.sources file --------------------- 1.13s 2025-03-22 22:00:31.537099 | orchestrator | osism.commons.repository : Copy 99osism apt configuration --------------- 1.10s 2025-03-22 22:00:31.537373 | orchestrator | osism.commons.repository : Remove sources.list file --------------------- 0.53s 2025-03-22 22:00:31.537713 | orchestrator | osism.commons.repository : Create /etc/apt/sources.list.d directory ----- 0.46s 2025-03-22 22:00:31.538054 | orchestrator | Create custom facts directory ------------------------------------------- 0.45s 2025-03-22 22:00:31.538381 | orchestrator | osism.commons.repository : Include distribution specific repository tasks --- 0.14s 2025-03-22 22:00:31.538682 | orchestrator | osism.commons.repository : Set repository_default fact to default value --- 0.13s 2025-03-22 22:00:31.539100 | orchestrator | osism.commons.repository : Include tasks for Ubuntu < 24.04 ------------- 0.12s 2025-03-22 22:00:31.539402 | orchestrator | osism.commons.repository : Set repositories to default ------------------ 0.12s 2025-03-22 22:00:31.539729 | orchestrator | Install required packages (RedHat) -------------------------------------- 0.09s 2025-03-22 22:00:31.540130 | orchestrator | osism.commons.repository : Gather variables for each operating system --- 0.09s 2025-03-22 22:00:31.909724 | orchestrator | + osism apply bootstrap 2025-03-22 22:00:33.572305 | 
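Note: the bootstrap run that follows first groups hosts based on their bootstrap state before applying the bootstrap roles. Conceptually, this kind of grouping can be expressed with ansible.builtin.group_by; the sketch below is hypothetical (the variable name and resulting group names are assumptions, not the actual osism.commons playbook).

---
# Hypothetical sketch of grouping hosts by a state variable.
# "bootstrap_state" and the group names are assumptions.
- name: Group hosts based on state bootstrap
  hosts: all
  gather_facts: false
  tasks:
    - name: Group hosts based on state bootstrap
      ansible.builtin.group_by:
        key: "bootstrap_{{ bootstrap_state | default('pending') }}"

- name: Apply bootstrap roles part 1
  hosts: bootstrap_pending
  gather_facts: true
  tasks:
    - name: Show which hosts still need bootstrapping
      ansible.builtin.debug:
        msg: "{{ inventory_hostname }} will be bootstrapped"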
orchestrator | 2025-03-22 22:00:33 | INFO  | Task 73ea0614-e955-49c3-8e31-b29afd4b35e3 (bootstrap) was prepared for execution. 2025-03-22 22:00:37.598777 | orchestrator | 2025-03-22 22:00:33 | INFO  | It takes a moment until task 73ea0614-e955-49c3-8e31-b29afd4b35e3 (bootstrap) has been started and output is visible here. 2025-03-22 22:00:37.598898 | orchestrator | 2025-03-22 22:00:37.600655 | orchestrator | PLAY [Group hosts based on state bootstrap] ************************************ 2025-03-22 22:00:37.602590 | orchestrator | 2025-03-22 22:00:37.602632 | orchestrator | TASK [Group hosts based on state bootstrap] ************************************ 2025-03-22 22:00:37.602649 | orchestrator | Saturday 22 March 2025 22:00:37 +0000 (0:00:00.113) 0:00:00.113 ******** 2025-03-22 22:00:37.673302 | orchestrator | ok: [testbed-manager] 2025-03-22 22:00:37.703916 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:00:37.729595 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:00:37.759393 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:00:37.844505 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:00:37.845667 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:00:37.845747 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:00:37.846841 | orchestrator | 2025-03-22 22:00:37.847918 | orchestrator | PLAY [Gather facts for all hosts] ********************************************** 2025-03-22 22:00:37.848544 | orchestrator | 2025-03-22 22:00:37.849709 | orchestrator | TASK [Gathers facts about hosts] *********************************************** 2025-03-22 22:00:37.850516 | orchestrator | Saturday 22 March 2025 22:00:37 +0000 (0:00:00.248) 0:00:00.361 ******** 2025-03-22 22:00:41.412066 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:00:41.412392 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:00:41.412434 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:00:41.412808 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:00:41.413495 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:00:41.413712 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:00:41.414201 | orchestrator | ok: [testbed-manager] 2025-03-22 22:00:41.414600 | orchestrator | 2025-03-22 22:00:41.414817 | orchestrator | PLAY [Gather facts for all hosts (if using --limit)] *************************** 2025-03-22 22:00:41.416465 | orchestrator | 2025-03-22 22:00:41.416826 | orchestrator | TASK [Gathers facts about hosts] *********************************************** 2025-03-22 22:00:41.419116 | orchestrator | Saturday 22 March 2025 22:00:41 +0000 (0:00:03.565) 0:00:03.927 ******** 2025-03-22 22:00:41.530417 | orchestrator | skipping: [testbed-manager] => (item=testbed-manager)  2025-03-22 22:00:41.530951 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-3)  2025-03-22 22:00:41.531573 | orchestrator | skipping: [testbed-node-3] => (item=testbed-manager)  2025-03-22 22:00:41.532105 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-4)  2025-03-22 22:00:41.532600 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-03-22 22:00:41.580813 | orchestrator | skipping: [testbed-node-4] => (item=testbed-manager)  2025-03-22 22:00:41.581075 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-03-22 22:00:41.581691 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-5)  2025-03-22 22:00:41.582130 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-3)  2025-03-22 22:00:41.582635 | orchestrator | skipping: [testbed-node-3] => 
(item=testbed-node-5)  2025-03-22 22:00:41.660634 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-0)  2025-03-22 22:00:41.661416 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-0)  2025-03-22 22:00:41.662551 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-4)  2025-03-22 22:00:41.663630 | orchestrator | skipping: [testbed-node-5] => (item=testbed-manager)  2025-03-22 22:00:41.664094 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-5)  2025-03-22 22:00:41.665883 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-1)  2025-03-22 22:00:41.666581 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-0)  2025-03-22 22:00:41.668105 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-1)  2025-03-22 22:00:41.668630 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-2)  2025-03-22 22:00:41.668950 | orchestrator | skipping: [testbed-node-0] => (item=testbed-manager)  2025-03-22 22:00:41.991231 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-3)  2025-03-22 22:00:41.991744 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:00:41.992392 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-2)  2025-03-22 22:00:41.993152 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-1)  2025-03-22 22:00:41.993770 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-4)  2025-03-22 22:00:41.994354 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:00:41.995047 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-2)  2025-03-22 22:00:41.995602 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:00:41.996141 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-5)  2025-03-22 22:00:41.997031 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-3)  2025-03-22 22:00:41.997688 | orchestrator | skipping: [testbed-node-1] => (item=testbed-manager)  2025-03-22 22:00:41.998101 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-4)  2025-03-22 22:00:41.998645 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-0)  2025-03-22 22:00:41.999405 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-3)  2025-03-22 22:00:42.000730 | orchestrator | skipping: [testbed-node-2] => (item=testbed-manager)  2025-03-22 22:00:42.000791 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-1)  2025-03-22 22:00:42.001181 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-5)  2025-03-22 22:00:42.002131 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-2)  2025-03-22 22:00:42.002688 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:00:42.003278 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2025-03-22 22:00:42.003973 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-4)  2025-03-22 22:00:42.004592 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-3)  2025-03-22 22:00:42.004982 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-5)  2025-03-22 22:00:42.005782 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2025-03-22 22:00:42.006464 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-4)  2025-03-22 22:00:42.007865 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-0)  2025-03-22 22:00:42.008108 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2025-03-22 22:00:42.008937 | 
orchestrator | skipping: [testbed-node-2] => (item=testbed-node-5)  2025-03-22 22:00:42.008965 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:00:42.012447 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-1)  2025-03-22 22:00:42.012824 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-0)  2025-03-22 22:00:42.012846 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-2)  2025-03-22 22:00:42.012860 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:00:42.012873 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-1)  2025-03-22 22:00:42.012886 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-2)  2025-03-22 22:00:42.012902 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:00:42.013696 | orchestrator | 2025-03-22 22:00:42.017428 | orchestrator | PLAY [Apply bootstrap roles part 1] ******************************************** 2025-03-22 22:00:42.062459 | orchestrator | 2025-03-22 22:00:42.062528 | orchestrator | TASK [osism.commons.hostname : Set hostname_name fact] ************************* 2025-03-22 22:00:42.062543 | orchestrator | Saturday 22 March 2025 22:00:41 +0000 (0:00:00.580) 0:00:04.507 ******** 2025-03-22 22:00:42.062563 | orchestrator | ok: [testbed-manager] 2025-03-22 22:00:42.127050 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:00:42.150247 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:00:42.181380 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:00:42.234670 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:00:42.234961 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:00:42.235579 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:00:42.236890 | orchestrator | 2025-03-22 22:00:42.237152 | orchestrator | TASK [osism.commons.hostname : Set hostname] *********************************** 2025-03-22 22:00:42.237668 | orchestrator | Saturday 22 March 2025 22:00:42 +0000 (0:00:00.243) 0:00:04.751 ******** 2025-03-22 22:00:43.662235 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:00:43.663062 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:00:43.663844 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:00:43.664326 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:00:43.665323 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:00:43.666097 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:00:43.667285 | orchestrator | ok: [testbed-manager] 2025-03-22 22:00:43.667606 | orchestrator | 2025-03-22 22:00:43.668227 | orchestrator | TASK [osism.commons.hostname : Copy /etc/hostname] ***************************** 2025-03-22 22:00:43.668896 | orchestrator | Saturday 22 March 2025 22:00:43 +0000 (0:00:01.425) 0:00:06.176 ******** 2025-03-22 22:00:45.091941 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:00:45.092120 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:00:45.092505 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:00:45.092882 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:00:45.093745 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:00:45.094112 | orchestrator | ok: [testbed-manager] 2025-03-22 22:00:45.094340 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:00:45.094734 | orchestrator | 2025-03-22 22:00:45.095144 | orchestrator | TASK [osism.commons.hosts : Include type specific tasks] *********************** 2025-03-22 22:00:45.095527 | orchestrator | Saturday 22 March 2025 22:00:45 +0000 (0:00:01.429) 0:00:07.606 ******** 2025-03-22 22:00:45.492901 | orchestrator | included: 
/usr/share/ansible/collections/ansible_collections/osism/commons/roles/hosts/tasks/type-template.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:00:45.493111 | orchestrator | 2025-03-22 22:00:45.493169 | orchestrator | TASK [osism.commons.hosts : Copy /etc/hosts file] ****************************** 2025-03-22 22:00:47.810095 | orchestrator | Saturday 22 March 2025 22:00:45 +0000 (0:00:00.398) 0:00:08.005 ******** 2025-03-22 22:00:47.810227 | orchestrator | changed: [testbed-manager] 2025-03-22 22:00:47.810295 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:00:47.811062 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:00:47.812663 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:00:47.813932 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:00:47.814197 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:00:47.814736 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:00:47.815222 | orchestrator | 2025-03-22 22:00:47.815945 | orchestrator | TASK [osism.commons.proxy : Include distribution specific tasks] *************** 2025-03-22 22:00:47.816244 | orchestrator | Saturday 22 March 2025 22:00:47 +0000 (0:00:02.316) 0:00:10.322 ******** 2025-03-22 22:00:47.914942 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:00:48.137212 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/proxy/tasks/Debian-family.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:00:48.137642 | orchestrator | 2025-03-22 22:00:48.137950 | orchestrator | TASK [osism.commons.proxy : Configure proxy parameters for apt] **************** 2025-03-22 22:00:48.138885 | orchestrator | Saturday 22 March 2025 22:00:48 +0000 (0:00:00.331) 0:00:10.653 ******** 2025-03-22 22:00:49.233846 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:00:49.279194 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:00:49.964787 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:00:49.964877 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:00:49.964887 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:00:49.964896 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:00:49.964904 | orchestrator | 2025-03-22 22:00:49.964913 | orchestrator | TASK [osism.commons.proxy : Set system wide settings in environment file] ****** 2025-03-22 22:00:49.964922 | orchestrator | Saturday 22 March 2025 22:00:49 +0000 (0:00:01.096) 0:00:11.749 ******** 2025-03-22 22:00:49.964930 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:00:49.964949 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:00:49.964992 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:00:49.965802 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:00:49.966395 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:00:49.966565 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:00:49.966978 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:00:49.967245 | orchestrator | 2025-03-22 22:00:49.967558 | orchestrator | TASK [osism.commons.proxy : Remove system wide settings in environment file] *** 2025-03-22 22:00:49.967842 | orchestrator | Saturday 22 March 2025 22:00:49 +0000 (0:00:00.729) 0:00:12.479 ******** 2025-03-22 22:00:50.059582 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:00:50.082380 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:00:50.109546 | 
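Note: the proxy role above writes apt proxy settings on the nodes (the manager is skipped) and system-wide variables into an environment file. A minimal sketch of the apt part follows, assuming a hypothetical proxy URL and the conventional /etc/apt/apt.conf.d drop-in location; the real role is driven by the testbed's proxy variables.

---
# Hypothetical apt proxy drop-in; the proxy URL and file name are assumptions.
- hosts: testbed-node-0
  become: true
  vars:
    proxy_url: "http://proxy.example.com:3128"
  tasks:
    - name: Configure proxy parameters for apt
      ansible.builtin.copy:
        dest: /etc/apt/apt.conf.d/99proxy
        content: |
          Acquire::http::Proxy "{{ proxy_url }}";
          Acquire::https::Proxy "{{ proxy_url }}";
        mode: "0644"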
orchestrator | skipping: [testbed-node-5] 2025-03-22 22:00:50.401578 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:00:50.401741 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:00:50.401764 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:00:50.401785 | orchestrator | ok: [testbed-manager] 2025-03-22 22:00:50.402341 | orchestrator | 2025-03-22 22:00:50.402559 | orchestrator | TASK [osism.commons.resolvconf : Check minimum and maximum number of name servers] *** 2025-03-22 22:00:50.402783 | orchestrator | Saturday 22 March 2025 22:00:50 +0000 (0:00:00.438) 0:00:12.918 ******** 2025-03-22 22:00:50.471102 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:00:50.500401 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:00:50.521711 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:00:50.548388 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:00:50.597298 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:00:50.598339 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:00:50.601749 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:00:50.602442 | orchestrator | 2025-03-22 22:00:50.603227 | orchestrator | TASK [osism.commons.resolvconf : Include resolvconf tasks] ********************* 2025-03-22 22:00:50.603885 | orchestrator | Saturday 22 March 2025 22:00:50 +0000 (0:00:00.195) 0:00:13.114 ******** 2025-03-22 22:00:50.873879 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-resolv.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:00:50.874604 | orchestrator | 2025-03-22 22:00:50.875520 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific installation tasks] *** 2025-03-22 22:00:50.876553 | orchestrator | Saturday 22 March 2025 22:00:50 +0000 (0:00:00.276) 0:00:13.390 ******** 2025-03-22 22:00:51.176211 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:00:51.176367 | orchestrator | 2025-03-22 22:00:51.177210 | orchestrator | TASK [osism.commons.resolvconf : Remove packages configuring /etc/resolv.conf] *** 2025-03-22 22:00:51.177441 | orchestrator | Saturday 22 March 2025 22:00:51 +0000 (0:00:00.302) 0:00:13.692 ******** 2025-03-22 22:00:52.604303 | orchestrator | ok: [testbed-manager] 2025-03-22 22:00:52.604856 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:00:52.605488 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:00:52.606550 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:00:52.607635 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:00:52.608292 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:00:52.609396 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:00:52.610075 | orchestrator | 2025-03-22 22:00:52.610773 | orchestrator | TASK [osism.commons.resolvconf : Install package systemd-resolved] ************* 2025-03-22 22:00:52.611352 | orchestrator | Saturday 22 March 2025 22:00:52 +0000 (0:00:01.426) 0:00:15.118 ******** 2025-03-22 22:00:52.668353 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:00:52.695585 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:00:52.717814 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:00:52.746248 | orchestrator | skipping: 
[testbed-node-5] 2025-03-22 22:00:52.804901 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:00:52.805973 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:00:52.807129 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:00:52.808372 | orchestrator | 2025-03-22 22:00:52.808679 | orchestrator | TASK [osism.commons.resolvconf : Retrieve file status of /etc/resolv.conf] ***** 2025-03-22 22:00:52.809668 | orchestrator | Saturday 22 March 2025 22:00:52 +0000 (0:00:00.202) 0:00:15.321 ******** 2025-03-22 22:00:53.356791 | orchestrator | ok: [testbed-manager] 2025-03-22 22:00:53.357604 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:00:53.358695 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:00:53.359386 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:00:53.360221 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:00:53.360621 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:00:53.361511 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:00:53.362426 | orchestrator | 2025-03-22 22:00:53.363135 | orchestrator | TASK [osism.commons.resolvconf : Archive existing file /etc/resolv.conf] ******* 2025-03-22 22:00:53.363765 | orchestrator | Saturday 22 March 2025 22:00:53 +0000 (0:00:00.549) 0:00:15.870 ******** 2025-03-22 22:00:53.433631 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:00:53.465403 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:00:53.490852 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:00:53.516576 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:00:53.590583 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:00:53.590703 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:00:53.590725 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:00:53.590745 | orchestrator | 2025-03-22 22:00:53.590835 | orchestrator | TASK [osism.commons.resolvconf : Link /run/systemd/resolve/stub-resolv.conf to /etc/resolv.conf] *** 2025-03-22 22:00:53.591327 | orchestrator | Saturday 22 March 2025 22:00:53 +0000 (0:00:00.234) 0:00:16.105 ******** 2025-03-22 22:00:54.137940 | orchestrator | ok: [testbed-manager] 2025-03-22 22:00:54.138161 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:00:54.138444 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:00:54.139026 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:00:54.139229 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:00:54.139847 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:00:54.140222 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:00:54.140621 | orchestrator | 2025-03-22 22:00:54.141019 | orchestrator | TASK [osism.commons.resolvconf : Copy configuration files] ********************* 2025-03-22 22:00:54.141444 | orchestrator | Saturday 22 March 2025 22:00:54 +0000 (0:00:00.548) 0:00:16.654 ******** 2025-03-22 22:00:55.282346 | orchestrator | ok: [testbed-manager] 2025-03-22 22:00:55.282627 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:00:55.283060 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:00:55.283300 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:00:55.284340 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:00:55.287420 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:00:55.287531 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:00:55.287844 | orchestrator | 2025-03-22 22:00:55.288218 | orchestrator | TASK [osism.commons.resolvconf : Start/enable systemd-resolved service] ******** 2025-03-22 22:00:55.288532 | orchestrator | Saturday 22 March 
2025 22:00:55 +0000 (0:00:01.142) 0:00:17.796 ******** 2025-03-22 22:00:56.703117 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:00:56.703653 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:00:56.703807 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:00:56.704191 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:00:56.705364 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:00:56.705735 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:00:56.705759 | orchestrator | ok: [testbed-manager] 2025-03-22 22:00:56.705777 | orchestrator | 2025-03-22 22:00:56.706323 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific configuration tasks] *** 2025-03-22 22:00:56.706454 | orchestrator | Saturday 22 March 2025 22:00:56 +0000 (0:00:01.420) 0:00:19.216 ******** 2025-03-22 22:00:57.055182 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:00:57.055828 | orchestrator | 2025-03-22 22:00:57.056884 | orchestrator | TASK [osism.commons.resolvconf : Restart systemd-resolved service] ************* 2025-03-22 22:00:57.057375 | orchestrator | Saturday 22 March 2025 22:00:57 +0000 (0:00:00.352) 0:00:19.569 ******** 2025-03-22 22:00:57.142297 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:00:58.765724 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:00:58.766184 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:00:58.767171 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:00:58.767260 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:00:58.768502 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:00:58.769520 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:00:58.770274 | orchestrator | 2025-03-22 22:00:58.770308 | orchestrator | TASK [osism.commons.repository : Gather variables for each operating system] *** 2025-03-22 22:00:58.770990 | orchestrator | Saturday 22 March 2025 22:00:58 +0000 (0:00:01.710) 0:00:21.279 ******** 2025-03-22 22:00:58.861717 | orchestrator | ok: [testbed-manager] 2025-03-22 22:00:58.891342 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:00:58.928992 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:00:58.958091 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:00:59.029418 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:00:59.031233 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:00:59.032283 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:00:59.033340 | orchestrator | 2025-03-22 22:00:59.034381 | orchestrator | TASK [osism.commons.repository : Set repository_default fact to default value] *** 2025-03-22 22:00:59.034821 | orchestrator | Saturday 22 March 2025 22:00:59 +0000 (0:00:00.264) 0:00:21.544 ******** 2025-03-22 22:00:59.129825 | orchestrator | ok: [testbed-manager] 2025-03-22 22:00:59.163661 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:00:59.199114 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:00:59.244732 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:00:59.336251 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:00:59.336481 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:00:59.337326 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:00:59.338638 | orchestrator | 2025-03-22 22:00:59.339568 | orchestrator | TASK [osism.commons.repository : Set repositories to default] ****************** 2025-03-22 22:00:59.339820 | 
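Note: the resolvconf steps above remove packages that manage /etc/resolv.conf, link it to systemd-resolved's stub resolver file, and (re)start systemd-resolved. A minimal sketch of the symlink and service handling follows; the stub path is the standard systemd-resolved location, and the rest of the role's logic (the archiving step, the copied configuration files) is omitted.

---
# Minimal sketch: point /etc/resolv.conf at the systemd-resolved stub file
# and ensure the service is running. Mirrors the task names above, not the
# actual osism.commons.resolvconf implementation.
- hosts: testbed-node-0
  become: true
  tasks:
    - name: Link /run/systemd/resolve/stub-resolv.conf to /etc/resolv.conf
      ansible.builtin.file:
        src: /run/systemd/resolve/stub-resolv.conf
        dest: /etc/resolv.conf
        state: link
        force: true

    - name: Start/enable systemd-resolved service
      ansible.builtin.systemd:
        name: systemd-resolved
        state: started
        enabled: true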
orchestrator | Saturday 22 March 2025 22:00:59 +0000 (0:00:00.306) 0:00:21.851 ******** 2025-03-22 22:00:59.434599 | orchestrator | ok: [testbed-manager] 2025-03-22 22:00:59.463147 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:00:59.499618 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:00:59.529843 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:00:59.605550 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:00:59.606688 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:00:59.607862 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:00:59.608792 | orchestrator | 2025-03-22 22:00:59.609606 | orchestrator | TASK [osism.commons.repository : Include distribution specific repository tasks] *** 2025-03-22 22:00:59.610550 | orchestrator | Saturday 22 March 2025 22:00:59 +0000 (0:00:00.269) 0:00:22.120 ******** 2025-03-22 22:00:59.962009 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/repository/tasks/Ubuntu.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:00:59.962933 | orchestrator | 2025-03-22 22:00:59.963514 | orchestrator | TASK [osism.commons.repository : Create /etc/apt/sources.list.d directory] ***** 2025-03-22 22:00:59.964247 | orchestrator | Saturday 22 March 2025 22:00:59 +0000 (0:00:00.357) 0:00:22.477 ******** 2025-03-22 22:01:00.594256 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:01:00.595335 | orchestrator | ok: [testbed-manager] 2025-03-22 22:01:00.595954 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:01:00.596320 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:01:00.597324 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:01:00.597741 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:01:00.598202 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:01:00.598788 | orchestrator | 2025-03-22 22:01:00.599331 | orchestrator | TASK [osism.commons.repository : Include tasks for Ubuntu < 24.04] ************* 2025-03-22 22:01:00.599781 | orchestrator | Saturday 22 March 2025 22:01:00 +0000 (0:00:00.631) 0:00:23.109 ******** 2025-03-22 22:01:00.677479 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:01:00.714760 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:01:00.742301 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:01:00.777132 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:01:00.851795 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:01:00.852168 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:01:00.853103 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:01:00.853326 | orchestrator | 2025-03-22 22:01:00.854252 | orchestrator | TASK [osism.commons.repository : Copy 99osism apt configuration] *************** 2025-03-22 22:01:00.857495 | orchestrator | Saturday 22 March 2025 22:01:00 +0000 (0:00:00.257) 0:00:23.367 ******** 2025-03-22 22:01:01.981548 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:01:01.981710 | orchestrator | changed: [testbed-manager] 2025-03-22 22:01:01.981957 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:01:01.982698 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:01:01.982771 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:01:01.983417 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:01:01.983758 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:01:01.985786 | orchestrator | 2025-03-22 22:01:01.985855 | orchestrator | TASK [osism.commons.repository : Remove sources.list file] 
********************* 2025-03-22 22:01:01.987366 | orchestrator | Saturday 22 March 2025 22:01:01 +0000 (0:00:01.129) 0:00:24.496 ******** 2025-03-22 22:01:02.609722 | orchestrator | ok: [testbed-manager] 2025-03-22 22:01:02.609866 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:01:02.610099 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:01:02.610480 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:01:02.612244 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:01:02.612709 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:01:02.614314 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:01:03.944094 | orchestrator | 2025-03-22 22:01:03.944251 | orchestrator | TASK [osism.commons.repository : Copy ubuntu.sources file] ********************* 2025-03-22 22:01:03.944273 | orchestrator | Saturday 22 March 2025 22:01:02 +0000 (0:00:00.627) 0:00:25.123 ******** 2025-03-22 22:01:03.944305 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:01:03.944384 | orchestrator | ok: [testbed-manager] 2025-03-22 22:01:03.945162 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:01:03.945576 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:01:03.946708 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:01:03.947002 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:01:03.948283 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:01:03.948451 | orchestrator | 2025-03-22 22:01:03.949242 | orchestrator | TASK [osism.commons.repository : Update package cache] ************************* 2025-03-22 22:01:03.949580 | orchestrator | Saturday 22 March 2025 22:01:03 +0000 (0:00:01.334) 0:00:26.458 ******** 2025-03-22 22:01:18.221591 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:01:18.226152 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:01:18.226204 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:01:18.226221 | orchestrator | changed: [testbed-manager] 2025-03-22 22:01:18.226237 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:01:18.226261 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:01:18.227141 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:01:18.227177 | orchestrator | 2025-03-22 22:01:18.227924 | orchestrator | TASK [osism.services.rsyslog : Gather variables for each operating system] ***** 2025-03-22 22:01:18.230226 | orchestrator | Saturday 22 March 2025 22:01:18 +0000 (0:00:14.273) 0:00:40.732 ******** 2025-03-22 22:01:18.321602 | orchestrator | ok: [testbed-manager] 2025-03-22 22:01:18.357954 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:01:18.400331 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:01:18.428938 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:01:18.507409 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:01:18.508422 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:01:18.509067 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:01:18.509541 | orchestrator | 2025-03-22 22:01:18.509887 | orchestrator | TASK [osism.services.rsyslog : Set rsyslog_user variable to default value] ***** 2025-03-22 22:01:18.510569 | orchestrator | Saturday 22 March 2025 22:01:18 +0000 (0:00:00.290) 0:00:41.023 ******** 2025-03-22 22:01:18.596541 | orchestrator | ok: [testbed-manager] 2025-03-22 22:01:18.630152 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:01:18.665240 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:01:18.697180 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:01:18.781798 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:01:18.781930 | orchestrator | ok: [testbed-node-1] 2025-03-22 
22:01:18.783047 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:01:18.783931 | orchestrator | 2025-03-22 22:01:18.784384 | orchestrator | TASK [osism.services.rsyslog : Set rsyslog_workdir variable to default value] *** 2025-03-22 22:01:18.784803 | orchestrator | Saturday 22 March 2025 22:01:18 +0000 (0:00:00.272) 0:00:41.296 ******** 2025-03-22 22:01:18.882729 | orchestrator | ok: [testbed-manager] 2025-03-22 22:01:18.919048 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:01:18.948432 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:01:18.983296 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:01:19.065815 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:01:19.067620 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:01:19.067687 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:01:19.068418 | orchestrator | 2025-03-22 22:01:19.068476 | orchestrator | TASK [osism.services.rsyslog : Include distribution specific install tasks] **** 2025-03-22 22:01:19.068666 | orchestrator | Saturday 22 March 2025 22:01:19 +0000 (0:00:00.282) 0:00:41.579 ******** 2025-03-22 22:01:19.455579 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/rsyslog/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:01:19.455837 | orchestrator | 2025-03-22 22:01:19.456326 | orchestrator | TASK [osism.services.rsyslog : Install rsyslog package] ************************ 2025-03-22 22:01:19.456362 | orchestrator | Saturday 22 March 2025 22:01:19 +0000 (0:00:00.390) 0:00:41.970 ******** 2025-03-22 22:01:21.118280 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:01:21.118842 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:01:21.119470 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:01:21.120164 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:01:21.120868 | orchestrator | ok: [testbed-manager] 2025-03-22 22:01:21.121439 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:01:21.122181 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:01:21.122742 | orchestrator | 2025-03-22 22:01:21.123214 | orchestrator | TASK [osism.services.rsyslog : Copy rsyslog.conf configuration file] *********** 2025-03-22 22:01:21.123655 | orchestrator | Saturday 22 March 2025 22:01:21 +0000 (0:00:01.661) 0:00:43.631 ******** 2025-03-22 22:01:22.236984 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:01:22.237790 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:01:22.237825 | orchestrator | changed: [testbed-manager] 2025-03-22 22:01:22.237850 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:01:22.237924 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:01:22.238122 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:01:22.238152 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:01:22.239351 | orchestrator | 2025-03-22 22:01:22.239570 | orchestrator | TASK [osism.services.rsyslog : Manage rsyslog service] ************************* 2025-03-22 22:01:22.239600 | orchestrator | Saturday 22 March 2025 22:01:22 +0000 (0:00:01.119) 0:00:44.751 ******** 2025-03-22 22:01:23.203493 | orchestrator | ok: [testbed-manager] 2025-03-22 22:01:23.203935 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:01:23.203973 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:01:23.204699 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:01:23.205096 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:01:23.205794 | orchestrator | ok: 
[testbed-node-2] 2025-03-22 22:01:23.206544 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:01:23.206863 | orchestrator | 2025-03-22 22:01:23.207467 | orchestrator | TASK [osism.services.rsyslog : Include fluentd tasks] ************************** 2025-03-22 22:01:23.207818 | orchestrator | Saturday 22 March 2025 22:01:23 +0000 (0:00:00.966) 0:00:45.718 ******** 2025-03-22 22:01:23.538921 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/rsyslog/tasks/fluentd.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:01:23.539579 | orchestrator | 2025-03-22 22:01:23.539622 | orchestrator | TASK [osism.services.rsyslog : Forward syslog message to local fluentd daemon] *** 2025-03-22 22:01:23.541599 | orchestrator | Saturday 22 March 2025 22:01:23 +0000 (0:00:00.335) 0:00:46.054 ******** 2025-03-22 22:01:24.630289 | orchestrator | changed: [testbed-manager] 2025-03-22 22:01:24.630685 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:01:24.631322 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:01:24.632138 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:01:24.634410 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:01:24.634474 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:01:24.635053 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:01:24.635084 | orchestrator | 2025-03-22 22:01:24.635876 | orchestrator | TASK [osism.services.rsyslog : Include additional log server tasks] ************ 2025-03-22 22:01:24.722721 | orchestrator | Saturday 22 March 2025 22:01:24 +0000 (0:00:01.090) 0:00:47.144 ******** 2025-03-22 22:01:24.722800 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:01:24.756693 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:01:24.790102 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:01:24.823646 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:01:24.997746 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:01:24.998106 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:01:24.998143 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:01:24.998186 | orchestrator | 2025-03-22 22:01:24.998212 | orchestrator | TASK [osism.commons.systohc : Install util-linux-extra package] **************** 2025-03-22 22:01:24.998773 | orchestrator | Saturday 22 March 2025 22:01:24 +0000 (0:00:00.367) 0:00:47.512 ******** 2025-03-22 22:01:36.809900 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:01:36.810215 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:01:36.810816 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:01:36.810848 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:01:36.810886 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:01:36.810901 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:01:36.810927 | orchestrator | changed: [testbed-manager] 2025-03-22 22:01:36.811003 | orchestrator | 2025-03-22 22:01:36.811077 | orchestrator | TASK [osism.commons.systohc : Sync hardware clock] ***************************** 2025-03-22 22:01:36.811299 | orchestrator | Saturday 22 March 2025 22:01:36 +0000 (0:00:11.810) 0:00:59.322 ******** 2025-03-22 22:01:37.657782 | orchestrator | ok: [testbed-manager] 2025-03-22 22:01:37.658110 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:01:37.659166 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:01:37.660928 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:01:37.661836 | 
orchestrator | ok: [testbed-node-4] 2025-03-22 22:01:37.662649 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:01:37.663885 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:01:37.664225 | orchestrator | 2025-03-22 22:01:37.665210 | orchestrator | TASK [osism.commons.configfs : Start sys-kernel-config mount] ****************** 2025-03-22 22:01:37.665558 | orchestrator | Saturday 22 March 2025 22:01:37 +0000 (0:00:00.851) 0:01:00.174 ******** 2025-03-22 22:01:38.708311 | orchestrator | ok: [testbed-manager] 2025-03-22 22:01:38.708430 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:01:38.708639 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:01:38.708671 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:01:38.709326 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:01:38.709629 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:01:38.709995 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:01:38.710346 | orchestrator | 2025-03-22 22:01:38.711076 | orchestrator | TASK [osism.commons.packages : Gather variables for each operating system] ***** 2025-03-22 22:01:38.711513 | orchestrator | Saturday 22 March 2025 22:01:38 +0000 (0:00:01.047) 0:01:01.222 ******** 2025-03-22 22:01:38.774786 | orchestrator | ok: [testbed-manager] 2025-03-22 22:01:38.799428 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:01:38.835145 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:01:38.863359 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:01:38.934096 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:01:38.934253 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:01:38.935024 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:01:38.935126 | orchestrator | 2025-03-22 22:01:38.935153 | orchestrator | TASK [osism.commons.packages : Set required_packages_distribution variable to default value] *** 2025-03-22 22:01:38.935325 | orchestrator | Saturday 22 March 2025 22:01:38 +0000 (0:00:00.227) 0:01:01.450 ******** 2025-03-22 22:01:39.007968 | orchestrator | ok: [testbed-manager] 2025-03-22 22:01:39.043396 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:01:39.077235 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:01:39.114065 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:01:39.192875 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:01:39.193004 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:01:39.193258 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:01:39.194122 | orchestrator | 2025-03-22 22:01:39.194642 | orchestrator | TASK [osism.commons.packages : Include distribution specific package tasks] **** 2025-03-22 22:01:39.195373 | orchestrator | Saturday 22 March 2025 22:01:39 +0000 (0:00:00.257) 0:01:01.707 ******** 2025-03-22 22:01:39.561779 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/packages/tasks/package-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:01:39.562103 | orchestrator | 2025-03-22 22:01:39.563296 | orchestrator | TASK [osism.commons.packages : Install needrestart package] ******************** 2025-03-22 22:01:39.564058 | orchestrator | Saturday 22 March 2025 22:01:39 +0000 (0:00:00.368) 0:01:02.075 ******** 2025-03-22 22:01:41.237928 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:01:41.238201 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:01:41.238392 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:01:41.238534 | orchestrator | ok: [testbed-manager] 2025-03-22 22:01:41.238585 | 
orchestrator | ok: [testbed-node-2] 2025-03-22 22:01:41.238810 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:01:41.239878 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:01:41.240021 | orchestrator | 2025-03-22 22:01:41.241816 | orchestrator | TASK [osism.commons.packages : Set needrestart mode] *************************** 2025-03-22 22:01:41.882942 | orchestrator | Saturday 22 March 2025 22:01:41 +0000 (0:00:01.675) 0:01:03.751 ******** 2025-03-22 22:01:41.883093 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:01:41.883163 | orchestrator | changed: [testbed-manager] 2025-03-22 22:01:41.883185 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:01:41.883607 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:01:41.884037 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:01:41.884517 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:01:41.884894 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:01:41.885363 | orchestrator | 2025-03-22 22:01:41.886246 | orchestrator | TASK [osism.commons.packages : Set apt_cache_valid_time variable to default value] *** 2025-03-22 22:01:41.886410 | orchestrator | Saturday 22 March 2025 22:01:41 +0000 (0:00:00.646) 0:01:04.398 ******** 2025-03-22 22:01:41.982595 | orchestrator | ok: [testbed-manager] 2025-03-22 22:01:42.017790 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:01:42.057387 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:01:42.083862 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:01:42.153092 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:01:42.154105 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:01:42.154504 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:01:42.154911 | orchestrator | 2025-03-22 22:01:42.155722 | orchestrator | TASK [osism.commons.packages : Update package cache] *************************** 2025-03-22 22:01:42.155891 | orchestrator | Saturday 22 March 2025 22:01:42 +0000 (0:00:00.270) 0:01:04.668 ******** 2025-03-22 22:01:43.462511 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:01:43.463052 | orchestrator | ok: [testbed-manager] 2025-03-22 22:01:43.551324 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:01:45.265766 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:01:45.265860 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:01:45.265877 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:01:45.265893 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:01:45.265908 | orchestrator | 2025-03-22 22:01:45.265925 | orchestrator | TASK [osism.commons.packages : Download upgrade packages] ********************** 2025-03-22 22:01:45.265941 | orchestrator | Saturday 22 March 2025 22:01:43 +0000 (0:00:01.308) 0:01:05.976 ******** 2025-03-22 22:01:45.265967 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:01:45.266080 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:01:45.266477 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:01:45.266730 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:01:45.267196 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:01:45.267906 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:01:45.268324 | orchestrator | changed: [testbed-manager] 2025-03-22 22:01:45.268774 | orchestrator | 2025-03-22 22:01:45.268906 | orchestrator | TASK [osism.commons.packages : Upgrade packages] ******************************* 2025-03-22 22:01:45.269291 | orchestrator | Saturday 22 March 2025 22:01:45 +0000 (0:00:01.804) 0:01:07.780 ******** 2025-03-22 22:01:55.662745 | orchestrator | ok: [testbed-node-4] 
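The package handling steps logged above (setting the needrestart mode, applying apt_cache_valid_time, refreshing the apt cache, then downloading and upgrading packages) can be pictured roughly as the following Ansible sketch. It is an illustrative approximation only, not the actual tasks of the osism.commons.packages role, which are not reproduced in this log; the needrestart drop-in path, cache lifetime and restart mode shown are assumptions.

    # Sketch under assumptions -- not the real osism.commons.packages tasks.
    - name: Set needrestart mode
      ansible.builtin.copy:
        dest: /etc/needrestart/conf.d/zz-osism.conf   # hypothetical drop-in path
        content: |
          $nrconf{restart} = 'a';   # 'a' = restart affected services automatically
        mode: "0644"

    - name: Update package cache and upgrade packages
      ansible.builtin.apt:
        update_cache: true
        cache_valid_time: 3600      # hypothetical apt_cache_valid_time value
        upgrade: dist
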
2025-03-22 22:01:55.663808 | orchestrator | ok: [testbed-manager] 2025-03-22 22:01:55.663841 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:01:55.663856 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:01:55.663870 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:01:55.663891 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:01:55.665477 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:01:55.665848 | orchestrator | 2025-03-22 22:01:55.666226 | orchestrator | TASK [osism.commons.packages : Download required packages] ********************* 2025-03-22 22:01:55.666654 | orchestrator | Saturday 22 March 2025 22:01:55 +0000 (0:00:10.394) 0:01:18.175 ******** 2025-03-22 22:02:32.980823 | orchestrator | ok: [testbed-manager] 2025-03-22 22:02:32.981005 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:02:32.981034 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:02:32.983088 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:02:32.984343 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:02:32.986004 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:02:32.986592 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:02:32.987563 | orchestrator | 2025-03-22 22:02:32.988494 | orchestrator | TASK [osism.commons.packages : Install required packages] ********************** 2025-03-22 22:02:32.989361 | orchestrator | Saturday 22 March 2025 22:02:32 +0000 (0:00:37.317) 0:01:55.492 ******** 2025-03-22 22:03:45.825449 | orchestrator | changed: [testbed-manager] 2025-03-22 22:03:45.825842 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:03:45.826196 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:03:45.826278 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:03:45.828192 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:03:45.829662 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:03:45.830700 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:03:45.831323 | orchestrator | 2025-03-22 22:03:45.831834 | orchestrator | TASK [osism.commons.packages : Remove useless packages from the cache] ********* 2025-03-22 22:03:45.833257 | orchestrator | Saturday 22 March 2025 22:03:45 +0000 (0:01:12.845) 0:03:08.338 ******** 2025-03-22 22:03:47.680962 | orchestrator | ok: [testbed-manager] 2025-03-22 22:03:47.682143 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:03:47.682184 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:03:47.683262 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:03:47.683672 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:03:47.684969 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:03:47.685488 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:03:47.686620 | orchestrator | 2025-03-22 22:03:47.687762 | orchestrator | TASK [osism.commons.packages : Remove dependencies that are no longer required] *** 2025-03-22 22:03:47.687861 | orchestrator | Saturday 22 March 2025 22:03:47 +0000 (0:00:01.854) 0:03:10.193 ******** 2025-03-22 22:04:01.290194 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:04:01.290325 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:04:01.290340 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:04:01.290349 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:04:01.290377 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:04:01.290386 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:04:01.290399 | orchestrator | changed: [testbed-manager] 2025-03-22 22:04:01.290941 | orchestrator | 2025-03-22 22:04:01.291296 | orchestrator | TASK [osism.commons.sysctl : Include sysctl tasks] 
***************************** 2025-03-22 22:04:01.291538 | orchestrator | Saturday 22 March 2025 22:04:01 +0000 (0:00:13.609) 0:03:23.802 ******** 2025-03-22 22:04:01.734947 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 => (item={'key': 'elasticsearch', 'value': [{'name': 'vm.max_map_count', 'value': 262144}]}) 2025-03-22 22:04:01.736578 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 => (item={'key': 'rabbitmq', 'value': [{'name': 'net.ipv4.tcp_keepalive_time', 'value': 6}, {'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3}, {'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3}, {'name': 'net.core.wmem_max', 'value': 16777216}, {'name': 'net.core.rmem_max', 'value': 16777216}, {'name': 'net.ipv4.tcp_fin_timeout', 'value': 20}, {'name': 'net.ipv4.tcp_tw_reuse', 'value': 1}, {'name': 'net.core.somaxconn', 'value': 4096}, {'name': 'net.ipv4.tcp_syncookies', 'value': 0}, {'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192}]}) 2025-03-22 22:04:01.737652 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 => (item={'key': 'generic', 'value': [{'name': 'vm.swappiness', 'value': 1}]}) 2025-03-22 22:04:01.738667 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 => (item={'key': 'compute', 'value': [{'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576}]}) 2025-03-22 22:04:01.739544 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 => (item={'key': 'k3s_node', 'value': [{'name': 'fs.inotify.max_user_instances', 'value': 1024}]}) 2025-03-22 22:04:01.740666 | orchestrator | 2025-03-22 22:04:01.741116 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on elasticsearch] *********** 2025-03-22 22:04:01.741717 | orchestrator | Saturday 22 March 2025 22:04:01 +0000 (0:00:00.447) 0:03:24.249 ******** 2025-03-22 22:04:01.801323 | orchestrator | skipping: [testbed-manager] => (item={'name': 'vm.max_map_count', 'value': 262144})  2025-03-22 22:04:01.834715 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'vm.max_map_count', 'value': 262144})  2025-03-22 22:04:01.834754 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:04:01.868699 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'vm.max_map_count', 'value': 262144})  2025-03-22 22:04:01.868775 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:04:01.869291 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'vm.max_map_count', 'value': 262144})  2025-03-22 22:04:01.909489 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:04:01.950678 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:04:02.527514 | orchestrator | changed: [testbed-node-1] 
=> (item={'name': 'vm.max_map_count', 'value': 262144}) 2025-03-22 22:04:02.529881 | orchestrator | changed: [testbed-node-0] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2025-03-22 22:04:02.530130 | orchestrator | changed: [testbed-node-2] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2025-03-22 22:04:02.530166 | orchestrator | 2025-03-22 22:04:02.530624 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on rabbitmq] **************** 2025-03-22 22:04:02.531254 | orchestrator | Saturday 22 March 2025 22:04:02 +0000 (0:00:00.789) 0:03:25.039 ******** 2025-03-22 22:04:02.618268 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})  2025-03-22 22:04:02.618691 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})  2025-03-22 22:04:02.618790 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})  2025-03-22 22:04:02.619444 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.core.wmem_max', 'value': 16777216})  2025-03-22 22:04:02.619512 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.core.rmem_max', 'value': 16777216})  2025-03-22 22:04:02.619905 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})  2025-03-22 22:04:02.620345 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})  2025-03-22 22:04:02.620846 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})  2025-03-22 22:04:02.621005 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.core.somaxconn', 'value': 4096})  2025-03-22 22:04:02.621326 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})  2025-03-22 22:04:02.624027 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})  2025-03-22 22:04:02.624846 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})  2025-03-22 22:04:02.624890 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})  2025-03-22 22:04:02.625398 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.core.wmem_max', 'value': 16777216})  2025-03-22 22:04:02.662968 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.core.rmem_max', 'value': 16777216})  2025-03-22 22:04:02.665740 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:04:02.667221 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})  2025-03-22 22:04:02.667259 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})  2025-03-22 22:04:02.667600 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.core.somaxconn', 'value': 4096})  2025-03-22 22:04:02.667700 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})  2025-03-22 22:04:02.668000 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})  2025-03-22 22:04:02.668278 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})  2025-03-22 22:04:02.668492 | orchestrator | skipping: [testbed-node-4] => (item={'name': 
'net.ipv4.tcp_keepalive_intvl', 'value': 3})  2025-03-22 22:04:02.668733 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})  2025-03-22 22:04:02.715599 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:04:02.716183 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.core.wmem_max', 'value': 16777216})  2025-03-22 22:04:02.718626 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.core.rmem_max', 'value': 16777216})  2025-03-22 22:04:02.718907 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})  2025-03-22 22:04:02.719331 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})  2025-03-22 22:04:02.719598 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.core.somaxconn', 'value': 4096})  2025-03-22 22:04:02.719918 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})  2025-03-22 22:04:02.720591 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})  2025-03-22 22:04:02.762190 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})  2025-03-22 22:04:02.762653 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:04:02.763788 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})  2025-03-22 22:04:02.763826 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})  2025-03-22 22:04:02.764257 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.core.wmem_max', 'value': 16777216})  2025-03-22 22:04:02.764795 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.core.rmem_max', 'value': 16777216})  2025-03-22 22:04:02.765481 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})  2025-03-22 22:04:02.765774 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})  2025-03-22 22:04:02.766787 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.core.somaxconn', 'value': 4096})  2025-03-22 22:04:02.772838 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})  2025-03-22 22:04:02.798334 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})  2025-03-22 22:04:09.538105 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:04:09.538552 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6}) 2025-03-22 22:04:09.538830 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6}) 2025-03-22 22:04:09.539576 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6}) 2025-03-22 22:04:09.539798 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3}) 2025-03-22 22:04:09.540252 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3}) 2025-03-22 22:04:09.540559 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3}) 2025-03-22 22:04:09.540990 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.core.wmem_max', 'value': 
16777216}) 2025-03-22 22:04:09.541960 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3}) 2025-03-22 22:04:09.542472 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.core.rmem_max', 'value': 16777216}) 2025-03-22 22:04:09.542508 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3}) 2025-03-22 22:04:09.542941 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3}) 2025-03-22 22:04:09.544867 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20}) 2025-03-22 22:04:09.545457 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.core.wmem_max', 'value': 16777216}) 2025-03-22 22:04:09.545810 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1}) 2025-03-22 22:04:09.546405 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.core.rmem_max', 'value': 16777216}) 2025-03-22 22:04:09.546682 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.core.somaxconn', 'value': 4096}) 2025-03-22 22:04:09.547103 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20}) 2025-03-22 22:04:09.547604 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0}) 2025-03-22 22:04:09.548131 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.core.wmem_max', 'value': 16777216}) 2025-03-22 22:04:09.548215 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1}) 2025-03-22 22:04:09.548797 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192}) 2025-03-22 22:04:09.549841 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.core.rmem_max', 'value': 16777216}) 2025-03-22 22:04:09.550198 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.core.somaxconn', 'value': 4096}) 2025-03-22 22:04:09.550654 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20}) 2025-03-22 22:04:09.551144 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0}) 2025-03-22 22:04:09.551852 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1}) 2025-03-22 22:04:09.552199 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192}) 2025-03-22 22:04:09.552636 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.core.somaxconn', 'value': 4096}) 2025-03-22 22:04:09.553573 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0}) 2025-03-22 22:04:09.554125 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192}) 2025-03-22 22:04:09.554445 | orchestrator | 2025-03-22 22:04:09.556378 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on generic] ***************** 2025-03-22 22:04:09.556657 | orchestrator | Saturday 22 March 2025 22:04:09 +0000 (0:00:07.013) 0:03:32.052 ******** 2025-03-22 22:04:11.197441 | orchestrator | changed: [testbed-manager] => (item={'name': 'vm.swappiness', 'value': 1}) 2025-03-22 22:04:11.198768 | orchestrator | changed: [testbed-node-3] => (item={'name': 'vm.swappiness', 'value': 1}) 2025-03-22 22:04:11.198832 | 
orchestrator | changed: [testbed-node-4] => (item={'name': 'vm.swappiness', 'value': 1}) 2025-03-22 22:04:11.198855 | orchestrator | changed: [testbed-node-0] => (item={'name': 'vm.swappiness', 'value': 1}) 2025-03-22 22:04:11.199412 | orchestrator | changed: [testbed-node-5] => (item={'name': 'vm.swappiness', 'value': 1}) 2025-03-22 22:04:11.200240 | orchestrator | changed: [testbed-node-1] => (item={'name': 'vm.swappiness', 'value': 1}) 2025-03-22 22:04:11.201029 | orchestrator | changed: [testbed-node-2] => (item={'name': 'vm.swappiness', 'value': 1}) 2025-03-22 22:04:11.201401 | orchestrator | 2025-03-22 22:04:11.201833 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on compute] ***************** 2025-03-22 22:04:11.202503 | orchestrator | Saturday 22 March 2025 22:04:11 +0000 (0:00:01.655) 0:03:33.708 ******** 2025-03-22 22:04:11.254107 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})  2025-03-22 22:04:11.290584 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:04:11.382801 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})  2025-03-22 22:04:11.383151 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})  2025-03-22 22:04:11.756645 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:04:11.756832 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:04:11.757766 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})  2025-03-22 22:04:11.758081 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:04:11.758752 | orchestrator | changed: [testbed-node-3] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576}) 2025-03-22 22:04:11.759615 | orchestrator | changed: [testbed-node-4] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576}) 2025-03-22 22:04:11.760453 | orchestrator | changed: [testbed-node-5] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576}) 2025-03-22 22:04:11.760742 | orchestrator | 2025-03-22 22:04:11.761632 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on k3s_node] **************** 2025-03-22 22:04:11.762200 | orchestrator | Saturday 22 March 2025 22:04:11 +0000 (0:00:00.563) 0:03:34.272 ******** 2025-03-22 22:04:11.823999 | orchestrator | skipping: [testbed-manager] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})  2025-03-22 22:04:11.866830 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:04:11.954938 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})  2025-03-22 22:04:12.000821 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:04:12.001372 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})  2025-03-22 22:04:12.470719 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:04:12.471476 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})  2025-03-22 22:04:12.475561 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:04:12.479840 | orchestrator | changed: [testbed-node-3] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024}) 2025-03-22 22:04:12.480079 | orchestrator | changed: [testbed-node-4] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024}) 2025-03-22 
22:04:12.480105 | orchestrator | changed: [testbed-node-5] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024}) 2025-03-22 22:04:12.480120 | orchestrator | 2025-03-22 22:04:12.480139 | orchestrator | TASK [osism.commons.limits : Include limits tasks] ***************************** 2025-03-22 22:04:12.535786 | orchestrator | Saturday 22 March 2025 22:04:12 +0000 (0:00:00.712) 0:03:34.984 ******** 2025-03-22 22:04:12.535828 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:04:12.567202 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:04:12.593506 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:04:12.619049 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:04:12.649941 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:04:12.819331 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:04:12.820337 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:04:12.820673 | orchestrator | 2025-03-22 22:04:12.821276 | orchestrator | TASK [osism.commons.services : Populate service facts] ************************* 2025-03-22 22:04:12.824472 | orchestrator | Saturday 22 March 2025 22:04:12 +0000 (0:00:00.349) 0:03:35.334 ******** 2025-03-22 22:04:17.361677 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:04:17.362641 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:04:17.363846 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:04:17.364089 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:04:17.364571 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:04:17.365134 | orchestrator | ok: [testbed-manager] 2025-03-22 22:04:17.365514 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:04:17.365943 | orchestrator | 2025-03-22 22:04:17.366287 | orchestrator | TASK [osism.commons.services : Check services] ********************************* 2025-03-22 22:04:17.366665 | orchestrator | Saturday 22 March 2025 22:04:17 +0000 (0:00:04.543) 0:03:39.877 ******** 2025-03-22 22:04:17.426668 | orchestrator | skipping: [testbed-manager] => (item=nscd)  2025-03-22 22:04:17.453262 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:04:17.453693 | orchestrator | skipping: [testbed-node-3] => (item=nscd)  2025-03-22 22:04:17.454300 | orchestrator | skipping: [testbed-node-4] => (item=nscd)  2025-03-22 22:04:17.484332 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:04:17.485079 | orchestrator | skipping: [testbed-node-5] => (item=nscd)  2025-03-22 22:04:17.512915 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:04:17.513133 | orchestrator | skipping: [testbed-node-0] => (item=nscd)  2025-03-22 22:04:17.542136 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:04:17.542237 | orchestrator | skipping: [testbed-node-1] => (item=nscd)  2025-03-22 22:04:17.600616 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:04:17.601093 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:04:17.601126 | orchestrator | skipping: [testbed-node-2] => (item=nscd)  2025-03-22 22:04:17.602132 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:04:17.602546 | orchestrator | 2025-03-22 22:04:17.603129 | orchestrator | TASK [osism.commons.services : Start/enable required services] ***************** 2025-03-22 22:04:17.604410 | orchestrator | Saturday 22 March 2025 22:04:17 +0000 (0:00:00.240) 0:03:40.118 ******** 2025-03-22 22:04:18.579814 | orchestrator | ok: [testbed-manager] => (item=cron) 2025-03-22 22:04:18.580293 | orchestrator | ok: [testbed-node-3] => (item=cron) 2025-03-22 22:04:18.580648 | orchestrator | ok: 
[testbed-node-4] => (item=cron) 2025-03-22 22:04:18.580684 | orchestrator | ok: [testbed-node-5] => (item=cron) 2025-03-22 22:04:18.580705 | orchestrator | ok: [testbed-node-0] => (item=cron) 2025-03-22 22:04:18.580915 | orchestrator | ok: [testbed-node-1] => (item=cron) 2025-03-22 22:04:18.581305 | orchestrator | ok: [testbed-node-2] => (item=cron) 2025-03-22 22:04:18.581505 | orchestrator | 2025-03-22 22:04:18.583247 | orchestrator | TASK [osism.commons.motd : Include distribution specific configure tasks] ****** 2025-03-22 22:04:18.583459 | orchestrator | Saturday 22 March 2025 22:04:18 +0000 (0:00:00.976) 0:03:41.095 ******** 2025-03-22 22:04:19.091739 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/motd/tasks/configure-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:04:19.091889 | orchestrator | 2025-03-22 22:04:19.092082 | orchestrator | TASK [osism.commons.motd : Remove update-motd package] ************************* 2025-03-22 22:04:19.092380 | orchestrator | Saturday 22 March 2025 22:04:19 +0000 (0:00:00.512) 0:03:41.607 ******** 2025-03-22 22:04:20.278112 | orchestrator | ok: [testbed-manager] 2025-03-22 22:04:20.278290 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:04:20.279833 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:04:20.280447 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:04:20.282172 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:04:20.282627 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:04:20.282688 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:04:20.282710 | orchestrator | 2025-03-22 22:04:20.283815 | orchestrator | TASK [osism.commons.motd : Check if /etc/default/motd-news exists] ************* 2025-03-22 22:04:21.003092 | orchestrator | Saturday 22 March 2025 22:04:20 +0000 (0:00:01.185) 0:03:42.792 ******** 2025-03-22 22:04:21.003212 | orchestrator | ok: [testbed-manager] 2025-03-22 22:04:21.003408 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:04:21.003485 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:04:21.003739 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:04:21.004029 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:04:21.004478 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:04:21.004926 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:04:21.005705 | orchestrator | 2025-03-22 22:04:21.650175 | orchestrator | TASK [osism.commons.motd : Disable the dynamic motd-news service] ************** 2025-03-22 22:04:21.650397 | orchestrator | Saturday 22 March 2025 22:04:20 +0000 (0:00:00.724) 0:03:43.516 ******** 2025-03-22 22:04:21.650438 | orchestrator | changed: [testbed-manager] 2025-03-22 22:04:21.650531 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:04:21.651075 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:04:21.651116 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:04:21.652090 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:04:21.652434 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:04:21.652922 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:04:21.653541 | orchestrator | 2025-03-22 22:04:21.655049 | orchestrator | TASK [osism.commons.motd : Get all configuration files in /etc/pam.d] ********** 2025-03-22 22:04:21.656179 | orchestrator | Saturday 22 March 2025 22:04:21 +0000 (0:00:00.648) 0:03:44.165 ******** 2025-03-22 22:04:22.360823 | orchestrator | ok: [testbed-node-3] 2025-03-22 
22:04:22.361688 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:04:22.361782 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:04:22.362587 | orchestrator | ok: [testbed-manager] 2025-03-22 22:04:22.363767 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:04:22.363949 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:04:22.364805 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:04:22.365420 | orchestrator | 2025-03-22 22:04:22.366192 | orchestrator | TASK [osism.commons.motd : Remove pam_motd.so rule] **************************** 2025-03-22 22:04:22.366555 | orchestrator | Saturday 22 March 2025 22:04:22 +0000 (0:00:00.710) 0:03:44.875 ******** 2025-03-22 22:04:23.438897 | orchestrator | changed: [testbed-manager] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 591, 'dev': 2049, 'nlink': 1, 'atime': 1742679308.2255697, 'mtime': 1723170802.0, 'ctime': 1728031288.6324632, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-22 22:04:23.439404 | orchestrator | changed: [testbed-node-3] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 591, 'dev': 2049, 'nlink': 1, 'atime': 1742679313.5253024, 'mtime': 1723170802.0, 'ctime': 1728031288.6324632, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-22 22:04:23.439477 | orchestrator | changed: [testbed-node-4] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 591, 'dev': 2049, 'nlink': 1, 'atime': 1742679313.0036738, 'mtime': 1723170802.0, 'ctime': 1728031288.6324632, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-22 22:04:23.440043 | orchestrator | changed: [testbed-node-1] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 591, 'dev': 2049, 'nlink': 1, 'atime': 1742679321.3033824, 'mtime': 1723170802.0, 'ctime': 1728031288.6324632, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-22 22:04:23.442948 | orchestrator | changed: [testbed-node-0] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 591, 'dev': 2049, 'nlink': 1, 'atime': 1742679314.6290803, 'mtime': 1723170802.0, 'ctime': 1728031288.6324632, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 
'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-22 22:04:23.443405 | orchestrator | changed: [testbed-node-2] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 591, 'dev': 2049, 'nlink': 1, 'atime': 1742679316.9058056, 'mtime': 1723170802.0, 'ctime': 1728031288.6324632, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-22 22:04:23.443451 | orchestrator | changed: [testbed-node-5] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 591, 'dev': 2049, 'nlink': 1, 'atime': 1742679317.4474432, 'mtime': 1723170802.0, 'ctime': 1728031288.6324632, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-22 22:04:23.443538 | orchestrator | changed: [testbed-node-1] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 577, 'dev': 2049, 'nlink': 1, 'atime': 1742679263.3124123, 'mtime': 1712646062.0, 'ctime': 1728031288.6314633, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-22 22:04:23.443996 | orchestrator | changed: [testbed-node-4] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 577, 'dev': 2049, 'nlink': 1, 'atime': 1742679253.6542761, 'mtime': 1712646062.0, 'ctime': 1728031288.6314633, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-22 22:04:23.444592 | orchestrator | changed: [testbed-node-0] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 577, 'dev': 2049, 'nlink': 1, 'atime': 1742679251.18014, 'mtime': 1712646062.0, 'ctime': 1728031288.6314633, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-22 22:04:23.445033 | orchestrator | changed: [testbed-node-3] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 577, 'dev': 2049, 'nlink': 1, 'atime': 1742679256.3761017, 'mtime': 1712646062.0, 'ctime': 1728031288.6314633, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 
'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-22 22:04:23.445898 | orchestrator | changed: [testbed-manager] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 577, 'dev': 2049, 'nlink': 1, 'atime': 1742679331.4456208, 'mtime': 1712646062.0, 'ctime': 1728031288.6314633, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-22 22:04:23.446130 | orchestrator | changed: [testbed-node-2] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 577, 'dev': 2049, 'nlink': 1, 'atime': 1742679260.4253469, 'mtime': 1712646062.0, 'ctime': 1728031288.6314633, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-22 22:04:23.446764 | orchestrator | changed: [testbed-node-5] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 577, 'dev': 2049, 'nlink': 1, 'atime': 1742679253.8328266, 'mtime': 1712646062.0, 'ctime': 1728031288.6314633, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-03-22 22:04:23.447282 | orchestrator | 2025-03-22 22:04:23.447689 | orchestrator | TASK [osism.commons.motd : Copy motd file] ************************************* 2025-03-22 22:04:23.447965 | orchestrator | Saturday 22 March 2025 22:04:23 +0000 (0:00:01.076) 0:03:45.952 ******** 2025-03-22 22:04:24.728920 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:04:24.729592 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:04:24.729649 | orchestrator | changed: [testbed-manager] 2025-03-22 22:04:24.730180 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:04:24.730565 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:04:24.731136 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:04:24.731489 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:04:24.731906 | orchestrator | 2025-03-22 22:04:24.732715 | orchestrator | TASK [osism.commons.motd : Copy issue file] ************************************ 2025-03-22 22:04:25.942692 | orchestrator | Saturday 22 March 2025 22:04:24 +0000 (0:00:01.290) 0:03:47.243 ******** 2025-03-22 22:04:25.942815 | orchestrator | changed: [testbed-manager] 2025-03-22 22:04:25.943877 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:04:25.945632 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:04:25.947374 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:04:25.949138 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:04:25.949963 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:04:25.950929 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:04:25.951296 | orchestrator | 2025-03-22 22:04:25.952019 | orchestrator | TASK [osism.commons.motd : Configure SSH to print the 
motd] ******************** 2025-03-22 22:04:25.952921 | orchestrator | Saturday 22 March 2025 22:04:25 +0000 (0:00:01.211) 0:03:48.455 ******** 2025-03-22 22:04:26.032516 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:04:26.075408 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:04:26.121587 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:04:26.170075 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:04:26.219132 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:04:26.288192 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:04:26.288415 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:04:26.288540 | orchestrator | 2025-03-22 22:04:26.289079 | orchestrator | TASK [osism.commons.motd : Configure SSH to not print the motd] **************** 2025-03-22 22:04:26.289198 | orchestrator | Saturday 22 March 2025 22:04:26 +0000 (0:00:00.349) 0:03:48.804 ******** 2025-03-22 22:04:27.099910 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:04:27.102698 | orchestrator | ok: [testbed-manager] 2025-03-22 22:04:27.102895 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:04:27.102927 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:04:27.102942 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:04:27.102957 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:04:27.102976 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:04:27.103625 | orchestrator | 2025-03-22 22:04:27.103726 | orchestrator | TASK [osism.services.rng : Include distribution specific install tasks] ******** 2025-03-22 22:04:27.104756 | orchestrator | Saturday 22 March 2025 22:04:27 +0000 (0:00:00.807) 0:03:49.612 ******** 2025-03-22 22:04:27.591219 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/rng/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:04:27.591828 | orchestrator | 2025-03-22 22:04:27.591881 | orchestrator | TASK [osism.services.rng : Install rng package] ******************************** 2025-03-22 22:04:27.592514 | orchestrator | Saturday 22 March 2025 22:04:27 +0000 (0:00:00.493) 0:03:50.105 ******** 2025-03-22 22:04:35.578227 | orchestrator | ok: [testbed-manager] 2025-03-22 22:04:35.578481 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:04:35.579077 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:04:35.579289 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:04:35.580117 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:04:35.581587 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:04:35.582406 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:04:35.582831 | orchestrator | 2025-03-22 22:04:35.583301 | orchestrator | TASK [osism.services.rng : Remove haveged package] ***************************** 2025-03-22 22:04:35.583564 | orchestrator | Saturday 22 March 2025 22:04:35 +0000 (0:00:07.988) 0:03:58.093 ******** 2025-03-22 22:04:36.908239 | orchestrator | ok: [testbed-manager] 2025-03-22 22:04:36.908471 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:04:36.908500 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:04:36.910411 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:04:36.911221 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:04:36.912441 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:04:36.913594 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:04:36.914532 | orchestrator | 2025-03-22 22:04:36.914646 | orchestrator | TASK 
[osism.services.rng : Manage rng service] ********************************* 2025-03-22 22:04:36.915181 | orchestrator | Saturday 22 March 2025 22:04:36 +0000 (0:00:01.325) 0:03:59.418 ******** 2025-03-22 22:04:37.979847 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:04:37.980309 | orchestrator | ok: [testbed-manager] 2025-03-22 22:04:37.980786 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:04:37.981523 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:04:37.981930 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:04:37.982397 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:04:37.985220 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:04:38.459315 | orchestrator | 2025-03-22 22:04:38.459468 | orchestrator | TASK [osism.services.smartd : Include distribution specific install tasks] ***** 2025-03-22 22:04:38.459489 | orchestrator | Saturday 22 March 2025 22:04:37 +0000 (0:00:01.074) 0:04:00.493 ******** 2025-03-22 22:04:38.459520 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/smartd/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:04:38.459829 | orchestrator | 2025-03-22 22:04:38.460531 | orchestrator | TASK [osism.services.smartd : Install smartmontools package] ******************* 2025-03-22 22:04:38.460650 | orchestrator | Saturday 22 March 2025 22:04:38 +0000 (0:00:00.480) 0:04:00.974 ******** 2025-03-22 22:04:47.812564 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:04:47.812784 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:04:47.812812 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:04:47.812834 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:04:47.814007 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:04:47.815485 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:04:47.816587 | orchestrator | changed: [testbed-manager] 2025-03-22 22:04:47.816629 | orchestrator | 2025-03-22 22:04:47.817148 | orchestrator | TASK [osism.services.smartd : Create /var/log/smartd directory] **************** 2025-03-22 22:04:47.818188 | orchestrator | Saturday 22 March 2025 22:04:47 +0000 (0:00:09.350) 0:04:10.325 ******** 2025-03-22 22:04:48.471739 | orchestrator | changed: [testbed-manager] 2025-03-22 22:04:48.473298 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:04:48.474737 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:04:48.475420 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:04:48.476491 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:04:48.477182 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:04:48.478472 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:04:48.478567 | orchestrator | 2025-03-22 22:04:48.478591 | orchestrator | TASK [osism.services.smartd : Copy smartmontools configuration file] *********** 2025-03-22 22:04:48.479061 | orchestrator | Saturday 22 March 2025 22:04:48 +0000 (0:00:00.661) 0:04:10.986 ******** 2025-03-22 22:04:49.669324 | orchestrator | changed: [testbed-manager] 2025-03-22 22:04:49.671445 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:04:49.671484 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:04:49.671920 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:04:49.672856 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:04:49.673251 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:04:49.674161 | orchestrator | changed: [testbed-node-2] 
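The smartd preparation logged above (installing smartmontools, creating /var/log/smartd, copying the smartmontools configuration) corresponds roughly to the sketch below; the service itself is handled in the task that follows in the log. This is an illustrative approximation, not the actual osism.services.smartd tasks; the DEVICESCAN policy and file modes are assumptions.

    # Sketch under assumptions -- not the real osism.services.smartd tasks.
    - name: Install smartmontools package
      ansible.builtin.apt:
        name: smartmontools
        state: present

    - name: Create /var/log/smartd directory
      ansible.builtin.file:
        path: /var/log/smartd
        state: directory
        mode: "0755"

    - name: Copy smartmontools configuration file
      ansible.builtin.copy:
        dest: /etc/smartd.conf
        content: |
          # hypothetical minimal policy: scan all devices, monitor all attributes
          DEVICESCAN -a
        mode: "0644"
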
2025-03-22 22:04:49.674878 | orchestrator | 2025-03-22 22:04:49.675217 | orchestrator | TASK [osism.services.smartd : Manage smartd service] *************************** 2025-03-22 22:04:49.675678 | orchestrator | Saturday 22 March 2025 22:04:49 +0000 (0:00:01.195) 0:04:12.182 ******** 2025-03-22 22:04:50.983116 | orchestrator | changed: [testbed-manager] 2025-03-22 22:04:50.983580 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:04:50.984553 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:04:50.985275 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:04:50.987540 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:04:50.988202 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:04:50.989166 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:04:50.989741 | orchestrator | 2025-03-22 22:04:50.990543 | orchestrator | TASK [osism.commons.cleanup : Gather variables for each operating system] ****** 2025-03-22 22:04:50.991579 | orchestrator | Saturday 22 March 2025 22:04:50 +0000 (0:00:01.313) 0:04:13.495 ******** 2025-03-22 22:04:51.106899 | orchestrator | ok: [testbed-manager] 2025-03-22 22:04:51.159473 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:04:51.197936 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:04:51.234605 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:04:51.278633 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:04:51.359581 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:04:51.360804 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:04:51.360841 | orchestrator | 2025-03-22 22:04:51.361663 | orchestrator | TASK [osism.commons.cleanup : Set cleanup_packages_distribution variable to default value] *** 2025-03-22 22:04:51.362396 | orchestrator | Saturday 22 March 2025 22:04:51 +0000 (0:00:00.379) 0:04:13.875 ******** 2025-03-22 22:04:51.496920 | orchestrator | ok: [testbed-manager] 2025-03-22 22:04:51.535198 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:04:51.573605 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:04:51.614699 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:04:51.732916 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:04:51.734484 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:04:51.736319 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:04:51.737173 | orchestrator | 2025-03-22 22:04:51.738705 | orchestrator | TASK [osism.commons.cleanup : Set cleanup_services_distribution variable to default value] *** 2025-03-22 22:04:51.739740 | orchestrator | Saturday 22 March 2025 22:04:51 +0000 (0:00:00.372) 0:04:14.247 ******** 2025-03-22 22:04:51.854780 | orchestrator | ok: [testbed-manager] 2025-03-22 22:04:51.902369 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:04:51.941850 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:04:51.979409 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:04:52.063857 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:04:52.064416 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:04:52.065554 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:04:52.065641 | orchestrator | 2025-03-22 22:04:52.066740 | orchestrator | TASK [osism.commons.cleanup : Populate service facts] ************************** 2025-03-22 22:04:52.067698 | orchestrator | Saturday 22 March 2025 22:04:52 +0000 (0:00:00.330) 0:04:14.578 ******** 2025-03-22 22:04:56.773467 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:04:56.774821 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:04:56.775164 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:04:56.775471 | orchestrator 
| ok: [testbed-node-0] 2025-03-22 22:04:56.775798 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:04:56.776116 | orchestrator | ok: [testbed-manager] 2025-03-22 22:04:56.777477 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:04:56.778065 | orchestrator | 2025-03-22 22:04:56.780664 | orchestrator | TASK [osism.commons.cleanup : Include distribution specific timer tasks] ******* 2025-03-22 22:04:56.781786 | orchestrator | Saturday 22 March 2025 22:04:56 +0000 (0:00:04.710) 0:04:19.288 ******** 2025-03-22 22:04:57.265726 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/timers-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:04:57.267683 | orchestrator | 2025-03-22 22:04:57.268170 | orchestrator | TASK [osism.commons.cleanup : Disable apt-daily timers] ************************ 2025-03-22 22:04:57.269395 | orchestrator | Saturday 22 March 2025 22:04:57 +0000 (0:00:00.490) 0:04:19.779 ******** 2025-03-22 22:04:57.369414 | orchestrator | skipping: [testbed-manager] => (item=apt-daily-upgrade)  2025-03-22 22:04:57.371812 | orchestrator | skipping: [testbed-manager] => (item=apt-daily)  2025-03-22 22:04:57.371845 | orchestrator | skipping: [testbed-node-3] => (item=apt-daily-upgrade)  2025-03-22 22:04:57.415709 | orchestrator | skipping: [testbed-node-3] => (item=apt-daily)  2025-03-22 22:04:57.415762 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:04:57.416789 | orchestrator | skipping: [testbed-node-4] => (item=apt-daily-upgrade)  2025-03-22 22:04:57.472496 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:04:57.475014 | orchestrator | skipping: [testbed-node-4] => (item=apt-daily)  2025-03-22 22:04:57.475383 | orchestrator | skipping: [testbed-node-5] => (item=apt-daily-upgrade)  2025-03-22 22:04:57.475836 | orchestrator | skipping: [testbed-node-5] => (item=apt-daily)  2025-03-22 22:04:57.516915 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:04:57.517521 | orchestrator | skipping: [testbed-node-0] => (item=apt-daily-upgrade)  2025-03-22 22:04:57.573744 | orchestrator | skipping: [testbed-node-0] => (item=apt-daily)  2025-03-22 22:04:57.574379 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:04:57.574846 | orchestrator | skipping: [testbed-node-1] => (item=apt-daily-upgrade)  2025-03-22 22:04:57.575501 | orchestrator | skipping: [testbed-node-1] => (item=apt-daily)  2025-03-22 22:04:57.671395 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:04:57.675135 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:04:57.676190 | orchestrator | skipping: [testbed-node-2] => (item=apt-daily-upgrade)  2025-03-22 22:04:57.684082 | orchestrator | skipping: [testbed-node-2] => (item=apt-daily)  2025-03-22 22:04:58.165020 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:04:58.165132 | orchestrator | 2025-03-22 22:04:58.165154 | orchestrator | TASK [osism.commons.cleanup : Include service tasks] *************************** 2025-03-22 22:04:58.165171 | orchestrator | Saturday 22 March 2025 22:04:57 +0000 (0:00:00.404) 0:04:20.184 ******** 2025-03-22 22:04:58.165203 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/services-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:04:58.165960 | orchestrator | 2025-03-22 22:04:58.166388 
| orchestrator | TASK [osism.commons.cleanup : Cleanup services] ******************************** 2025-03-22 22:04:58.166729 | orchestrator | Saturday 22 March 2025 22:04:58 +0000 (0:00:00.495) 0:04:20.679 ******** 2025-03-22 22:04:58.252541 | orchestrator | skipping: [testbed-manager] => (item=ModemManager.service)  2025-03-22 22:04:58.305622 | orchestrator | skipping: [testbed-node-3] => (item=ModemManager.service)  2025-03-22 22:04:58.305681 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:04:58.361929 | orchestrator | skipping: [testbed-node-4] => (item=ModemManager.service)  2025-03-22 22:04:58.361971 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:04:58.362651 | orchestrator | skipping: [testbed-node-5] => (item=ModemManager.service)  2025-03-22 22:04:58.413509 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:04:58.414186 | orchestrator | skipping: [testbed-node-0] => (item=ModemManager.service)  2025-03-22 22:04:58.462358 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:04:58.463229 | orchestrator | skipping: [testbed-node-1] => (item=ModemManager.service)  2025-03-22 22:04:58.545217 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:04:58.547321 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:04:58.550621 | orchestrator | skipping: [testbed-node-2] => (item=ModemManager.service)  2025-03-22 22:04:58.551939 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:04:58.551969 | orchestrator | 2025-03-22 22:04:58.553722 | orchestrator | TASK [osism.commons.cleanup : Include packages tasks] ************************** 2025-03-22 22:04:58.554315 | orchestrator | Saturday 22 March 2025 22:04:58 +0000 (0:00:00.382) 0:04:21.061 ******** 2025-03-22 22:04:59.043230 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/packages-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:04:59.043945 | orchestrator | 2025-03-22 22:04:59.044476 | orchestrator | TASK [osism.commons.cleanup : Cleanup installed packages] ********************** 2025-03-22 22:04:59.045517 | orchestrator | Saturday 22 March 2025 22:04:59 +0000 (0:00:00.495) 0:04:21.557 ******** 2025-03-22 22:05:30.416475 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:05:30.416679 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:05:30.416706 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:05:30.416721 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:05:30.416736 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:05:30.416751 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:05:30.416771 | orchestrator | changed: [testbed-manager] 2025-03-22 22:05:30.416967 | orchestrator | 2025-03-22 22:05:30.418377 | orchestrator | TASK [osism.commons.cleanup : Remove cloudinit package] ************************ 2025-03-22 22:05:30.418763 | orchestrator | Saturday 22 March 2025 22:05:30 +0000 (0:00:31.369) 0:04:52.927 ******** 2025-03-22 22:05:38.465201 | orchestrator | changed: [testbed-manager] 2025-03-22 22:05:38.465500 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:05:38.467211 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:05:38.470559 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:05:38.471246 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:05:38.471279 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:05:38.471984 | orchestrator | changed: 
[testbed-node-2] 2025-03-22 22:05:38.472664 | orchestrator | 2025-03-22 22:05:38.473193 | orchestrator | TASK [osism.commons.cleanup : Uninstall unattended-upgrades package] *********** 2025-03-22 22:05:38.473760 | orchestrator | Saturday 22 March 2025 22:05:38 +0000 (0:00:08.052) 0:05:00.979 ******** 2025-03-22 22:05:46.144750 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:05:46.144948 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:05:46.144977 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:05:46.145345 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:05:46.145661 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:05:46.146533 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:05:46.147204 | orchestrator | changed: [testbed-manager] 2025-03-22 22:05:46.148744 | orchestrator | 2025-03-22 22:05:46.150082 | orchestrator | TASK [osism.commons.cleanup : Remove useless packages from the cache] ********** 2025-03-22 22:05:46.150748 | orchestrator | Saturday 22 March 2025 22:05:46 +0000 (0:00:07.679) 0:05:08.659 ******** 2025-03-22 22:05:47.919938 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:05:47.920138 | orchestrator | ok: [testbed-manager] 2025-03-22 22:05:47.923516 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:05:47.923899 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:05:47.923930 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:05:47.924604 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:05:47.924953 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:05:47.925739 | orchestrator | 2025-03-22 22:05:47.926437 | orchestrator | TASK [osism.commons.cleanup : Remove dependencies that are no longer required] *** 2025-03-22 22:05:47.926756 | orchestrator | Saturday 22 March 2025 22:05:47 +0000 (0:00:01.773) 0:05:10.432 ******** 2025-03-22 22:05:54.081358 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:05:54.081514 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:05:54.081831 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:05:54.082960 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:05:54.083903 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:05:54.084862 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:05:54.085531 | orchestrator | changed: [testbed-manager] 2025-03-22 22:05:54.086501 | orchestrator | 2025-03-22 22:05:54.087152 | orchestrator | TASK [osism.commons.cleanup : Include cloudinit tasks] ************************* 2025-03-22 22:05:54.087826 | orchestrator | Saturday 22 March 2025 22:05:54 +0000 (0:00:06.162) 0:05:16.595 ******** 2025-03-22 22:05:54.632543 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/cloudinit.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:05:54.633263 | orchestrator | 2025-03-22 22:05:54.634504 | orchestrator | TASK [osism.commons.cleanup : Remove cloud-init configuration directory] ******* 2025-03-22 22:05:54.634701 | orchestrator | Saturday 22 March 2025 22:05:54 +0000 (0:00:00.552) 0:05:17.147 ******** 2025-03-22 22:05:55.447039 | orchestrator | changed: [testbed-manager] 2025-03-22 22:05:55.447223 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:05:55.447783 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:05:55.448275 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:05:55.448536 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:05:55.448988 | orchestrator | 
changed: [testbed-node-1] 2025-03-22 22:05:55.452531 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:05:57.150103 | orchestrator | 2025-03-22 22:05:57.150214 | orchestrator | TASK [osism.commons.timezone : Install tzdata package] ************************* 2025-03-22 22:05:57.150233 | orchestrator | Saturday 22 March 2025 22:05:55 +0000 (0:00:00.812) 0:05:17.960 ******** 2025-03-22 22:05:57.150263 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:05:57.151104 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:05:57.152732 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:05:57.153611 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:05:57.158668 | orchestrator | ok: [testbed-manager] 2025-03-22 22:05:57.159941 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:05:57.161749 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:05:57.162387 | orchestrator | 2025-03-22 22:05:57.163527 | orchestrator | TASK [osism.commons.timezone : Set timezone to UTC] **************************** 2025-03-22 22:05:57.164009 | orchestrator | Saturday 22 March 2025 22:05:57 +0000 (0:00:01.703) 0:05:19.663 ******** 2025-03-22 22:05:58.028797 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:05:58.029380 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:05:58.029419 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:05:58.030180 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:05:58.030870 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:05:58.032285 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:05:58.032719 | orchestrator | changed: [testbed-manager] 2025-03-22 22:05:58.032749 | orchestrator | 2025-03-22 22:05:58.035042 | orchestrator | TASK [osism.commons.timezone : Create /etc/adjtime file] *********************** 2025-03-22 22:05:58.035977 | orchestrator | Saturday 22 March 2025 22:05:58 +0000 (0:00:00.880) 0:05:20.543 ******** 2025-03-22 22:05:58.106985 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:05:58.141974 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:05:58.229819 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:05:58.274209 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:05:58.341631 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:05:58.343046 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:05:58.343889 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:05:58.344600 | orchestrator | 2025-03-22 22:05:58.345516 | orchestrator | TASK [osism.commons.timezone : Ensure UTC in /etc/adjtime] ********************* 2025-03-22 22:05:58.346090 | orchestrator | Saturday 22 March 2025 22:05:58 +0000 (0:00:00.312) 0:05:20.855 ******** 2025-03-22 22:05:58.471964 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:05:58.513525 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:05:58.554370 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:05:58.593770 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:05:58.816991 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:05:58.817178 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:05:58.820476 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:05:58.958517 | orchestrator | 2025-03-22 22:05:58.958551 | orchestrator | TASK [osism.services.docker : Gather variables for each operating system] ****** 2025-03-22 22:05:58.958564 | orchestrator | Saturday 22 March 2025 22:05:58 +0000 (0:00:00.476) 0:05:21.332 ******** 2025-03-22 22:05:58.958581 | orchestrator | ok: [testbed-manager] 2025-03-22 
22:05:58.993810 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:05:59.042881 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:05:59.091466 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:05:59.174134 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:05:59.175185 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:05:59.175213 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:05:59.176502 | orchestrator | 2025-03-22 22:05:59.176831 | orchestrator | TASK [osism.services.docker : Set docker_version variable to default value] **** 2025-03-22 22:05:59.177382 | orchestrator | Saturday 22 March 2025 22:05:59 +0000 (0:00:00.357) 0:05:21.689 ******** 2025-03-22 22:05:59.277287 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:05:59.330632 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:05:59.414675 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:05:59.476236 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:05:59.537798 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:05:59.608818 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:05:59.610715 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:05:59.613342 | orchestrator | 2025-03-22 22:05:59.614274 | orchestrator | TASK [osism.services.docker : Set docker_cli_version variable to default value] *** 2025-03-22 22:05:59.615500 | orchestrator | Saturday 22 March 2025 22:05:59 +0000 (0:00:00.433) 0:05:22.122 ******** 2025-03-22 22:05:59.688985 | orchestrator | ok: [testbed-manager] 2025-03-22 22:05:59.786239 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:05:59.848446 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:05:59.871113 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:05:59.961857 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:05:59.963155 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:05:59.964914 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:05:59.966611 | orchestrator | 2025-03-22 22:05:59.966827 | orchestrator | TASK [osism.services.docker : Include block storage tasks] ********************* 2025-03-22 22:05:59.969127 | orchestrator | Saturday 22 March 2025 22:05:59 +0000 (0:00:00.355) 0:05:22.478 ******** 2025-03-22 22:06:00.058247 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:06:00.100172 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:06:00.145072 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:06:00.181070 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:06:00.222104 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:06:00.300674 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:06:00.301543 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:06:00.302251 | orchestrator | 2025-03-22 22:06:00.303192 | orchestrator | TASK [osism.services.docker : Include zram storage tasks] ********************** 2025-03-22 22:06:00.303748 | orchestrator | Saturday 22 March 2025 22:06:00 +0000 (0:00:00.336) 0:05:22.815 ******** 2025-03-22 22:06:00.379533 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:06:00.422814 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:06:00.461739 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:06:00.494478 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:06:00.532303 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:06:00.616105 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:06:00.617188 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:06:00.617216 | orchestrator | 2025-03-22 22:06:00.617928 | 
orchestrator | TASK [osism.services.docker : Include docker install tasks] ******************** 2025-03-22 22:06:00.618762 | orchestrator | Saturday 22 March 2025 22:06:00 +0000 (0:00:00.316) 0:05:23.131 ******** 2025-03-22 22:06:01.254347 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/install-docker-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:06:01.254521 | orchestrator | 2025-03-22 22:06:01.255876 | orchestrator | TASK [osism.services.docker : Remove old architecture-dependent repository] **** 2025-03-22 22:06:01.258486 | orchestrator | Saturday 22 March 2025 22:06:01 +0000 (0:00:00.637) 0:05:23.769 ******** 2025-03-22 22:06:02.258545 | orchestrator | ok: [testbed-manager] 2025-03-22 22:06:02.259027 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:06:02.260042 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:06:02.260806 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:06:02.261585 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:06:02.262120 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:06:02.262671 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:06:02.263136 | orchestrator | 2025-03-22 22:06:02.264172 | orchestrator | TASK [osism.services.docker : Gather package facts] **************************** 2025-03-22 22:06:02.264367 | orchestrator | Saturday 22 March 2025 22:06:02 +0000 (0:00:01.004) 0:05:24.773 ******** 2025-03-22 22:06:05.395984 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:06:05.397245 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:06:05.397659 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:06:05.398931 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:06:05.399864 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:06:05.400225 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:06:05.401862 | orchestrator | ok: [testbed-manager] 2025-03-22 22:06:05.402125 | orchestrator | 2025-03-22 22:06:05.403127 | orchestrator | TASK [osism.services.docker : Check whether packages are installed that should not be installed] *** 2025-03-22 22:06:05.403599 | orchestrator | Saturday 22 March 2025 22:06:05 +0000 (0:00:03.136) 0:05:27.910 ******** 2025-03-22 22:06:05.501503 | orchestrator | skipping: [testbed-manager] => (item=containerd)  2025-03-22 22:06:05.502604 | orchestrator | skipping: [testbed-manager] => (item=docker.io)  2025-03-22 22:06:05.502809 | orchestrator | skipping: [testbed-manager] => (item=docker-engine)  2025-03-22 22:06:05.586130 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:06:05.586476 | orchestrator | skipping: [testbed-node-3] => (item=containerd)  2025-03-22 22:06:05.586897 | orchestrator | skipping: [testbed-node-3] => (item=docker.io)  2025-03-22 22:06:05.587256 | orchestrator | skipping: [testbed-node-3] => (item=docker-engine)  2025-03-22 22:06:05.663821 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:06:05.664866 | orchestrator | skipping: [testbed-node-4] => (item=containerd)  2025-03-22 22:06:05.665208 | orchestrator | skipping: [testbed-node-4] => (item=docker.io)  2025-03-22 22:06:05.666532 | orchestrator | skipping: [testbed-node-4] => (item=docker-engine)  2025-03-22 22:06:05.739360 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:06:05.740883 | orchestrator | skipping: [testbed-node-5] => (item=containerd)  2025-03-22 22:06:05.742536 | orchestrator | skipping: [testbed-node-5] => (item=docker.io)  2025-03-22 
22:06:05.743259 | orchestrator | skipping: [testbed-node-5] => (item=docker-engine)  2025-03-22 22:06:05.834628 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:06:05.834985 | orchestrator | skipping: [testbed-node-0] => (item=containerd)  2025-03-22 22:06:05.835661 | orchestrator | skipping: [testbed-node-0] => (item=docker.io)  2025-03-22 22:06:05.836151 | orchestrator | skipping: [testbed-node-0] => (item=docker-engine)  2025-03-22 22:06:05.910533 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:06:05.911093 | orchestrator | skipping: [testbed-node-1] => (item=containerd)  2025-03-22 22:06:05.911850 | orchestrator | skipping: [testbed-node-1] => (item=docker.io)  2025-03-22 22:06:05.913088 | orchestrator | skipping: [testbed-node-1] => (item=docker-engine)  2025-03-22 22:06:06.053525 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:06:06.054364 | orchestrator | skipping: [testbed-node-2] => (item=containerd)  2025-03-22 22:06:06.054625 | orchestrator | skipping: [testbed-node-2] => (item=docker.io)  2025-03-22 22:06:06.055549 | orchestrator | skipping: [testbed-node-2] => (item=docker-engine)  2025-03-22 22:06:06.056169 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:06:06.056776 | orchestrator | 2025-03-22 22:06:06.057379 | orchestrator | TASK [osism.services.docker : Install apt-transport-https package] ************* 2025-03-22 22:06:12.685228 | orchestrator | Saturday 22 March 2025 22:06:06 +0000 (0:00:00.657) 0:05:28.568 ******** 2025-03-22 22:06:12.685394 | orchestrator | ok: [testbed-manager] 2025-03-22 22:06:12.685469 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:06:12.685494 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:06:12.685643 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:06:12.686224 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:06:12.686690 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:06:12.693422 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:06:13.883176 | orchestrator | 2025-03-22 22:06:13.883280 | orchestrator | TASK [osism.services.docker : Add repository gpg key] ************************** 2025-03-22 22:06:13.883295 | orchestrator | Saturday 22 March 2025 22:06:12 +0000 (0:00:06.629) 0:05:35.197 ******** 2025-03-22 22:06:13.883365 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:06:13.883942 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:06:13.883965 | orchestrator | ok: [testbed-manager] 2025-03-22 22:06:13.883984 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:06:13.887286 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:06:13.887429 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:06:13.887449 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:06:13.887461 | orchestrator | 2025-03-22 22:06:13.887473 | orchestrator | TASK [osism.services.docker : Add repository] ********************************** 2025-03-22 22:06:13.887489 | orchestrator | Saturday 22 March 2025 22:06:13 +0000 (0:00:01.200) 0:05:36.397 ******** 2025-03-22 22:06:22.028782 | orchestrator | ok: [testbed-manager] 2025-03-22 22:06:22.029023 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:06:22.029983 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:06:22.033646 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:06:22.034546 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:06:22.034582 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:06:22.035429 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:06:22.036345 
| orchestrator | 2025-03-22 22:06:22.038637 | orchestrator | TASK [osism.services.docker : Update package cache] **************************** 2025-03-22 22:06:22.040024 | orchestrator | Saturday 22 March 2025 22:06:22 +0000 (0:00:08.143) 0:05:44.541 ******** 2025-03-22 22:06:25.309391 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:06:25.309767 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:06:25.310684 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:06:25.311843 | orchestrator | changed: [testbed-manager] 2025-03-22 22:06:25.313070 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:06:25.313273 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:06:25.314724 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:06:25.316110 | orchestrator | 2025-03-22 22:06:25.317053 | orchestrator | TASK [osism.services.docker : Pin docker package version] ********************** 2025-03-22 22:06:25.318117 | orchestrator | Saturday 22 March 2025 22:06:25 +0000 (0:00:03.281) 0:05:47.822 ******** 2025-03-22 22:06:27.132699 | orchestrator | ok: [testbed-manager] 2025-03-22 22:06:27.137182 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:06:27.138180 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:06:27.139047 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:06:27.140203 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:06:27.140251 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:06:27.142958 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:06:27.143558 | orchestrator | 2025-03-22 22:06:27.144068 | orchestrator | TASK [osism.services.docker : Pin docker-cli package version] ****************** 2025-03-22 22:06:27.144480 | orchestrator | Saturday 22 March 2025 22:06:27 +0000 (0:00:01.821) 0:05:49.643 ******** 2025-03-22 22:06:28.555136 | orchestrator | ok: [testbed-manager] 2025-03-22 22:06:28.555501 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:06:28.556616 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:06:28.557376 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:06:28.557952 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:06:28.560234 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:06:28.560462 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:06:28.561105 | orchestrator | 2025-03-22 22:06:28.561834 | orchestrator | TASK [osism.services.docker : Unlock containerd package] *********************** 2025-03-22 22:06:28.562494 | orchestrator | Saturday 22 March 2025 22:06:28 +0000 (0:00:01.420) 0:05:51.064 ******** 2025-03-22 22:06:28.796441 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:06:28.884496 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:06:28.967777 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:06:29.053513 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:06:29.270747 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:06:29.270864 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:06:29.271962 | orchestrator | changed: [testbed-manager] 2025-03-22 22:06:29.275278 | orchestrator | 2025-03-22 22:06:39.438398 | orchestrator | TASK [osism.services.docker : Install containerd package] ********************** 2025-03-22 22:06:39.438522 | orchestrator | Saturday 22 March 2025 22:06:29 +0000 (0:00:00.718) 0:05:51.782 ******** 2025-03-22 22:06:39.438556 | orchestrator | ok: [testbed-manager] 2025-03-22 22:06:39.440025 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:06:39.440060 | orchestrator | changed: 
[testbed-node-4] 2025-03-22 22:06:39.440464 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:06:39.440837 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:06:39.442707 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:06:39.442881 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:06:39.443454 | orchestrator | 2025-03-22 22:06:39.443622 | orchestrator | TASK [osism.services.docker : Lock containerd package] ************************* 2025-03-22 22:06:39.443868 | orchestrator | Saturday 22 March 2025 22:06:39 +0000 (0:00:10.171) 0:06:01.953 ******** 2025-03-22 22:06:40.442697 | orchestrator | changed: [testbed-manager] 2025-03-22 22:06:40.443370 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:06:40.449073 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:06:40.460582 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:06:53.499342 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:06:53.499481 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:06:53.499503 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:06:53.499551 | orchestrator | 2025-03-22 22:06:53.499569 | orchestrator | TASK [osism.services.docker : Install docker-cli package] ********************** 2025-03-22 22:06:53.499585 | orchestrator | Saturday 22 March 2025 22:06:40 +0000 (0:00:00.995) 0:06:02.949 ******** 2025-03-22 22:06:53.499617 | orchestrator | ok: [testbed-manager] 2025-03-22 22:06:53.499690 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:06:53.499714 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:06:53.500940 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:06:53.501667 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:06:53.502550 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:06:53.502879 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:06:53.504349 | orchestrator | 2025-03-22 22:06:53.504721 | orchestrator | TASK [osism.services.docker : Install docker package] ************************** 2025-03-22 22:06:53.505370 | orchestrator | Saturday 22 March 2025 22:06:53 +0000 (0:00:13.061) 0:06:16.010 ******** 2025-03-22 22:07:06.405079 | orchestrator | ok: [testbed-manager] 2025-03-22 22:07:06.405505 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:07:06.405546 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:07:06.405562 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:07:06.405586 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:07:06.405958 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:07:06.406630 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:07:06.409022 | orchestrator | 2025-03-22 22:07:06.828773 | orchestrator | TASK [osism.services.docker : Unblock installation of python docker packages] *** 2025-03-22 22:07:06.828879 | orchestrator | Saturday 22 March 2025 22:07:06 +0000 (0:00:12.908) 0:06:28.919 ******** 2025-03-22 22:07:06.828913 | orchestrator | ok: [testbed-manager] => (item=python3-docker) 2025-03-22 22:07:06.931768 | orchestrator | ok: [testbed-node-3] => (item=python3-docker) 2025-03-22 22:07:07.727215 | orchestrator | ok: [testbed-node-4] => (item=python3-docker) 2025-03-22 22:07:07.727568 | orchestrator | ok: [testbed-node-5] => (item=python3-docker) 2025-03-22 22:07:07.727685 | orchestrator | ok: [testbed-node-0] => (item=python3-docker) 2025-03-22 22:07:07.728641 | orchestrator | ok: [testbed-node-1] => (item=python3-docker) 2025-03-22 22:07:07.728776 | orchestrator | ok: [testbed-manager] => (item=python-docker) 2025-03-22 
22:07:07.729186 | orchestrator | ok: [testbed-node-3] => (item=python-docker) 2025-03-22 22:07:07.729510 | orchestrator | ok: [testbed-node-4] => (item=python-docker) 2025-03-22 22:07:07.729959 | orchestrator | ok: [testbed-node-2] => (item=python3-docker) 2025-03-22 22:07:07.730434 | orchestrator | ok: [testbed-node-5] => (item=python-docker) 2025-03-22 22:07:07.733322 | orchestrator | ok: [testbed-node-0] => (item=python-docker) 2025-03-22 22:07:07.733393 | orchestrator | ok: [testbed-node-1] => (item=python-docker) 2025-03-22 22:07:07.880992 | orchestrator | ok: [testbed-node-2] => (item=python-docker) 2025-03-22 22:07:07.881034 | orchestrator | 2025-03-22 22:07:07.881050 | orchestrator | TASK [osism.services.docker : Install python3 docker package] ****************** 2025-03-22 22:07:07.881066 | orchestrator | Saturday 22 March 2025 22:07:07 +0000 (0:00:01.321) 0:06:30.240 ******** 2025-03-22 22:07:07.881088 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:07:07.952563 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:07:08.033827 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:07:08.121396 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:07:08.192611 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:07:08.342098 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:07:08.342199 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:07:08.343809 | orchestrator | 2025-03-22 22:07:12.531991 | orchestrator | TASK [osism.services.docker : Install python3 docker package from Debian Sid] *** 2025-03-22 22:07:12.532110 | orchestrator | Saturday 22 March 2025 22:07:08 +0000 (0:00:00.613) 0:06:30.853 ******** 2025-03-22 22:07:12.532143 | orchestrator | ok: [testbed-manager] 2025-03-22 22:07:12.532217 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:07:12.533468 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:07:12.534497 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:07:12.535519 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:07:12.535705 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:07:12.536245 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:07:12.537055 | orchestrator | 2025-03-22 22:07:12.537325 | orchestrator | TASK [osism.services.docker : Remove python docker packages (install python bindings from pip)] *** 2025-03-22 22:07:12.540485 | orchestrator | Saturday 22 March 2025 22:07:12 +0000 (0:00:04.192) 0:06:35.046 ******** 2025-03-22 22:07:12.679859 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:07:12.966198 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:07:13.051914 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:07:13.159398 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:07:13.250159 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:07:13.356078 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:07:13.356719 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:07:13.357163 | orchestrator | 2025-03-22 22:07:13.357648 | orchestrator | TASK [osism.services.docker : Block installation of python docker packages (install python bindings from pip)] *** 2025-03-22 22:07:13.358359 | orchestrator | Saturday 22 March 2025 22:07:13 +0000 (0:00:00.824) 0:06:35.871 ******** 2025-03-22 22:07:13.444163 | orchestrator | skipping: [testbed-manager] => (item=python3-docker)  2025-03-22 22:07:13.444791 | orchestrator | skipping: [testbed-manager] => (item=python-docker)  2025-03-22 22:07:13.543043 | orchestrator | skipping: 
[testbed-manager] 2025-03-22 22:07:13.544080 | orchestrator | skipping: [testbed-node-3] => (item=python3-docker)  2025-03-22 22:07:13.544693 | orchestrator | skipping: [testbed-node-3] => (item=python-docker)  2025-03-22 22:07:13.638868 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:07:13.639198 | orchestrator | skipping: [testbed-node-4] => (item=python3-docker)  2025-03-22 22:07:13.639236 | orchestrator | skipping: [testbed-node-4] => (item=python-docker)  2025-03-22 22:07:13.723392 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:07:13.724010 | orchestrator | skipping: [testbed-node-5] => (item=python3-docker)  2025-03-22 22:07:13.725405 | orchestrator | skipping: [testbed-node-5] => (item=python-docker)  2025-03-22 22:07:13.819562 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:07:13.819714 | orchestrator | skipping: [testbed-node-0] => (item=python3-docker)  2025-03-22 22:07:13.820426 | orchestrator | skipping: [testbed-node-0] => (item=python-docker)  2025-03-22 22:07:13.899134 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:07:13.899379 | orchestrator | skipping: [testbed-node-1] => (item=python3-docker)  2025-03-22 22:07:13.899885 | orchestrator | skipping: [testbed-node-1] => (item=python-docker)  2025-03-22 22:07:14.031894 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:07:14.032610 | orchestrator | skipping: [testbed-node-2] => (item=python3-docker)  2025-03-22 22:07:14.033095 | orchestrator | skipping: [testbed-node-2] => (item=python-docker)  2025-03-22 22:07:14.033990 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:07:14.034412 | orchestrator | 2025-03-22 22:07:14.034968 | orchestrator | TASK [osism.services.docker : Install python3-pip package (install python bindings from pip)] *** 2025-03-22 22:07:14.035309 | orchestrator | Saturday 22 March 2025 22:07:14 +0000 (0:00:00.674) 0:06:36.545 ******** 2025-03-22 22:07:14.217106 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:07:14.301346 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:07:14.383484 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:07:14.458861 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:07:14.531211 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:07:14.654771 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:07:14.655270 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:07:14.655587 | orchestrator | 2025-03-22 22:07:14.656212 | orchestrator | TASK [osism.services.docker : Install docker packages (install python bindings from pip)] *** 2025-03-22 22:07:14.657001 | orchestrator | Saturday 22 March 2025 22:07:14 +0000 (0:00:00.624) 0:06:37.169 ******** 2025-03-22 22:07:14.803831 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:07:14.905702 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:07:15.014798 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:07:15.095818 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:07:15.197270 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:07:15.314061 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:07:15.314756 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:07:15.315744 | orchestrator | 2025-03-22 22:07:15.316359 | orchestrator | TASK [osism.services.docker : Install packages required by docker login] ******* 2025-03-22 22:07:15.317346 | orchestrator | Saturday 22 March 2025 22:07:15 +0000 (0:00:00.656) 0:06:37.826 ******** 2025-03-22 22:07:15.486083 | orchestrator | skipping: 
[testbed-manager] 2025-03-22 22:07:15.566689 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:07:15.647391 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:07:15.718195 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:07:15.797731 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:07:15.930757 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:07:15.933732 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:07:15.938258 | orchestrator | 2025-03-22 22:07:15.939125 | orchestrator | TASK [osism.services.docker : Ensure that some packages are not installed] ***** 2025-03-22 22:07:15.939339 | orchestrator | Saturday 22 March 2025 22:07:15 +0000 (0:00:00.616) 0:06:38.442 ******** 2025-03-22 22:07:22.958684 | orchestrator | ok: [testbed-manager] 2025-03-22 22:07:22.958905 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:07:22.959693 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:07:22.960480 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:07:22.961428 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:07:22.962243 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:07:22.963995 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:07:22.964598 | orchestrator | 2025-03-22 22:07:22.965314 | orchestrator | TASK [osism.services.docker : Include config tasks] **************************** 2025-03-22 22:07:22.965678 | orchestrator | Saturday 22 March 2025 22:07:22 +0000 (0:00:07.029) 0:06:45.471 ******** 2025-03-22 22:07:23.951520 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/config.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:07:23.952049 | orchestrator | 2025-03-22 22:07:23.952586 | orchestrator | TASK [osism.services.docker : Create plugins directory] ************************ 2025-03-22 22:07:23.953754 | orchestrator | Saturday 22 March 2025 22:07:23 +0000 (0:00:00.996) 0:06:46.467 ******** 2025-03-22 22:07:24.488187 | orchestrator | ok: [testbed-manager] 2025-03-22 22:07:24.944911 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:07:24.945833 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:07:24.946971 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:07:24.948375 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:07:24.949431 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:07:24.951557 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:07:24.952729 | orchestrator | 2025-03-22 22:07:24.952763 | orchestrator | TASK [osism.services.docker : Create systemd overlay directory] **************** 2025-03-22 22:07:24.953372 | orchestrator | Saturday 22 March 2025 22:07:24 +0000 (0:00:00.992) 0:06:47.460 ******** 2025-03-22 22:07:26.178709 | orchestrator | ok: [testbed-manager] 2025-03-22 22:07:26.179145 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:07:26.179857 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:07:26.181737 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:07:26.183820 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:07:26.184954 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:07:26.185790 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:07:26.186739 | orchestrator | 2025-03-22 22:07:26.186965 | orchestrator | TASK [osism.services.docker : Copy systemd overlay file] *********************** 2025-03-22 22:07:26.187879 | orchestrator | Saturday 22 March 2025 
22:07:26 +0000 (0:00:01.231) 0:06:48.691 ******** 2025-03-22 22:07:27.686452 | orchestrator | ok: [testbed-manager] 2025-03-22 22:07:27.686611 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:07:27.686666 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:07:27.686690 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:07:27.687342 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:07:27.687766 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:07:27.688591 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:07:27.689053 | orchestrator | 2025-03-22 22:07:27.689658 | orchestrator | TASK [osism.services.docker : Reload systemd daemon if systemd overlay file is changed] *** 2025-03-22 22:07:27.690169 | orchestrator | Saturday 22 March 2025 22:07:27 +0000 (0:00:01.507) 0:06:50.198 ******** 2025-03-22 22:07:27.845253 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:07:29.341791 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:07:29.342427 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:07:29.342949 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:07:29.343905 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:07:29.344791 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:07:29.345173 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:07:29.348395 | orchestrator | 2025-03-22 22:07:30.814005 | orchestrator | TASK [osism.services.docker : Copy limits configuration file] ****************** 2025-03-22 22:07:30.814147 | orchestrator | Saturday 22 March 2025 22:07:29 +0000 (0:00:01.658) 0:06:51.857 ******** 2025-03-22 22:07:30.814177 | orchestrator | ok: [testbed-manager] 2025-03-22 22:07:30.814237 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:07:30.815982 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:07:30.817014 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:07:30.818604 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:07:30.819778 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:07:30.820369 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:07:30.822101 | orchestrator | 2025-03-22 22:07:30.823002 | orchestrator | TASK [osism.services.docker : Copy daemon.json configuration file] ************* 2025-03-22 22:07:30.825570 | orchestrator | Saturday 22 March 2025 22:07:30 +0000 (0:00:01.466) 0:06:53.323 ******** 2025-03-22 22:07:32.400408 | orchestrator | changed: [testbed-manager] 2025-03-22 22:07:32.400772 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:07:32.403298 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:07:32.404384 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:07:32.405167 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:07:32.407299 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:07:32.407985 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:07:32.408906 | orchestrator | 2025-03-22 22:07:32.409748 | orchestrator | TASK [osism.services.docker : Include service tasks] *************************** 2025-03-22 22:07:32.410243 | orchestrator | Saturday 22 March 2025 22:07:32 +0000 (0:00:01.589) 0:06:54.913 ******** 2025-03-22 22:07:33.633197 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/service.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:07:33.633412 | orchestrator | 2025-03-22 22:07:33.633739 | orchestrator | TASK [osism.services.docker : Reload systemd daemon] 
*************************** 2025-03-22 22:07:33.634205 | orchestrator | Saturday 22 March 2025 22:07:33 +0000 (0:00:01.233) 0:06:56.146 ******** 2025-03-22 22:07:35.322737 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:07:35.323231 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:07:35.323716 | orchestrator | ok: [testbed-manager] 2025-03-22 22:07:35.324877 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:07:35.325387 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:07:35.327302 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:07:36.612465 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:07:36.612577 | orchestrator | 2025-03-22 22:07:36.612596 | orchestrator | TASK [osism.services.docker : Manage service] ********************************** 2025-03-22 22:07:36.612612 | orchestrator | Saturday 22 March 2025 22:07:35 +0000 (0:00:01.687) 0:06:57.833 ******** 2025-03-22 22:07:36.612641 | orchestrator | ok: [testbed-manager] 2025-03-22 22:07:36.613664 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:07:36.613699 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:07:36.615640 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:07:36.616589 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:07:36.617388 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:07:36.618511 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:07:36.619112 | orchestrator | 2025-03-22 22:07:36.620253 | orchestrator | TASK [osism.services.docker : Manage docker socket service] ******************** 2025-03-22 22:07:36.621361 | orchestrator | Saturday 22 March 2025 22:07:36 +0000 (0:00:01.291) 0:06:59.125 ******** 2025-03-22 22:07:37.913791 | orchestrator | ok: [testbed-manager] 2025-03-22 22:07:37.917787 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:07:37.917830 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:07:39.479127 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:07:39.479242 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:07:39.479259 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:07:39.479355 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:07:39.479373 | orchestrator | 2025-03-22 22:07:39.479390 | orchestrator | TASK [osism.services.docker : Manage containerd service] *********************** 2025-03-22 22:07:39.479406 | orchestrator | Saturday 22 March 2025 22:07:37 +0000 (0:00:01.297) 0:07:00.422 ******** 2025-03-22 22:07:39.479437 | orchestrator | ok: [testbed-manager] 2025-03-22 22:07:39.479510 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:07:39.479528 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:07:39.479546 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:07:39.479898 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:07:39.480346 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:07:39.480445 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:07:39.480971 | orchestrator | 2025-03-22 22:07:39.481710 | orchestrator | TASK [osism.services.docker : Include bootstrap tasks] ************************* 2025-03-22 22:07:40.976813 | orchestrator | Saturday 22 March 2025 22:07:39 +0000 (0:00:01.571) 0:07:01.994 ******** 2025-03-22 22:07:40.976914 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/bootstrap.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:07:40.977078 | orchestrator | 2025-03-22 22:07:40.978092 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 
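Annotation: the entries above show the osism.services.docker role laying down its configuration (plugins directory, systemd overlay, limits file, daemon.json) before the handlers are flushed. The actual templates shipped by the role are not visible in this console output; the following is only a minimal sketch, assuming standard ansible.builtin modules and hypothetical daemon.json contents, of how such a configuration step is commonly expressed:

    # Sketch only - not the OSISM template; values are illustrative.
    - name: Copy daemon.json configuration file (sketch)
      ansible.builtin.copy:
        content: |
          {
            "log-driver": "json-file",
            "log-opts": { "max-size": "10m", "max-file": "3" }
          }
        dest: /etc/docker/daemon.json
        owner: root
        group: root
        mode: "0644"
      notify: Restart docker service

    # Sketch only - directory for systemd drop-in overrides of docker.service.
    - name: Create systemd overlay directory (sketch)
      ansible.builtin.file:
        path: /etc/systemd/system/docker.service.d
        state: directory
        mode: "0755"

A change to such a file notifies the corresponding handler, which is why the docker service restart only appears further down in this log as a RUNNING HANDLER entry once the handlers are flushed.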
2025-03-22 22:07:40.981747 | orchestrator | Saturday 22 March 2025 22:07:40 +0000 (0:00:01.141) 0:07:03.135 ******** 2025-03-22 22:07:40.982394 | orchestrator | 2025-03-22 22:07:40.983075 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2025-03-22 22:07:40.983616 | orchestrator | Saturday 22 March 2025 22:07:40 +0000 (0:00:00.042) 0:07:03.178 ******** 2025-03-22 22:07:40.984690 | orchestrator | 2025-03-22 22:07:40.985628 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2025-03-22 22:07:40.985695 | orchestrator | Saturday 22 March 2025 22:07:40 +0000 (0:00:00.055) 0:07:03.234 ******** 2025-03-22 22:07:40.986661 | orchestrator | 2025-03-22 22:07:40.987592 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2025-03-22 22:07:40.989399 | orchestrator | Saturday 22 March 2025 22:07:40 +0000 (0:00:00.057) 0:07:03.292 ******** 2025-03-22 22:07:40.990679 | orchestrator | 2025-03-22 22:07:40.991227 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2025-03-22 22:07:40.992267 | orchestrator | Saturday 22 March 2025 22:07:40 +0000 (0:00:00.051) 0:07:03.343 ******** 2025-03-22 22:07:40.993094 | orchestrator | 2025-03-22 22:07:40.993847 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2025-03-22 22:07:40.994603 | orchestrator | Saturday 22 March 2025 22:07:40 +0000 (0:00:00.041) 0:07:03.385 ******** 2025-03-22 22:07:40.995335 | orchestrator | 2025-03-22 22:07:40.996385 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2025-03-22 22:07:40.996674 | orchestrator | Saturday 22 March 2025 22:07:40 +0000 (0:00:00.053) 0:07:03.439 ******** 2025-03-22 22:07:40.997398 | orchestrator | 2025-03-22 22:07:40.998648 | orchestrator | RUNNING HANDLER [osism.commons.repository : Force update of package cache] ***** 2025-03-22 22:07:40.999390 | orchestrator | Saturday 22 March 2025 22:07:40 +0000 (0:00:00.049) 0:07:03.489 ******** 2025-03-22 22:07:42.214860 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:07:42.215074 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:07:42.215490 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:07:42.216249 | orchestrator | 2025-03-22 22:07:42.216653 | orchestrator | RUNNING HANDLER [osism.services.rsyslog : Restart rsyslog service] ************* 2025-03-22 22:07:42.217449 | orchestrator | Saturday 22 March 2025 22:07:42 +0000 (0:00:01.237) 0:07:04.727 ******** 2025-03-22 22:07:44.058225 | orchestrator | changed: [testbed-manager] 2025-03-22 22:07:44.058480 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:07:44.059143 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:07:44.060009 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:07:44.060047 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:07:44.060643 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:07:44.061585 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:07:44.062355 | orchestrator | 2025-03-22 22:07:44.063606 | orchestrator | RUNNING HANDLER [osism.services.smartd : Restart smartd service] *************** 2025-03-22 22:07:44.066420 | orchestrator | Saturday 22 March 2025 22:07:44 +0000 (0:00:01.842) 0:07:06.569 ******** 2025-03-22 22:07:45.334196 | orchestrator | changed: [testbed-manager] 2025-03-22 22:07:45.335404 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:07:45.335523 | 
orchestrator | changed: [testbed-node-4] 2025-03-22 22:07:45.336470 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:07:45.337401 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:07:45.338466 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:07:45.339239 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:07:45.341107 | orchestrator | 2025-03-22 22:07:45.341545 | orchestrator | RUNNING HANDLER [osism.services.docker : Restart docker service] *************** 2025-03-22 22:07:45.342503 | orchestrator | Saturday 22 March 2025 22:07:45 +0000 (0:00:01.278) 0:07:07.848 ******** 2025-03-22 22:07:45.512118 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:07:47.451175 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:07:47.452583 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:07:47.452946 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:07:47.453987 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:07:47.455667 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:07:47.456146 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:07:47.457081 | orchestrator | 2025-03-22 22:07:47.457520 | orchestrator | RUNNING HANDLER [osism.services.docker : Wait after docker service restart] **** 2025-03-22 22:07:47.458548 | orchestrator | Saturday 22 March 2025 22:07:47 +0000 (0:00:02.112) 0:07:09.960 ******** 2025-03-22 22:07:47.567943 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:07:47.568332 | orchestrator | 2025-03-22 22:07:47.568373 | orchestrator | TASK [osism.services.docker : Add user to docker group] ************************ 2025-03-22 22:07:47.568395 | orchestrator | Saturday 22 March 2025 22:07:47 +0000 (0:00:00.122) 0:07:10.082 ******** 2025-03-22 22:07:48.882825 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:07:48.882990 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:07:48.883552 | orchestrator | ok: [testbed-manager] 2025-03-22 22:07:48.883818 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:07:48.884770 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:07:48.886427 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:07:48.887429 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:07:48.888501 | orchestrator | 2025-03-22 22:07:48.888745 | orchestrator | TASK [osism.services.docker : Log into private registry and force re-authorization] *** 2025-03-22 22:07:48.889192 | orchestrator | Saturday 22 March 2025 22:07:48 +0000 (0:00:01.312) 0:07:11.395 ******** 2025-03-22 22:07:49.073091 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:07:49.155869 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:07:49.255745 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:07:49.574761 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:07:49.658773 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:07:49.785505 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:07:49.785633 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:07:49.786728 | orchestrator | 2025-03-22 22:07:49.789050 | orchestrator | TASK [osism.services.docker : Include facts tasks] ***************************** 2025-03-22 22:07:49.792775 | orchestrator | Saturday 22 March 2025 22:07:49 +0000 (0:00:00.904) 0:07:12.299 ******** 2025-03-22 22:07:50.889295 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/facts.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, 
testbed-node-2 2025-03-22 22:07:50.890153 | orchestrator | 2025-03-22 22:07:50.893785 | orchestrator | TASK [osism.services.docker : Create facts directory] ************************** 2025-03-22 22:07:51.390518 | orchestrator | Saturday 22 March 2025 22:07:50 +0000 (0:00:01.104) 0:07:13.403 ******** 2025-03-22 22:07:51.390642 | orchestrator | ok: [testbed-manager] 2025-03-22 22:07:51.890249 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:07:51.891105 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:07:51.892003 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:07:51.893197 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:07:51.893695 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:07:51.894425 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:07:51.895157 | orchestrator | 2025-03-22 22:07:51.895763 | orchestrator | TASK [osism.services.docker : Copy docker fact files] ************************** 2025-03-22 22:07:51.896364 | orchestrator | Saturday 22 March 2025 22:07:51 +0000 (0:00:00.996) 0:07:14.400 ******** 2025-03-22 22:07:54.798562 | orchestrator | ok: [testbed-manager] => (item=docker_containers) 2025-03-22 22:07:54.798875 | orchestrator | changed: [testbed-node-3] => (item=docker_containers) 2025-03-22 22:07:54.798915 | orchestrator | changed: [testbed-node-4] => (item=docker_containers) 2025-03-22 22:07:54.799703 | orchestrator | changed: [testbed-node-5] => (item=docker_containers) 2025-03-22 22:07:54.803440 | orchestrator | changed: [testbed-node-0] => (item=docker_containers) 2025-03-22 22:07:54.803913 | orchestrator | changed: [testbed-node-1] => (item=docker_containers) 2025-03-22 22:07:54.803940 | orchestrator | changed: [testbed-node-2] => (item=docker_containers) 2025-03-22 22:07:54.803980 | orchestrator | ok: [testbed-manager] => (item=docker_images) 2025-03-22 22:07:54.804352 | orchestrator | changed: [testbed-node-3] => (item=docker_images) 2025-03-22 22:07:54.804922 | orchestrator | changed: [testbed-node-4] => (item=docker_images) 2025-03-22 22:07:54.805563 | orchestrator | changed: [testbed-node-0] => (item=docker_images) 2025-03-22 22:07:54.806241 | orchestrator | changed: [testbed-node-5] => (item=docker_images) 2025-03-22 22:07:54.806568 | orchestrator | changed: [testbed-node-1] => (item=docker_images) 2025-03-22 22:07:54.807164 | orchestrator | changed: [testbed-node-2] => (item=docker_images) 2025-03-22 22:07:54.808334 | orchestrator | 2025-03-22 22:07:54.961255 | orchestrator | TASK [osism.commons.docker_compose : This install type is not supported] ******* 2025-03-22 22:07:54.961374 | orchestrator | Saturday 22 March 2025 22:07:54 +0000 (0:00:02.911) 0:07:17.311 ******** 2025-03-22 22:07:54.961402 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:07:55.033464 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:07:55.106764 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:07:55.193774 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:07:55.267194 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:07:55.375082 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:07:55.375215 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:07:55.375930 | orchestrator | 2025-03-22 22:07:55.376321 | orchestrator | TASK [osism.commons.docker_compose : Include distribution specific install tasks] *** 2025-03-22 22:07:55.376667 | orchestrator | Saturday 22 March 2025 22:07:55 +0000 (0:00:00.576) 0:07:17.888 ******** 2025-03-22 22:07:56.353765 | orchestrator | included: 
/usr/share/ansible/collections/ansible_collections/osism/commons/roles/docker_compose/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:07:56.353987 | orchestrator | 2025-03-22 22:07:56.354501 | orchestrator | TASK [osism.commons.docker_compose : Remove docker-compose apt preferences file] *** 2025-03-22 22:07:56.355322 | orchestrator | Saturday 22 March 2025 22:07:56 +0000 (0:00:00.980) 0:07:18.868 ******** 2025-03-22 22:07:56.854744 | orchestrator | ok: [testbed-manager] 2025-03-22 22:07:57.302246 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:07:57.303048 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:07:57.303469 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:07:57.304669 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:07:57.305136 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:07:57.305947 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:07:57.307219 | orchestrator | 2025-03-22 22:07:57.307961 | orchestrator | TASK [osism.commons.docker_compose : Get checksum of docker-compose file] ****** 2025-03-22 22:07:57.308512 | orchestrator | Saturday 22 March 2025 22:07:57 +0000 (0:00:00.946) 0:07:19.815 ******** 2025-03-22 22:07:57.800866 | orchestrator | ok: [testbed-manager] 2025-03-22 22:07:58.522591 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:07:58.523176 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:07:58.525379 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:07:58.526729 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:07:58.527238 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:07:58.528470 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:07:58.529323 | orchestrator | 2025-03-22 22:07:58.530334 | orchestrator | TASK [osism.commons.docker_compose : Remove docker-compose binary] ************* 2025-03-22 22:07:58.530880 | orchestrator | Saturday 22 March 2025 22:07:58 +0000 (0:00:01.221) 0:07:21.037 ******** 2025-03-22 22:07:58.662576 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:07:58.748329 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:07:58.816535 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:07:58.885411 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:07:58.975883 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:07:59.085806 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:07:59.086363 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:07:59.086631 | orchestrator | 2025-03-22 22:07:59.087927 | orchestrator | TASK [osism.commons.docker_compose : Uninstall docker-compose package] ********* 2025-03-22 22:07:59.088111 | orchestrator | Saturday 22 March 2025 22:07:59 +0000 (0:00:00.562) 0:07:21.599 ******** 2025-03-22 22:08:00.972771 | orchestrator | ok: [testbed-manager] 2025-03-22 22:08:00.973082 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:08:00.973117 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:08:00.973750 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:08:00.973915 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:08:00.974155 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:08:00.974512 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:08:00.974861 | orchestrator | 2025-03-22 22:08:00.975848 | orchestrator | TASK [osism.commons.docker_compose : Copy docker-compose script] *************** 2025-03-22 22:08:00.977207 | orchestrator | Saturday 22 March 2025 22:08:00 +0000 (0:00:01.885) 0:07:23.485 ******** 2025-03-22 
22:08:01.139422 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:08:01.221243 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:08:01.308474 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:08:01.385815 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:08:01.464479 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:08:01.587922 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:08:01.588073 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:08:01.588104 | orchestrator | 2025-03-22 22:08:01.588593 | orchestrator | TASK [osism.commons.docker_compose : Install docker-compose-plugin package] **** 2025-03-22 22:08:01.588678 | orchestrator | Saturday 22 March 2025 22:08:01 +0000 (0:00:00.618) 0:07:24.103 ******** 2025-03-22 22:08:03.939080 | orchestrator | ok: [testbed-manager] 2025-03-22 22:08:03.939241 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:08:03.939872 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:08:03.940635 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:08:03.941083 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:08:03.941565 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:08:03.942434 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:08:03.942963 | orchestrator | 2025-03-22 22:08:03.943225 | orchestrator | TASK [osism.commons.docker_compose : Copy osism.target systemd file] *********** 2025-03-22 22:08:03.944187 | orchestrator | Saturday 22 March 2025 22:08:03 +0000 (0:00:02.348) 0:07:26.452 ******** 2025-03-22 22:08:05.648799 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:08:05.649406 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:08:05.649570 | orchestrator | ok: [testbed-manager] 2025-03-22 22:08:05.650413 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:08:05.650621 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:08:05.651583 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:08:05.652424 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:08:05.652455 | orchestrator | 2025-03-22 22:08:05.652704 | orchestrator | TASK [osism.commons.docker_compose : Enable osism.target] ********************** 2025-03-22 22:08:05.653309 | orchestrator | Saturday 22 March 2025 22:08:05 +0000 (0:00:01.708) 0:07:28.160 ******** 2025-03-22 22:08:07.533470 | orchestrator | ok: [testbed-manager] 2025-03-22 22:08:07.533908 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:08:07.536385 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:08:07.537065 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:08:07.537117 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:08:07.538708 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:08:07.538952 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:08:07.538986 | orchestrator | 2025-03-22 22:08:07.539179 | orchestrator | TASK [osism.commons.docker_compose : Copy docker-compose systemd unit file] **** 2025-03-22 22:08:07.539457 | orchestrator | Saturday 22 March 2025 22:08:07 +0000 (0:00:01.886) 0:07:30.046 ******** 2025-03-22 22:08:09.329309 | orchestrator | ok: [testbed-manager] 2025-03-22 22:08:09.329511 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:08:09.329545 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:08:09.330280 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:08:09.330718 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:08:09.331257 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:08:09.331556 | orchestrator | changed: [testbed-node-2] 
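Editor's note on the docker_compose tasks above: the role removes the legacy Compose v1 tooling (the standalone docker-compose binary and the distro package), installs the Docker Compose v2 plugin, and sets up an osism.target plus a docker-compose systemd unit so compose projects can be managed as services. A minimal sketch of that task pattern (illustrative only: this is not the actual osism.commons.docker_compose role source, and the module arguments are assumptions):

    # Illustrative sketch only; argument values are assumptions, not taken from this job.
    - name: Uninstall docker-compose package        # legacy Compose v1 distro package
      ansible.builtin.apt:
        name: docker-compose
        state: absent

    - name: Install docker-compose-plugin package   # Compose v2, invoked as "docker compose"
      ansible.builtin.apt:
        name: docker-compose-plugin
        state: present

    - name: Enable osism.target                     # umbrella target the compose units hook into
      ansible.builtin.systemd:
        name: osism.target
        enabled: true
        daemon_reload: true

With the plugin installed, Compose is invoked as "docker compose" (a docker CLI subcommand) rather than as the standalone docker-compose binary.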
2025-03-22 22:08:09.332638 | orchestrator | 2025-03-22 22:08:09.333116 | orchestrator | TASK [osism.commons.facts : Create custom facts directory] ********************* 2025-03-22 22:08:09.333420 | orchestrator | Saturday 22 March 2025 22:08:09 +0000 (0:00:01.795) 0:07:31.842 ******** 2025-03-22 22:08:10.054081 | orchestrator | ok: [testbed-manager] 2025-03-22 22:08:10.173693 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:08:10.627210 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:08:10.627723 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:08:10.627757 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:08:10.630461 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:08:10.781738 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:08:10.781869 | orchestrator | 2025-03-22 22:08:10.781931 | orchestrator | TASK [osism.commons.facts : Copy fact files] *********************************** 2025-03-22 22:08:10.781951 | orchestrator | Saturday 22 March 2025 22:08:10 +0000 (0:00:01.296) 0:07:33.138 ******** 2025-03-22 22:08:10.781983 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:08:10.864986 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:08:10.938009 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:08:11.007196 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:08:11.083840 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:08:11.545951 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:08:11.546183 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:08:11.546645 | orchestrator | 2025-03-22 22:08:11.547192 | orchestrator | TASK [osism.services.chrony : Check minimum and maximum number of servers] ***** 2025-03-22 22:08:11.548104 | orchestrator | Saturday 22 March 2025 22:08:11 +0000 (0:00:00.920) 0:07:34.059 ******** 2025-03-22 22:08:11.722897 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:08:11.797647 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:08:11.877317 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:08:11.973870 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:08:12.067412 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:08:12.184511 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:08:12.185847 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:08:12.187019 | orchestrator | 2025-03-22 22:08:12.187619 | orchestrator | TASK [osism.services.chrony : Gather variables for each operating system] ****** 2025-03-22 22:08:12.188512 | orchestrator | Saturday 22 March 2025 22:08:12 +0000 (0:00:00.640) 0:07:34.700 ******** 2025-03-22 22:08:12.350530 | orchestrator | ok: [testbed-manager] 2025-03-22 22:08:12.428491 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:08:12.504509 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:08:12.579068 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:08:12.663617 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:08:12.774355 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:08:12.774512 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:08:12.775781 | orchestrator | 2025-03-22 22:08:12.776901 | orchestrator | TASK [osism.services.chrony : Set chrony_conf_file variable to default value] *** 2025-03-22 22:08:12.777134 | orchestrator | Saturday 22 March 2025 22:08:12 +0000 (0:00:00.588) 0:07:35.288 ******** 2025-03-22 22:08:12.950708 | orchestrator | ok: [testbed-manager] 2025-03-22 22:08:13.278958 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:08:13.355986 | orchestrator | ok: [testbed-node-4] 2025-03-22 
22:08:13.444779 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:08:13.566648 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:08:13.690713 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:08:13.691683 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:08:13.693148 | orchestrator | 2025-03-22 22:08:13.694109 | orchestrator | TASK [osism.services.chrony : Set chrony_key_file variable to default value] *** 2025-03-22 22:08:13.695176 | orchestrator | Saturday 22 March 2025 22:08:13 +0000 (0:00:00.914) 0:07:36.203 ******** 2025-03-22 22:08:13.872422 | orchestrator | ok: [testbed-manager] 2025-03-22 22:08:13.948961 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:08:14.036557 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:08:14.134386 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:08:14.221290 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:08:14.342390 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:08:14.343447 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:08:14.344079 | orchestrator | 2025-03-22 22:08:14.344794 | orchestrator | TASK [osism.services.chrony : Populate service facts] ************************** 2025-03-22 22:08:14.345250 | orchestrator | Saturday 22 March 2025 22:08:14 +0000 (0:00:00.651) 0:07:36.854 ******** 2025-03-22 22:08:19.280186 | orchestrator | ok: [testbed-manager] 2025-03-22 22:08:19.280917 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:08:19.280948 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:08:19.280972 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:08:19.281429 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:08:19.282288 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:08:19.283013 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:08:19.286640 | orchestrator | 2025-03-22 22:08:19.458733 | orchestrator | TASK [osism.services.chrony : Manage timesyncd service] ************************ 2025-03-22 22:08:19.458801 | orchestrator | Saturday 22 March 2025 22:08:19 +0000 (0:00:04.941) 0:07:41.796 ******** 2025-03-22 22:08:19.458827 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:08:19.548647 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:08:19.638404 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:08:19.714959 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:08:19.801537 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:08:19.920061 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:08:19.920646 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:08:19.921495 | orchestrator | 2025-03-22 22:08:19.922111 | orchestrator | TASK [osism.services.chrony : Include distribution specific install tasks] ***** 2025-03-22 22:08:19.922587 | orchestrator | Saturday 22 March 2025 22:08:19 +0000 (0:00:00.638) 0:07:42.434 ******** 2025-03-22 22:08:21.146982 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:08:21.147155 | orchestrator | 2025-03-22 22:08:21.147452 | orchestrator | TASK [osism.services.chrony : Install package] ********************************* 2025-03-22 22:08:21.147800 | orchestrator | Saturday 22 March 2025 22:08:21 +0000 (0:00:01.226) 0:07:43.660 ******** 2025-03-22 22:08:23.303724 | orchestrator | ok: [testbed-manager] 2025-03-22 22:08:23.304340 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:08:23.306872 | orchestrator | ok: 
[testbed-node-3] 2025-03-22 22:08:23.307704 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:08:23.309878 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:08:23.311493 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:08:23.313031 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:08:23.315671 | orchestrator | 2025-03-22 22:08:24.526178 | orchestrator | TASK [osism.services.chrony : Manage chrony service] *************************** 2025-03-22 22:08:24.526384 | orchestrator | Saturday 22 March 2025 22:08:23 +0000 (0:00:02.155) 0:07:45.816 ******** 2025-03-22 22:08:24.526422 | orchestrator | ok: [testbed-manager] 2025-03-22 22:08:24.526540 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:08:24.526564 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:08:24.526580 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:08:24.526602 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:08:24.526800 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:08:24.526831 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:08:24.527454 | orchestrator | 2025-03-22 22:08:24.528544 | orchestrator | TASK [osism.services.chrony : Check if configuration file exists] ************** 2025-03-22 22:08:24.528799 | orchestrator | Saturday 22 March 2025 22:08:24 +0000 (0:00:01.222) 0:07:47.038 ******** 2025-03-22 22:08:25.017940 | orchestrator | ok: [testbed-manager] 2025-03-22 22:08:25.473880 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:08:25.474120 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:08:25.475580 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:08:25.478239 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:08:27.626352 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:08:27.626466 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:08:27.626486 | orchestrator | 2025-03-22 22:08:27.626505 | orchestrator | TASK [osism.services.chrony : Copy configuration file] ************************* 2025-03-22 22:08:27.626521 | orchestrator | Saturday 22 March 2025 22:08:25 +0000 (0:00:00.945) 0:07:47.984 ******** 2025-03-22 22:08:27.626551 | orchestrator | changed: [testbed-manager] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2025-03-22 22:08:27.626976 | orchestrator | changed: [testbed-node-3] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2025-03-22 22:08:27.627931 | orchestrator | changed: [testbed-node-4] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2025-03-22 22:08:27.628800 | orchestrator | changed: [testbed-node-5] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2025-03-22 22:08:27.629935 | orchestrator | changed: [testbed-node-0] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2025-03-22 22:08:27.630965 | orchestrator | changed: [testbed-node-1] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2025-03-22 22:08:27.631235 | orchestrator | changed: [testbed-node-2] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2025-03-22 22:08:27.633563 | orchestrator | 2025-03-22 22:08:28.538134 | orchestrator | TASK [osism.services.lldpd : Include distribution specific install tasks] ****** 2025-03-22 22:08:28.538244 | orchestrator | 
Saturday 22 March 2025 22:08:27 +0000 (0:00:02.154) 0:07:50.138 ******** 2025-03-22 22:08:28.538330 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/lldpd/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:08:28.538909 | orchestrator | 2025-03-22 22:08:28.542542 | orchestrator | TASK [osism.services.lldpd : Install lldpd package] **************************** 2025-03-22 22:08:38.165619 | orchestrator | Saturday 22 March 2025 22:08:28 +0000 (0:00:00.911) 0:07:51.049 ******** 2025-03-22 22:08:38.165789 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:08:38.165867 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:08:38.166187 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:08:38.166526 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:08:38.167155 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:08:38.167730 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:08:38.167760 | orchestrator | changed: [testbed-manager] 2025-03-22 22:08:38.168421 | orchestrator | 2025-03-22 22:08:38.169506 | orchestrator | TASK [osism.services.lldpd : Manage lldpd service] ***************************** 2025-03-22 22:08:38.170383 | orchestrator | Saturday 22 March 2025 22:08:38 +0000 (0:00:09.630) 0:08:00.680 ******** 2025-03-22 22:08:40.485312 | orchestrator | ok: [testbed-manager] 2025-03-22 22:08:40.485832 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:08:40.486847 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:08:40.487717 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:08:40.488816 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:08:40.490288 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:08:40.491107 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:08:40.491710 | orchestrator | 2025-03-22 22:08:40.492338 | orchestrator | RUNNING HANDLER [osism.commons.docker_compose : Reload systemd daemon] ********* 2025-03-22 22:08:40.492655 | orchestrator | Saturday 22 March 2025 22:08:40 +0000 (0:00:02.319) 0:08:02.999 ******** 2025-03-22 22:08:42.136801 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:08:42.137469 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:08:42.137801 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:08:42.138661 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:08:42.139352 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:08:42.139421 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:08:42.140246 | orchestrator | 2025-03-22 22:08:42.140678 | orchestrator | RUNNING HANDLER [osism.services.chrony : Restart chrony service] *************** 2025-03-22 22:08:42.140926 | orchestrator | Saturday 22 March 2025 22:08:42 +0000 (0:00:01.652) 0:08:04.652 ******** 2025-03-22 22:08:43.723968 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:08:43.724729 | orchestrator | changed: [testbed-manager] 2025-03-22 22:08:43.726337 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:08:43.727387 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:08:43.730656 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:08:43.731070 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:08:43.731116 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:08:43.732701 | orchestrator | 2025-03-22 22:08:43.733387 | orchestrator | PLAY [Apply bootstrap role part 2] ********************************************* 2025-03-22 22:08:43.735528 | orchestrator | 2025-03-22 
22:08:43.736832 | orchestrator | TASK [Include hardening role] ************************************************** 2025-03-22 22:08:43.736867 | orchestrator | Saturday 22 March 2025 22:08:43 +0000 (0:00:01.586) 0:08:06.238 ******** 2025-03-22 22:08:43.866302 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:08:43.952141 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:08:44.037554 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:08:44.134575 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:08:44.226629 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:08:44.378319 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:08:44.378478 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:08:44.378557 | orchestrator | 2025-03-22 22:08:44.380353 | orchestrator | PLAY [Apply bootstrap roles part 3] ******************************************** 2025-03-22 22:08:44.380496 | orchestrator | 2025-03-22 22:08:44.380829 | orchestrator | TASK [osism.services.journald : Copy configuration file] *********************** 2025-03-22 22:08:44.381657 | orchestrator | Saturday 22 March 2025 22:08:44 +0000 (0:00:00.654) 0:08:06.892 ******** 2025-03-22 22:08:45.828799 | orchestrator | changed: [testbed-manager] 2025-03-22 22:08:45.829317 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:08:45.829607 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:08:45.830444 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:08:45.831784 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:08:45.833858 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:08:45.834945 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:08:45.837389 | orchestrator | 2025-03-22 22:08:45.838132 | orchestrator | TASK [osism.services.journald : Manage journald service] *********************** 2025-03-22 22:08:45.838446 | orchestrator | Saturday 22 March 2025 22:08:45 +0000 (0:00:01.450) 0:08:08.343 ******** 2025-03-22 22:08:47.556904 | orchestrator | ok: [testbed-manager] 2025-03-22 22:08:47.557124 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:08:47.557771 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:08:47.559435 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:08:47.562638 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:08:47.563657 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:08:47.564339 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:08:47.565033 | orchestrator | 2025-03-22 22:08:47.565675 | orchestrator | TASK [Include auditd role] ***************************************************** 2025-03-22 22:08:47.566535 | orchestrator | Saturday 22 March 2025 22:08:47 +0000 (0:00:01.726) 0:08:10.069 ******** 2025-03-22 22:08:47.696309 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:08:48.045166 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:08:48.111276 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:08:48.193805 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:08:48.275876 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:08:48.743681 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:08:48.745345 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:08:48.746696 | orchestrator | 2025-03-22 22:08:48.751021 | orchestrator | RUNNING HANDLER [osism.services.journald : Restart journald service] *********** 2025-03-22 22:08:48.752510 | orchestrator | Saturday 22 March 2025 22:08:48 +0000 (0:00:01.188) 0:08:11.257 ******** 2025-03-22 22:08:50.190987 | orchestrator | changed: 
[testbed-manager] 2025-03-22 22:08:50.191611 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:08:50.191827 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:08:50.192081 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:08:50.192498 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:08:50.194178 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:08:50.194720 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:08:50.195111 | orchestrator | 2025-03-22 22:08:50.197002 | orchestrator | PLAY [Set state bootstrap] ***************************************************** 2025-03-22 22:08:50.197725 | orchestrator | 2025-03-22 22:08:50.198327 | orchestrator | TASK [Set osism.bootstrap.status fact] ***************************************** 2025-03-22 22:08:50.199489 | orchestrator | Saturday 22 March 2025 22:08:50 +0000 (0:00:01.442) 0:08:12.700 ******** 2025-03-22 22:08:51.130382 | orchestrator | included: osism.commons.state for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:08:51.132529 | orchestrator | 2025-03-22 22:08:51.133448 | orchestrator | TASK [osism.commons.state : Create custom facts directory] ********************* 2025-03-22 22:08:51.134593 | orchestrator | Saturday 22 March 2025 22:08:51 +0000 (0:00:00.941) 0:08:13.642 ******** 2025-03-22 22:08:51.606438 | orchestrator | ok: [testbed-manager] 2025-03-22 22:08:52.284106 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:08:52.284529 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:08:52.285704 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:08:52.288076 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:08:52.288988 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:08:52.289771 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:08:52.290808 | orchestrator | 2025-03-22 22:08:52.292065 | orchestrator | TASK [osism.commons.state : Write state into file] ***************************** 2025-03-22 22:08:52.294362 | orchestrator | Saturday 22 March 2025 22:08:52 +0000 (0:00:01.156) 0:08:14.799 ******** 2025-03-22 22:08:53.590947 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:08:53.591526 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:08:53.593198 | orchestrator | changed: [testbed-manager] 2025-03-22 22:08:53.593938 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:08:53.594952 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:08:53.595360 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:08:53.596532 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:08:53.597324 | orchestrator | 2025-03-22 22:08:53.598476 | orchestrator | TASK [Set osism.bootstrap.timestamp fact] ************************************** 2025-03-22 22:08:53.599721 | orchestrator | Saturday 22 March 2025 22:08:53 +0000 (0:00:01.303) 0:08:16.102 ******** 2025-03-22 22:08:54.804782 | orchestrator | included: osism.commons.state for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 22:08:54.805151 | orchestrator | 2025-03-22 22:08:54.805886 | orchestrator | TASK [osism.commons.state : Create custom facts directory] ********************* 2025-03-22 22:08:54.807114 | orchestrator | Saturday 22 March 2025 22:08:54 +0000 (0:00:01.215) 0:08:17.318 ******** 2025-03-22 22:08:55.317202 | orchestrator | ok: [testbed-manager] 2025-03-22 22:08:55.781885 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:08:55.782145 | orchestrator | ok: 
[testbed-node-4] 2025-03-22 22:08:55.782924 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:08:55.783538 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:08:55.784111 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:08:55.784450 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:08:55.785421 | orchestrator | 2025-03-22 22:08:55.785764 | orchestrator | TASK [osism.commons.state : Write state into file] ***************************** 2025-03-22 22:08:55.786441 | orchestrator | Saturday 22 March 2025 22:08:55 +0000 (0:00:00.975) 0:08:18.294 ******** 2025-03-22 22:08:56.308852 | orchestrator | changed: [testbed-manager] 2025-03-22 22:08:57.110510 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:08:57.110666 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:08:57.112971 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:08:57.113061 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:08:57.113099 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:08:57.113118 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:08:57.114151 | orchestrator | 2025-03-22 22:08:57.115044 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 22:08:57.115501 | orchestrator | 2025-03-22 22:08:57 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-22 22:08:57.116358 | orchestrator | 2025-03-22 22:08:57 | INFO  | Please wait and do not abort execution. 2025-03-22 22:08:57.117557 | orchestrator | testbed-manager : ok=160  changed=38  unreachable=0 failed=0 skipped=41  rescued=0 ignored=0 2025-03-22 22:08:57.118938 | orchestrator | testbed-node-0 : ok=168  changed=65  unreachable=0 failed=0 skipped=36  rescued=0 ignored=0 2025-03-22 22:08:57.119832 | orchestrator | testbed-node-1 : ok=168  changed=65  unreachable=0 failed=0 skipped=36  rescued=0 ignored=0 2025-03-22 22:08:57.120608 | orchestrator | testbed-node-2 : ok=168  changed=65  unreachable=0 failed=0 skipped=36  rescued=0 ignored=0 2025-03-22 22:08:57.121315 | orchestrator | testbed-node-3 : ok=167  changed=62  unreachable=0 failed=0 skipped=37  rescued=0 ignored=0 2025-03-22 22:08:57.122599 | orchestrator | testbed-node-4 : ok=167  changed=62  unreachable=0 failed=0 skipped=36  rescued=0 ignored=0 2025-03-22 22:08:57.122888 | orchestrator | testbed-node-5 : ok=167  changed=62  unreachable=0 failed=0 skipped=36  rescued=0 ignored=0 2025-03-22 22:08:57.123677 | orchestrator | 2025-03-22 22:08:57.124742 | orchestrator | Saturday 22 March 2025 22:08:57 +0000 (0:00:01.330) 0:08:19.624 ******** 2025-03-22 22:08:57.126012 | orchestrator | =============================================================================== 2025-03-22 22:08:57.126750 | orchestrator | osism.commons.packages : Install required packages --------------------- 72.85s 2025-03-22 22:08:57.126783 | orchestrator | osism.commons.packages : Download required packages -------------------- 37.32s 2025-03-22 22:08:57.127643 | orchestrator | osism.commons.cleanup : Cleanup installed packages --------------------- 31.37s 2025-03-22 22:08:57.128217 | orchestrator | osism.commons.repository : Update package cache ------------------------ 14.27s 2025-03-22 22:08:57.129142 | orchestrator | osism.commons.packages : Remove dependencies that are no longer required -- 13.61s 2025-03-22 22:08:57.129867 | orchestrator | osism.services.docker : Install docker-cli package --------------------- 13.06s 2025-03-22 22:08:57.131090 | orchestrator | osism.services.docker : 
Install docker package ------------------------- 12.91s 2025-03-22 22:08:57.131521 | orchestrator | osism.commons.systohc : Install util-linux-extra package --------------- 11.81s 2025-03-22 22:08:57.131941 | orchestrator | osism.commons.packages : Upgrade packages ------------------------------ 10.39s 2025-03-22 22:08:57.132293 | orchestrator | osism.services.docker : Install containerd package --------------------- 10.17s 2025-03-22 22:08:57.133268 | orchestrator | osism.services.lldpd : Install lldpd package ---------------------------- 9.63s 2025-03-22 22:08:57.134007 | orchestrator | osism.services.smartd : Install smartmontools package ------------------- 9.35s 2025-03-22 22:08:57.135018 | orchestrator | osism.services.docker : Add repository ---------------------------------- 8.14s 2025-03-22 22:08:57.135721 | orchestrator | osism.commons.cleanup : Remove cloudinit package ------------------------ 8.05s 2025-03-22 22:08:57.135751 | orchestrator | osism.services.rng : Install rng package -------------------------------- 7.99s 2025-03-22 22:08:57.136275 | orchestrator | osism.commons.cleanup : Uninstall unattended-upgrades package ----------- 7.68s 2025-03-22 22:08:57.136942 | orchestrator | osism.services.docker : Ensure that some packages are not installed ----- 7.03s 2025-03-22 22:08:57.137350 | orchestrator | osism.commons.sysctl : Set sysctl parameters on rabbitmq ---------------- 7.01s 2025-03-22 22:08:57.138392 | orchestrator | osism.services.docker : Install apt-transport-https package ------------- 6.63s 2025-03-22 22:08:57.138859 | orchestrator | osism.commons.cleanup : Remove dependencies that are no longer required --- 6.16s 2025-03-22 22:08:58.035528 | orchestrator | + [[ -e /etc/redhat-release ]] 2025-03-22 22:09:00.449083 | orchestrator | + osism apply network 2025-03-22 22:09:00.449216 | orchestrator | 2025-03-22 22:09:00 | INFO  | Task bf63d4b8-7fe8-4c59-8918-2d69761d99df (network) was prepared for execution. 2025-03-22 22:09:04.561594 | orchestrator | 2025-03-22 22:09:00 | INFO  | It takes a moment until task bf63d4b8-7fe8-4c59-8918-2d69761d99df (network) has been started and output is visible here. 
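Editor's note on the network play that follows: it renders a netplan configuration on every node and prunes the cloud-init default; later in the play /etc/netplan/01-osism.yaml is kept while /etc/netplan/50-cloud-init.yaml is removed. A minimal netplan file of the kind such a role writes is sketched below; the interface name and addresses are placeholders from documentation ranges, not values taken from this deployment:

    # /etc/netplan/01-osism.yaml (illustrative sketch; names and addresses are placeholders)
    network:
      version: 2
      renderer: networkd
      ethernets:
        eth0:                        # assumed management interface name
          dhcp4: false
          addresses:
            - 192.0.2.10/24          # placeholder address, not from this job
          routes:
            - to: default
              via: 192.0.2.1
          nameservers:
            addresses: [192.0.2.53]

The "Netplan configuration changed" handler visible at the end of the play is the usual hook for re-applying such a file (typically via netplan apply); in this run it is skipped on all hosts.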
2025-03-22 22:09:04.561780 | orchestrator | 2025-03-22 22:09:04.562178 | orchestrator | PLAY [Apply role network] ****************************************************** 2025-03-22 22:09:04.562217 | orchestrator | 2025-03-22 22:09:04.562311 | orchestrator | TASK [osism.commons.network : Gather variables for each operating system] ****** 2025-03-22 22:09:04.563766 | orchestrator | Saturday 22 March 2025 22:09:04 +0000 (0:00:00.254) 0:00:00.254 ******** 2025-03-22 22:09:04.788237 | orchestrator | ok: [testbed-manager] 2025-03-22 22:09:04.893888 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:09:04.983953 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:09:05.078811 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:09:05.167399 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:09:05.486175 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:09:05.486356 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:09:05.486714 | orchestrator | 2025-03-22 22:09:05.487186 | orchestrator | TASK [osism.commons.network : Include type specific tasks] ********************* 2025-03-22 22:09:05.487681 | orchestrator | Saturday 22 March 2025 22:09:05 +0000 (0:00:00.923) 0:00:01.178 ******** 2025-03-22 22:09:06.841276 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/netplan-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-03-22 22:09:06.841433 | orchestrator | 2025-03-22 22:09:06.841686 | orchestrator | TASK [osism.commons.network : Install required packages] *********************** 2025-03-22 22:09:06.842377 | orchestrator | Saturday 22 March 2025 22:09:06 +0000 (0:00:01.355) 0:00:02.534 ******** 2025-03-22 22:09:09.008105 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:09:09.009400 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:09:09.011086 | orchestrator | ok: [testbed-manager] 2025-03-22 22:09:09.011645 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:09:09.012531 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:09:09.013357 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:09:09.017311 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:09:09.017878 | orchestrator | 2025-03-22 22:09:09.018939 | orchestrator | TASK [osism.commons.network : Remove ifupdown package] ************************* 2025-03-22 22:09:09.019949 | orchestrator | Saturday 22 March 2025 22:09:08 +0000 (0:00:02.166) 0:00:04.700 ******** 2025-03-22 22:09:11.077035 | orchestrator | ok: [testbed-manager] 2025-03-22 22:09:11.079590 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:09:11.079610 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:09:11.079620 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:09:11.080294 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:09:11.080308 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:09:11.080850 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:09:11.081306 | orchestrator | 2025-03-22 22:09:11.081934 | orchestrator | TASK [osism.commons.network : Create required directories] ********************* 2025-03-22 22:09:11.082308 | orchestrator | Saturday 22 March 2025 22:09:11 +0000 (0:00:02.068) 0:00:06.769 ******** 2025-03-22 22:09:11.709018 | orchestrator | ok: [testbed-manager] => (item=/etc/netplan) 2025-03-22 22:09:11.709156 | orchestrator | ok: [testbed-node-0] => (item=/etc/netplan) 2025-03-22 22:09:12.438458 | orchestrator | ok: [testbed-node-1] => (item=/etc/netplan) 2025-03-22 22:09:12.438934 | orchestrator 
| ok: [testbed-node-2] => (item=/etc/netplan) 2025-03-22 22:09:12.439469 | orchestrator | ok: [testbed-node-3] => (item=/etc/netplan) 2025-03-22 22:09:12.440118 | orchestrator | ok: [testbed-node-4] => (item=/etc/netplan) 2025-03-22 22:09:12.440886 | orchestrator | ok: [testbed-node-5] => (item=/etc/netplan) 2025-03-22 22:09:12.441422 | orchestrator | 2025-03-22 22:09:12.441962 | orchestrator | TASK [osism.commons.network : Prepare netplan configuration template] ********** 2025-03-22 22:09:12.442664 | orchestrator | Saturday 22 March 2025 22:09:12 +0000 (0:00:01.360) 0:00:08.129 ******** 2025-03-22 22:09:14.486325 | orchestrator | ok: [testbed-manager -> localhost] 2025-03-22 22:09:14.486599 | orchestrator | ok: [testbed-node-1 -> localhost] 2025-03-22 22:09:14.486847 | orchestrator | ok: [testbed-node-0 -> localhost] 2025-03-22 22:09:14.487100 | orchestrator | ok: [testbed-node-3 -> localhost] 2025-03-22 22:09:14.491044 | orchestrator | ok: [testbed-node-4 -> localhost] 2025-03-22 22:09:14.491143 | orchestrator | ok: [testbed-node-2 -> localhost] 2025-03-22 22:09:14.491150 | orchestrator | ok: [testbed-node-5 -> localhost] 2025-03-22 22:09:14.491154 | orchestrator | 2025-03-22 22:09:14.491161 | orchestrator | TASK [osism.commons.network : Copy netplan configuration] ********************** 2025-03-22 22:09:14.492318 | orchestrator | Saturday 22 March 2025 22:09:14 +0000 (0:00:02.052) 0:00:10.182 ******** 2025-03-22 22:09:16.373003 | orchestrator | changed: [testbed-manager] 2025-03-22 22:09:16.373478 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:09:16.373490 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:09:16.373723 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:09:16.374259 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:09:16.374945 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:09:16.375041 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:09:16.375678 | orchestrator | 2025-03-22 22:09:16.376104 | orchestrator | TASK [osism.commons.network : Remove netplan configuration template] *********** 2025-03-22 22:09:16.376587 | orchestrator | Saturday 22 March 2025 22:09:16 +0000 (0:00:01.881) 0:00:12.063 ******** 2025-03-22 22:09:17.008920 | orchestrator | ok: [testbed-manager -> localhost] 2025-03-22 22:09:17.132784 | orchestrator | ok: [testbed-node-0 -> localhost] 2025-03-22 22:09:17.736224 | orchestrator | ok: [testbed-node-1 -> localhost] 2025-03-22 22:09:17.736810 | orchestrator | ok: [testbed-node-2 -> localhost] 2025-03-22 22:09:17.737655 | orchestrator | ok: [testbed-node-3 -> localhost] 2025-03-22 22:09:17.738278 | orchestrator | ok: [testbed-node-4 -> localhost] 2025-03-22 22:09:17.738983 | orchestrator | ok: [testbed-node-5 -> localhost] 2025-03-22 22:09:17.739822 | orchestrator | 2025-03-22 22:09:17.740589 | orchestrator | TASK [osism.commons.network : Check if path for interface file exists] ********* 2025-03-22 22:09:17.743673 | orchestrator | Saturday 22 March 2025 22:09:17 +0000 (0:00:01.368) 0:00:13.432 ******** 2025-03-22 22:09:18.248781 | orchestrator | ok: [testbed-manager] 2025-03-22 22:09:18.358892 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:09:19.030516 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:09:19.030705 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:09:19.030734 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:09:19.032011 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:09:19.032385 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:09:19.033231 | orchestrator | 2025-03-22 
22:09:19.034105 | orchestrator | TASK [osism.commons.network : Copy interfaces file] **************************** 2025-03-22 22:09:19.034959 | orchestrator | Saturday 22 March 2025 22:09:19 +0000 (0:00:01.288) 0:00:14.721 ******** 2025-03-22 22:09:19.259378 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:09:19.368836 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:09:19.455900 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:09:19.548754 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:09:19.640525 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:09:19.983556 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:09:19.983994 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:09:19.984034 | orchestrator | 2025-03-22 22:09:19.985808 | orchestrator | TASK [osism.commons.network : Install package networkd-dispatcher] ************* 2025-03-22 22:09:19.986512 | orchestrator | Saturday 22 March 2025 22:09:19 +0000 (0:00:00.955) 0:00:15.677 ******** 2025-03-22 22:09:22.290913 | orchestrator | ok: [testbed-manager] 2025-03-22 22:09:22.292188 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:09:22.292521 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:09:22.292943 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:09:22.294505 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:09:22.295792 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:09:22.296444 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:09:22.297040 | orchestrator | 2025-03-22 22:09:22.298086 | orchestrator | TASK [osism.commons.network : Copy dispatcher scripts] ************************* 2025-03-22 22:09:22.298455 | orchestrator | Saturday 22 March 2025 22:09:22 +0000 (0:00:02.309) 0:00:17.986 ******** 2025-03-22 22:09:24.339384 | orchestrator | changed: [testbed-manager] => (item={'dest': 'routable.d/iptables.sh', 'src': '/opt/configuration/network/iptables.sh'}) 2025-03-22 22:09:24.339525 | orchestrator | changed: [testbed-node-0] => (item={'dest': 'routable.d/vxlan.sh', 'src': '/opt/configuration/network/vxlan.sh'}) 2025-03-22 22:09:24.340529 | orchestrator | changed: [testbed-node-1] => (item={'dest': 'routable.d/vxlan.sh', 'src': '/opt/configuration/network/vxlan.sh'}) 2025-03-22 22:09:24.340890 | orchestrator | changed: [testbed-node-2] => (item={'dest': 'routable.d/vxlan.sh', 'src': '/opt/configuration/network/vxlan.sh'}) 2025-03-22 22:09:24.341120 | orchestrator | changed: [testbed-node-3] => (item={'dest': 'routable.d/vxlan.sh', 'src': '/opt/configuration/network/vxlan.sh'}) 2025-03-22 22:09:24.342135 | orchestrator | changed: [testbed-node-4] => (item={'dest': 'routable.d/vxlan.sh', 'src': '/opt/configuration/network/vxlan.sh'}) 2025-03-22 22:09:24.342776 | orchestrator | changed: [testbed-manager] => (item={'dest': 'routable.d/vxlan.sh', 'src': '/opt/configuration/network/vxlan.sh'}) 2025-03-22 22:09:24.343214 | orchestrator | changed: [testbed-node-5] => (item={'dest': 'routable.d/vxlan.sh', 'src': '/opt/configuration/network/vxlan.sh'}) 2025-03-22 22:09:24.343655 | orchestrator | 2025-03-22 22:09:24.345382 | orchestrator | TASK [osism.commons.network : Manage service networkd-dispatcher] ************** 2025-03-22 22:09:24.345611 | orchestrator | Saturday 22 March 2025 22:09:24 +0000 (0:00:02.045) 0:00:20.031 ******** 2025-03-22 22:09:26.164060 | orchestrator | ok: [testbed-manager] 2025-03-22 22:09:26.164215 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:09:26.165474 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:09:26.166294 | 
orchestrator | changed: [testbed-node-3] 2025-03-22 22:09:26.167218 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:09:26.170869 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:09:26.171376 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:09:26.172183 | orchestrator | 2025-03-22 22:09:26.172835 | orchestrator | TASK [osism.commons.network : Include cleanup tasks] *************************** 2025-03-22 22:09:26.173411 | orchestrator | Saturday 22 March 2025 22:09:26 +0000 (0:00:01.827) 0:00:21.859 ******** 2025-03-22 22:09:27.826655 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/cleanup-netplan.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-03-22 22:09:27.828008 | orchestrator | 2025-03-22 22:09:27.830278 | orchestrator | TASK [osism.commons.network : List existing configuration files] *************** 2025-03-22 22:09:27.831179 | orchestrator | Saturday 22 March 2025 22:09:27 +0000 (0:00:01.658) 0:00:23.518 ******** 2025-03-22 22:09:28.484126 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:09:28.952223 | orchestrator | ok: [testbed-manager] 2025-03-22 22:09:28.952707 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:09:28.953795 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:09:28.954495 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:09:28.955053 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:09:28.955545 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:09:28.956169 | orchestrator | 2025-03-22 22:09:28.956630 | orchestrator | TASK [osism.commons.network : Set network_configured_files fact] *************** 2025-03-22 22:09:28.959690 | orchestrator | Saturday 22 March 2025 22:09:28 +0000 (0:00:01.129) 0:00:24.648 ******** 2025-03-22 22:09:29.162167 | orchestrator | ok: [testbed-manager] 2025-03-22 22:09:29.280761 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:09:29.576567 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:09:29.715891 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:09:29.819842 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:09:29.974528 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:09:29.974608 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:09:29.975410 | orchestrator | 2025-03-22 22:09:29.975925 | orchestrator | TASK [osism.commons.network : Remove unused configuration files] *************** 2025-03-22 22:09:29.979656 | orchestrator | Saturday 22 March 2025 22:09:29 +0000 (0:00:01.017) 0:00:25.665 ******** 2025-03-22 22:09:30.394009 | orchestrator | changed: [testbed-manager] => (item=/etc/netplan/50-cloud-init.yaml) 2025-03-22 22:09:30.394348 | orchestrator | skipping: [testbed-manager] => (item=/etc/netplan/01-osism.yaml)  2025-03-22 22:09:30.549259 | orchestrator | changed: [testbed-node-0] => (item=/etc/netplan/50-cloud-init.yaml) 2025-03-22 22:09:30.672245 | orchestrator | skipping: [testbed-node-0] => (item=/etc/netplan/01-osism.yaml)  2025-03-22 22:09:30.672311 | orchestrator | changed: [testbed-node-1] => (item=/etc/netplan/50-cloud-init.yaml) 2025-03-22 22:09:30.672327 | orchestrator | skipping: [testbed-node-1] => (item=/etc/netplan/01-osism.yaml)  2025-03-22 22:09:31.192223 | orchestrator | changed: [testbed-node-2] => (item=/etc/netplan/50-cloud-init.yaml) 2025-03-22 22:09:31.192425 | orchestrator | skipping: [testbed-node-2] => (item=/etc/netplan/01-osism.yaml)  2025-03-22 22:09:31.193464 | orchestrator | changed: [testbed-node-3] => 
(item=/etc/netplan/50-cloud-init.yaml) 2025-03-22 22:09:31.194855 | orchestrator | skipping: [testbed-node-3] => (item=/etc/netplan/01-osism.yaml)  2025-03-22 22:09:31.195815 | orchestrator | changed: [testbed-node-4] => (item=/etc/netplan/50-cloud-init.yaml) 2025-03-22 22:09:31.196614 | orchestrator | skipping: [testbed-node-4] => (item=/etc/netplan/01-osism.yaml)  2025-03-22 22:09:31.197101 | orchestrator | changed: [testbed-node-5] => (item=/etc/netplan/50-cloud-init.yaml) 2025-03-22 22:09:31.198012 | orchestrator | skipping: [testbed-node-5] => (item=/etc/netplan/01-osism.yaml)  2025-03-22 22:09:31.198566 | orchestrator | 2025-03-22 22:09:31.199373 | orchestrator | TASK [osism.commons.network : Include dummy interfaces] ************************ 2025-03-22 22:09:31.201116 | orchestrator | Saturday 22 March 2025 22:09:31 +0000 (0:00:01.222) 0:00:26.888 ******** 2025-03-22 22:09:31.578106 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:09:31.677569 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:09:31.792044 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:09:31.889118 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:09:31.981415 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:09:33.321040 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:09:33.321191 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:09:33.322008 | orchestrator | 2025-03-22 22:09:33.322720 | orchestrator | RUNNING HANDLER [osism.commons.network : Netplan configuration changed] ******** 2025-03-22 22:09:33.323644 | orchestrator | Saturday 22 March 2025 22:09:33 +0000 (0:00:02.126) 0:00:29.014 ******** 2025-03-22 22:09:33.494464 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:09:33.589896 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:09:33.900597 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:09:33.998147 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:09:34.133360 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:09:34.173936 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:09:34.174610 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:09:34.175132 | orchestrator | 2025-03-22 22:09:34.175654 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 22:09:34.176214 | orchestrator | 2025-03-22 22:09:34 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-22 22:09:34.176451 | orchestrator | 2025-03-22 22:09:34 | INFO  | Please wait and do not abort execution. 
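Editor's note on the netplan cleanup earlier in this play: it follows a find-then-prune pattern, listing every file under /etc/netplan, recording the files the role itself wrote in the network_configured_files fact, and deleting the rest (here 50-cloud-init.yaml goes, 01-osism.yaml stays). A rough sketch of that logic, assuming module arguments that are not taken from the actual osism.commons.network role:

    # Illustrative sketch of the prune step; argument details are assumptions.
    - name: List existing configuration files
      ansible.builtin.find:
        paths: /etc/netplan
        patterns: "*.yaml"
      register: netplan_files

    - name: Remove unused configuration files
      ansible.builtin.file:
        path: "{{ item.path }}"
        state: absent
      loop: "{{ netplan_files.files }}"
      when: item.path not in network_configured_files
      notify: Netplan configuration changed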
2025-03-22 22:09:34.177359 | orchestrator | testbed-manager : ok=16  changed=3  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-03-22 22:09:34.178375 | orchestrator | testbed-node-0 : ok=16  changed=4  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-03-22 22:09:34.179131 | orchestrator | testbed-node-1 : ok=16  changed=4  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-03-22 22:09:34.179987 | orchestrator | testbed-node-2 : ok=16  changed=4  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-03-22 22:09:34.180284 | orchestrator | testbed-node-3 : ok=16  changed=4  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-03-22 22:09:34.180651 | orchestrator | testbed-node-4 : ok=16  changed=4  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-03-22 22:09:34.181176 | orchestrator | testbed-node-5 : ok=16  changed=4  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-03-22 22:09:34.181607 | orchestrator | 2025-03-22 22:09:34.182108 | orchestrator | Saturday 22 March 2025 22:09:34 +0000 (0:00:00.856) 0:00:29.870 ******** 2025-03-22 22:09:34.182640 | orchestrator | =============================================================================== 2025-03-22 22:09:34.183286 | orchestrator | osism.commons.network : Install package networkd-dispatcher ------------- 2.31s 2025-03-22 22:09:34.184042 | orchestrator | osism.commons.network : Install required packages ----------------------- 2.17s 2025-03-22 22:09:34.184148 | orchestrator | osism.commons.network : Include dummy interfaces ------------------------ 2.13s 2025-03-22 22:09:34.185374 | orchestrator | osism.commons.network : Remove ifupdown package ------------------------- 2.07s 2025-03-22 22:09:34.185783 | orchestrator | osism.commons.network : Prepare netplan configuration template ---------- 2.05s 2025-03-22 22:09:34.186659 | orchestrator | osism.commons.network : Copy dispatcher scripts ------------------------- 2.05s 2025-03-22 22:09:34.187586 | orchestrator | osism.commons.network : Copy netplan configuration ---------------------- 1.88s 2025-03-22 22:09:34.187987 | orchestrator | osism.commons.network : Manage service networkd-dispatcher -------------- 1.83s 2025-03-22 22:09:34.188646 | orchestrator | osism.commons.network : Include cleanup tasks --------------------------- 1.66s 2025-03-22 22:09:34.189004 | orchestrator | osism.commons.network : Remove netplan configuration template ----------- 1.37s 2025-03-22 22:09:34.189335 | orchestrator | osism.commons.network : Create required directories --------------------- 1.36s 2025-03-22 22:09:34.189879 | orchestrator | osism.commons.network : Include type specific tasks --------------------- 1.36s 2025-03-22 22:09:34.190422 | orchestrator | osism.commons.network : Check if path for interface file exists --------- 1.29s 2025-03-22 22:09:34.190778 | orchestrator | osism.commons.network : Remove unused configuration files --------------- 1.22s 2025-03-22 22:09:34.191254 | orchestrator | osism.commons.network : List existing configuration files --------------- 1.13s 2025-03-22 22:09:34.191698 | orchestrator | osism.commons.network : Set network_configured_files fact --------------- 1.02s 2025-03-22 22:09:34.192188 | orchestrator | osism.commons.network : Copy interfaces file ---------------------------- 0.96s 2025-03-22 22:09:34.192541 | orchestrator | osism.commons.network : Gather variables for each operating system ------ 0.92s 2025-03-22 22:09:34.193198 | orchestrator | osism.commons.network : Netplan configuration changed 
------------------- 0.86s 2025-03-22 22:09:34.885942 | orchestrator | + osism apply wireguard 2025-03-22 22:09:36.535077 | orchestrator | 2025-03-22 22:09:36 | INFO  | Task 5209d69b-e41f-4e07-8356-19a8a9b83f65 (wireguard) was prepared for execution. 2025-03-22 22:09:40.159936 | orchestrator | 2025-03-22 22:09:36 | INFO  | It takes a moment until task 5209d69b-e41f-4e07-8356-19a8a9b83f65 (wireguard) has been started and output is visible here. 2025-03-22 22:09:40.160056 | orchestrator | 2025-03-22 22:09:40.161971 | orchestrator | PLAY [Apply role wireguard] **************************************************** 2025-03-22 22:09:40.162596 | orchestrator | 2025-03-22 22:09:40.164042 | orchestrator | TASK [osism.services.wireguard : Install iptables package] ********************* 2025-03-22 22:09:40.164789 | orchestrator | Saturday 22 March 2025 22:09:40 +0000 (0:00:00.181) 0:00:00.181 ******** 2025-03-22 22:09:41.974844 | orchestrator | ok: [testbed-manager] 2025-03-22 22:09:41.976924 | orchestrator | 2025-03-22 22:09:41.976963 | orchestrator | TASK [osism.services.wireguard : Install wireguard package] ******************** 2025-03-22 22:09:41.977169 | orchestrator | Saturday 22 March 2025 22:09:41 +0000 (0:00:01.816) 0:00:01.997 ******** 2025-03-22 22:09:49.320803 | orchestrator | changed: [testbed-manager] 2025-03-22 22:09:49.321532 | orchestrator | 2025-03-22 22:09:49.322165 | orchestrator | TASK [osism.services.wireguard : Create public and private key - server] ******* 2025-03-22 22:09:49.324001 | orchestrator | Saturday 22 March 2025 22:09:49 +0000 (0:00:07.346) 0:00:09.344 ******** 2025-03-22 22:09:49.926776 | orchestrator | changed: [testbed-manager] 2025-03-22 22:09:49.927342 | orchestrator | 2025-03-22 22:09:49.928864 | orchestrator | TASK [osism.services.wireguard : Create preshared key] ************************* 2025-03-22 22:09:49.929328 | orchestrator | Saturday 22 March 2025 22:09:49 +0000 (0:00:00.606) 0:00:09.951 ******** 2025-03-22 22:09:50.427347 | orchestrator | changed: [testbed-manager] 2025-03-22 22:09:50.427540 | orchestrator | 2025-03-22 22:09:50.427571 | orchestrator | TASK [osism.services.wireguard : Get preshared key] **************************** 2025-03-22 22:09:50.428203 | orchestrator | Saturday 22 March 2025 22:09:50 +0000 (0:00:00.498) 0:00:10.450 ******** 2025-03-22 22:09:51.023465 | orchestrator | ok: [testbed-manager] 2025-03-22 22:09:51.026006 | orchestrator | 2025-03-22 22:09:51.634178 | orchestrator | TASK [osism.services.wireguard : Get public key - server] ********************** 2025-03-22 22:09:51.634347 | orchestrator | Saturday 22 March 2025 22:09:51 +0000 (0:00:00.596) 0:00:11.046 ******** 2025-03-22 22:09:51.634385 | orchestrator | ok: [testbed-manager] 2025-03-22 22:09:51.635250 | orchestrator | 2025-03-22 22:09:51.635287 | orchestrator | TASK [osism.services.wireguard : Get private key - server] ********************* 2025-03-22 22:09:51.635590 | orchestrator | Saturday 22 March 2025 22:09:51 +0000 (0:00:00.611) 0:00:11.657 ******** 2025-03-22 22:09:52.074525 | orchestrator | ok: [testbed-manager] 2025-03-22 22:09:52.075416 | orchestrator | 2025-03-22 22:09:52.076344 | orchestrator | TASK [osism.services.wireguard : Copy wg0.conf configuration file] ************* 2025-03-22 22:09:52.077594 | orchestrator | Saturday 22 March 2025 22:09:52 +0000 (0:00:00.440) 0:00:12.098 ******** 2025-03-22 22:09:53.449423 | orchestrator | changed: [testbed-manager] 2025-03-22 22:09:53.449562 | orchestrator | 2025-03-22 22:09:53.449589 | orchestrator | TASK 
[osism.services.wireguard : Copy client configuration files] ************** 2025-03-22 22:09:53.450784 | orchestrator | Saturday 22 March 2025 22:09:53 +0000 (0:00:01.374) 0:00:13.472 ******** 2025-03-22 22:09:54.418364 | orchestrator | changed: [testbed-manager] => (item=None) 2025-03-22 22:09:54.419127 | orchestrator | changed: [testbed-manager] 2025-03-22 22:09:54.419487 | orchestrator | 2025-03-22 22:09:54.419991 | orchestrator | TASK [osism.services.wireguard : Manage wg-quick@wg0.service service] ********** 2025-03-22 22:09:54.420593 | orchestrator | Saturday 22 March 2025 22:09:54 +0000 (0:00:00.970) 0:00:14.443 ******** 2025-03-22 22:09:56.280692 | orchestrator | changed: [testbed-manager] 2025-03-22 22:09:56.281216 | orchestrator | 2025-03-22 22:09:56.281278 | orchestrator | RUNNING HANDLER [osism.services.wireguard : Restart wg0 service] *************** 2025-03-22 22:09:56.282997 | orchestrator | Saturday 22 March 2025 22:09:56 +0000 (0:00:01.860) 0:00:16.304 ******** 2025-03-22 22:09:57.254552 | orchestrator | changed: [testbed-manager] 2025-03-22 22:09:57.254953 | orchestrator | 2025-03-22 22:09:57.257154 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 22:09:57.257737 | orchestrator | 2025-03-22 22:09:57 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-22 22:09:57.257845 | orchestrator | 2025-03-22 22:09:57 | INFO  | Please wait and do not abort execution. 2025-03-22 22:09:57.259284 | orchestrator | testbed-manager : ok=11  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 22:09:57.260402 | orchestrator | 2025-03-22 22:09:57.260842 | orchestrator | Saturday 22 March 2025 22:09:57 +0000 (0:00:00.973) 0:00:17.277 ******** 2025-03-22 22:09:57.261877 | orchestrator | =============================================================================== 2025-03-22 22:09:57.262783 | orchestrator | osism.services.wireguard : Install wireguard package -------------------- 7.35s 2025-03-22 22:09:57.264667 | orchestrator | osism.services.wireguard : Manage wg-quick@wg0.service service ---------- 1.86s 2025-03-22 22:09:57.265415 | orchestrator | osism.services.wireguard : Install iptables package --------------------- 1.82s 2025-03-22 22:09:57.266702 | orchestrator | osism.services.wireguard : Copy wg0.conf configuration file ------------- 1.37s 2025-03-22 22:09:57.267942 | orchestrator | osism.services.wireguard : Restart wg0 service -------------------------- 0.97s 2025-03-22 22:09:57.268983 | orchestrator | osism.services.wireguard : Copy client configuration files -------------- 0.97s 2025-03-22 22:09:57.269361 | orchestrator | osism.services.wireguard : Get public key - server ---------------------- 0.61s 2025-03-22 22:09:57.271654 | orchestrator | osism.services.wireguard : Create public and private key - server ------- 0.61s 2025-03-22 22:09:57.272587 | orchestrator | osism.services.wireguard : Get preshared key ---------------------------- 0.60s 2025-03-22 22:09:57.273123 | orchestrator | osism.services.wireguard : Create preshared key ------------------------- 0.50s 2025-03-22 22:09:57.274514 | orchestrator | osism.services.wireguard : Get private key - server --------------------- 0.44s 2025-03-22 22:09:57.911001 | orchestrator | + sh -c /opt/configuration/scripts/prepare-wireguard-configuration.sh 2025-03-22 22:09:57.951125 | orchestrator | % Total % Received % Xferd Average Speed Time Time Time Current 2025-03-22 22:09:58.027075 | orchestrator | 
Dload Upload Total Spent Left Speed 2025-03-22 22:09:58.027165 | orchestrator | 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 14 100 14 0 0 183 0 --:--:-- --:--:-- --:--:-- 184 2025-03-22 22:09:58.043481 | orchestrator | + osism apply --environment custom workarounds 2025-03-22 22:09:59.631809 | orchestrator | 2025-03-22 22:09:59 | INFO  | Trying to run play workarounds in environment custom 2025-03-22 22:09:59.704045 | orchestrator | 2025-03-22 22:09:59 | INFO  | Task 44623117-86cc-4ff8-befc-62741a4103c0 (workarounds) was prepared for execution. 2025-03-22 22:10:03.124248 | orchestrator | 2025-03-22 22:09:59 | INFO  | It takes a moment until task 44623117-86cc-4ff8-befc-62741a4103c0 (workarounds) has been started and output is visible here. 2025-03-22 22:10:03.124356 | orchestrator | 2025-03-22 22:10:03.125351 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-03-22 22:10:03.126722 | orchestrator | 2025-03-22 22:10:03.128885 | orchestrator | TASK [Group hosts based on virtualization_role] ******************************** 2025-03-22 22:10:03.130144 | orchestrator | Saturday 22 March 2025 22:10:03 +0000 (0:00:00.165) 0:00:00.165 ******** 2025-03-22 22:10:03.317718 | orchestrator | changed: [testbed-manager] => (item=virtualization_role_guest) 2025-03-22 22:10:03.416624 | orchestrator | changed: [testbed-node-3] => (item=virtualization_role_guest) 2025-03-22 22:10:03.521082 | orchestrator | changed: [testbed-node-4] => (item=virtualization_role_guest) 2025-03-22 22:10:03.635510 | orchestrator | changed: [testbed-node-5] => (item=virtualization_role_guest) 2025-03-22 22:10:03.722918 | orchestrator | changed: [testbed-node-0] => (item=virtualization_role_guest) 2025-03-22 22:10:04.033822 | orchestrator | changed: [testbed-node-1] => (item=virtualization_role_guest) 2025-03-22 22:10:04.034977 | orchestrator | changed: [testbed-node-2] => (item=virtualization_role_guest) 2025-03-22 22:10:04.035436 | orchestrator | 2025-03-22 22:10:04.036371 | orchestrator | PLAY [Apply netplan configuration on the manager node] ************************* 2025-03-22 22:10:04.036952 | orchestrator | 2025-03-22 22:10:04.037698 | orchestrator | TASK [Apply netplan configuration] ********************************************* 2025-03-22 22:10:04.038077 | orchestrator | Saturday 22 March 2025 22:10:04 +0000 (0:00:00.909) 0:00:01.074 ******** 2025-03-22 22:10:06.929968 | orchestrator | ok: [testbed-manager] 2025-03-22 22:10:06.932696 | orchestrator | 2025-03-22 22:10:09.012854 | orchestrator | PLAY [Apply netplan configuration on all other nodes] ************************** 2025-03-22 22:10:09.013024 | orchestrator | 2025-03-22 22:10:09.013045 | orchestrator | TASK [Apply netplan configuration] ********************************************* 2025-03-22 22:10:09.013061 | orchestrator | Saturday 22 March 2025 22:10:06 +0000 (0:00:02.892) 0:00:03.967 ******** 2025-03-22 22:10:09.013090 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:10:09.013171 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:10:09.013722 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:10:09.014394 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:10:09.015573 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:10:09.015978 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:10:09.019567 | orchestrator | 2025-03-22 22:10:10.661205 | orchestrator | PLAY [Add custom CA certificates to non-manager nodes] ************************* 2025-03-22 22:10:10.661276 | orchestrator | 2025-03-22 
22:10:10.661287 | orchestrator | TASK [Copy custom CA certificates] ********************************************* 2025-03-22 22:10:10.661295 | orchestrator | Saturday 22 March 2025 22:10:09 +0000 (0:00:02.084) 0:00:06.051 ******** 2025-03-22 22:10:10.661311 | orchestrator | changed: [testbed-node-5] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt) 2025-03-22 22:10:10.661587 | orchestrator | changed: [testbed-node-0] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt) 2025-03-22 22:10:10.661746 | orchestrator | changed: [testbed-node-3] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt) 2025-03-22 22:10:10.662135 | orchestrator | changed: [testbed-node-4] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt) 2025-03-22 22:10:10.663032 | orchestrator | changed: [testbed-node-1] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt) 2025-03-22 22:10:10.663454 | orchestrator | changed: [testbed-node-2] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt) 2025-03-22 22:10:10.663485 | orchestrator | 2025-03-22 22:10:10.664998 | orchestrator | TASK [Run update-ca-certificates] ********************************************** 2025-03-22 22:10:10.665426 | orchestrator | Saturday 22 March 2025 22:10:10 +0000 (0:00:01.649) 0:00:07.700 ******** 2025-03-22 22:10:13.844925 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:10:13.845069 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:10:13.845769 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:10:13.846403 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:10:13.847599 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:10:13.848246 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:10:13.849016 | orchestrator | 2025-03-22 22:10:13.849645 | orchestrator | TASK [Run update-ca-trust] ***************************************************** 2025-03-22 22:10:13.850176 | orchestrator | Saturday 22 March 2025 22:10:13 +0000 (0:00:03.185) 0:00:10.886 ******** 2025-03-22 22:10:14.013815 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:10:14.115018 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:10:14.217246 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:10:14.538963 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:10:14.718636 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:10:14.719276 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:10:14.719721 | orchestrator | 2025-03-22 22:10:14.720335 | orchestrator | PLAY [Add a workaround service] ************************************************ 2025-03-22 22:10:14.720834 | orchestrator | 2025-03-22 22:10:14.722143 | orchestrator | TASK [Copy workarounds.sh scripts] ********************************************* 2025-03-22 22:10:14.722398 | orchestrator | Saturday 22 March 2025 22:10:14 +0000 (0:00:00.873) 0:00:11.759 ******** 2025-03-22 22:10:16.532139 | orchestrator | changed: [testbed-manager] 2025-03-22 22:10:16.532462 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:10:16.533298 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:10:16.536831 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:10:16.536948 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:10:16.536961 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:10:16.538136 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:10:16.539770 | orchestrator | 2025-03-22 22:10:16.540101 | 
orchestrator | TASK [Copy workarounds systemd unit file] ************************************** 2025-03-22 22:10:16.542216 | orchestrator | Saturday 22 March 2025 22:10:16 +0000 (0:00:01.814) 0:00:13.573 ******** 2025-03-22 22:10:18.333470 | orchestrator | changed: [testbed-manager] 2025-03-22 22:10:18.334126 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:10:18.335305 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:10:18.336279 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:10:18.337049 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:10:18.338402 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:10:18.339912 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:10:18.340355 | orchestrator | 2025-03-22 22:10:18.344502 | orchestrator | TASK [Reload systemd daemon] *************************************************** 2025-03-22 22:10:20.043737 | orchestrator | Saturday 22 March 2025 22:10:18 +0000 (0:00:01.796) 0:00:15.370 ******** 2025-03-22 22:10:20.043870 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:10:20.046922 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:10:20.047352 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:10:20.048781 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:10:20.051416 | orchestrator | ok: [testbed-manager] 2025-03-22 22:10:20.052468 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:10:20.053488 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:10:20.054098 | orchestrator | 2025-03-22 22:10:20.055079 | orchestrator | TASK [Enable workarounds.service (Debian)] ************************************* 2025-03-22 22:10:20.056195 | orchestrator | Saturday 22 March 2025 22:10:20 +0000 (0:00:01.713) 0:00:17.083 ******** 2025-03-22 22:10:22.045144 | orchestrator | changed: [testbed-manager] 2025-03-22 22:10:22.047477 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:10:22.048095 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:10:22.048343 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:10:22.049919 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:10:22.050526 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:10:22.053180 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:10:22.053975 | orchestrator | 2025-03-22 22:10:22.054625 | orchestrator | TASK [Enable and start workarounds.service (RedHat)] *************************** 2025-03-22 22:10:22.054984 | orchestrator | Saturday 22 March 2025 22:10:22 +0000 (0:00:02.001) 0:00:19.085 ******** 2025-03-22 22:10:22.231747 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:10:22.337261 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:10:22.418157 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:10:22.513513 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:10:22.785346 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:10:22.958070 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:10:22.959325 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:10:22.960368 | orchestrator | 2025-03-22 22:10:22.961989 | orchestrator | PLAY [On Ubuntu 24.04 install python3-docker from Debian Sid] ****************** 2025-03-22 22:10:22.963528 | orchestrator | 2025-03-22 22:10:22.964665 | orchestrator | TASK [Install python3-docker] ************************************************** 2025-03-22 22:10:22.965444 | orchestrator | Saturday 22 March 2025 22:10:22 +0000 (0:00:00.914) 0:00:20.000 ******** 2025-03-22 22:10:25.629098 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:10:25.629324 
| orchestrator | ok: [testbed-node-0] 2025-03-22 22:10:25.629360 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:10:25.631310 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:10:25.632086 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:10:25.633975 | orchestrator | ok: [testbed-manager] 2025-03-22 22:10:25.634884 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:10:25.635598 | orchestrator | 2025-03-22 22:10:25.636614 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 22:10:25.636894 | orchestrator | 2025-03-22 22:10:25 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-22 22:10:25.636963 | orchestrator | 2025-03-22 22:10:25 | INFO  | Please wait and do not abort execution. 2025-03-22 22:10:25.637884 | orchestrator | testbed-manager : ok=7  changed=4  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-22 22:10:25.638554 | orchestrator | testbed-node-0 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:10:25.638896 | orchestrator | testbed-node-1 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:10:25.639287 | orchestrator | testbed-node-2 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:10:25.639655 | orchestrator | testbed-node-3 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:10:25.639865 | orchestrator | testbed-node-4 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:10:25.640246 | orchestrator | testbed-node-5 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:10:25.640634 | orchestrator | 2025-03-22 22:10:25.641096 | orchestrator | Saturday 22 March 2025 22:10:25 +0000 (0:00:02.668) 0:00:22.669 ******** 2025-03-22 22:10:25.641249 | orchestrator | =============================================================================== 2025-03-22 22:10:25.641676 | orchestrator | Run update-ca-certificates ---------------------------------------------- 3.19s 2025-03-22 22:10:25.641870 | orchestrator | Apply netplan configuration --------------------------------------------- 2.89s 2025-03-22 22:10:25.642208 | orchestrator | Install python3-docker -------------------------------------------------- 2.67s 2025-03-22 22:10:25.642456 | orchestrator | Apply netplan configuration --------------------------------------------- 2.08s 2025-03-22 22:10:25.642892 | orchestrator | Enable workarounds.service (Debian) ------------------------------------- 2.00s 2025-03-22 22:10:25.643296 | orchestrator | Copy workarounds.sh scripts --------------------------------------------- 1.81s 2025-03-22 22:10:25.643393 | orchestrator | Copy workarounds systemd unit file -------------------------------------- 1.80s 2025-03-22 22:10:25.644088 | orchestrator | Reload systemd daemon --------------------------------------------------- 1.71s 2025-03-22 22:10:25.644680 | orchestrator | Copy custom CA certificates --------------------------------------------- 1.65s 2025-03-22 22:10:25.644871 | orchestrator | Enable and start workarounds.service (RedHat) --------------------------- 0.91s 2025-03-22 22:10:25.645500 | orchestrator | Group hosts based on virtualization_role -------------------------------- 0.91s 2025-03-22 22:10:25.645585 | orchestrator | Run update-ca-trust ----------------------------------------------------- 0.87s 2025-03-22 22:10:26.354568 | orchestrator | + osism 
apply reboot -l testbed-nodes -e ireallymeanit=yes 2025-03-22 22:10:28.060744 | orchestrator | 2025-03-22 22:10:28 | INFO  | Task e56bc070-a301-493a-8ae2-ad64d4798742 (reboot) was prepared for execution. 2025-03-22 22:10:31.761837 | orchestrator | 2025-03-22 22:10:28 | INFO  | It takes a moment until task e56bc070-a301-493a-8ae2-ad64d4798742 (reboot) has been started and output is visible here. 2025-03-22 22:10:31.762072 | orchestrator | 2025-03-22 22:10:31.762174 | orchestrator | PLAY [Reboot systems] ********************************************************** 2025-03-22 22:10:31.762474 | orchestrator | 2025-03-22 22:10:31.762503 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2025-03-22 22:10:31.762526 | orchestrator | Saturday 22 March 2025 22:10:31 +0000 (0:00:00.183) 0:00:00.183 ******** 2025-03-22 22:10:31.856905 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:10:31.857029 | orchestrator | 2025-03-22 22:10:31.858965 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2025-03-22 22:10:31.859000 | orchestrator | Saturday 22 March 2025 22:10:31 +0000 (0:00:00.097) 0:00:00.281 ******** 2025-03-22 22:10:32.802305 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:10:32.802930 | orchestrator | 2025-03-22 22:10:32.803968 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2025-03-22 22:10:32.804904 | orchestrator | Saturday 22 March 2025 22:10:32 +0000 (0:00:00.943) 0:00:01.225 ******** 2025-03-22 22:10:32.912642 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:10:32.913413 | orchestrator | 2025-03-22 22:10:32.917915 | orchestrator | PLAY [Reboot systems] ********************************************************** 2025-03-22 22:10:32.918874 | orchestrator | 2025-03-22 22:10:32.919898 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2025-03-22 22:10:32.920669 | orchestrator | Saturday 22 March 2025 22:10:32 +0000 (0:00:00.110) 0:00:01.335 ******** 2025-03-22 22:10:33.018189 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:10:33.020759 | orchestrator | 2025-03-22 22:10:33.021089 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2025-03-22 22:10:33.021123 | orchestrator | Saturday 22 March 2025 22:10:33 +0000 (0:00:00.104) 0:00:01.440 ******** 2025-03-22 22:10:33.748497 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:10:33.748970 | orchestrator | 2025-03-22 22:10:33.750067 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2025-03-22 22:10:33.751203 | orchestrator | Saturday 22 March 2025 22:10:33 +0000 (0:00:00.731) 0:00:02.172 ******** 2025-03-22 22:10:33.876596 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:10:33.877528 | orchestrator | 2025-03-22 22:10:33.878484 | orchestrator | PLAY [Reboot systems] ********************************************************** 2025-03-22 22:10:33.878904 | orchestrator | 2025-03-22 22:10:33.880488 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2025-03-22 22:10:34.000670 | orchestrator | Saturday 22 March 2025 22:10:33 +0000 (0:00:00.124) 0:00:02.297 ******** 2025-03-22 22:10:34.000753 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:10:34.001128 | orchestrator | 2025-03-22 22:10:34.002194 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] 
****************** 2025-03-22 22:10:34.005749 | orchestrator | Saturday 22 March 2025 22:10:33 +0000 (0:00:00.127) 0:00:02.425 ******** 2025-03-22 22:10:34.809634 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:10:34.810398 | orchestrator | 2025-03-22 22:10:34.810441 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2025-03-22 22:10:34.810765 | orchestrator | Saturday 22 March 2025 22:10:34 +0000 (0:00:00.806) 0:00:03.232 ******** 2025-03-22 22:10:34.943406 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:10:34.943523 | orchestrator | 2025-03-22 22:10:34.943548 | orchestrator | PLAY [Reboot systems] ********************************************************** 2025-03-22 22:10:34.944242 | orchestrator | 2025-03-22 22:10:34.944469 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2025-03-22 22:10:34.945599 | orchestrator | Saturday 22 March 2025 22:10:34 +0000 (0:00:00.137) 0:00:03.369 ******** 2025-03-22 22:10:35.049602 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:10:35.049700 | orchestrator | 2025-03-22 22:10:35.049722 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2025-03-22 22:10:35.049866 | orchestrator | Saturday 22 March 2025 22:10:35 +0000 (0:00:00.104) 0:00:03.473 ******** 2025-03-22 22:10:35.710249 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:10:35.711062 | orchestrator | 2025-03-22 22:10:35.711733 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2025-03-22 22:10:35.713152 | orchestrator | Saturday 22 March 2025 22:10:35 +0000 (0:00:00.658) 0:00:04.132 ******** 2025-03-22 22:10:35.842246 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:10:35.842881 | orchestrator | 2025-03-22 22:10:35.844420 | orchestrator | PLAY [Reboot systems] ********************************************************** 2025-03-22 22:10:35.845027 | orchestrator | 2025-03-22 22:10:35.845045 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2025-03-22 22:10:35.845849 | orchestrator | Saturday 22 March 2025 22:10:35 +0000 (0:00:00.129) 0:00:04.262 ******** 2025-03-22 22:10:35.955611 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:10:35.956661 | orchestrator | 2025-03-22 22:10:35.958261 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2025-03-22 22:10:35.959521 | orchestrator | Saturday 22 March 2025 22:10:35 +0000 (0:00:00.117) 0:00:04.380 ******** 2025-03-22 22:10:36.563505 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:10:36.563836 | orchestrator | 2025-03-22 22:10:36.564931 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2025-03-22 22:10:36.566869 | orchestrator | Saturday 22 March 2025 22:10:36 +0000 (0:00:00.607) 0:00:04.987 ******** 2025-03-22 22:10:36.683329 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:10:36.683762 | orchestrator | 2025-03-22 22:10:36.685178 | orchestrator | PLAY [Reboot systems] ********************************************************** 2025-03-22 22:10:36.686064 | orchestrator | 2025-03-22 22:10:36.688018 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2025-03-22 22:10:36.776253 | orchestrator | Saturday 22 March 2025 22:10:36 +0000 (0:00:00.116) 0:00:05.104 ******** 2025-03-22 22:10:36.776375 | orchestrator | skipping: 
[testbed-node-5] 2025-03-22 22:10:36.776443 | orchestrator | 2025-03-22 22:10:36.777001 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2025-03-22 22:10:36.777539 | orchestrator | Saturday 22 March 2025 22:10:36 +0000 (0:00:00.096) 0:00:05.201 ******** 2025-03-22 22:10:37.471473 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:10:37.472211 | orchestrator | 2025-03-22 22:10:37.473899 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2025-03-22 22:10:37.474840 | orchestrator | Saturday 22 March 2025 22:10:37 +0000 (0:00:00.694) 0:00:05.895 ******** 2025-03-22 22:10:37.511030 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:10:37.511553 | orchestrator | 2025-03-22 22:10:37.512713 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 22:10:37.513781 | orchestrator | 2025-03-22 22:10:37 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-22 22:10:37.514342 | orchestrator | 2025-03-22 22:10:37 | INFO  | Please wait and do not abort execution. 2025-03-22 22:10:37.514393 | orchestrator | testbed-node-0 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:10:37.517598 | orchestrator | testbed-node-1 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:10:37.518883 | orchestrator | testbed-node-2 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:10:37.518918 | orchestrator | testbed-node-3 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:10:37.519256 | orchestrator | testbed-node-4 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:10:37.519282 | orchestrator | testbed-node-5 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:10:37.519304 | orchestrator | 2025-03-22 22:10:37.519985 | orchestrator | Saturday 22 March 2025 22:10:37 +0000 (0:00:00.040) 0:00:05.936 ******** 2025-03-22 22:10:37.520929 | orchestrator | =============================================================================== 2025-03-22 22:10:37.521820 | orchestrator | Reboot system - do not wait for the reboot to complete ------------------ 4.44s 2025-03-22 22:10:37.522384 | orchestrator | Reboot system - wait for the reboot to complete ------------------------- 0.66s 2025-03-22 22:10:37.522524 | orchestrator | Exit playbook, if user did not mean to reboot systems ------------------- 0.65s 2025-03-22 22:10:38.187378 | orchestrator | + osism apply wait-for-connection -l testbed-nodes -e ireallymeanit=yes 2025-03-22 22:10:39.853082 | orchestrator | 2025-03-22 22:10:39 | INFO  | Task ecbe5f30-3b4b-4063-967c-f25faf5b1da2 (wait-for-connection) was prepared for execution. 2025-03-22 22:10:43.605866 | orchestrator | 2025-03-22 22:10:39 | INFO  | It takes a moment until task ecbe5f30-3b4b-4063-967c-f25faf5b1da2 (wait-for-connection) has been started and output is visible here. 
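For orientation, the reboot play queued above follows a common fire-and-forget pattern: a guard task that aborts unless `ireallymeanit=yes` is passed, a reboot issued asynchronously so Ansible does not block on the dropped SSH connection, and an optional wait step (skipped here because a separate wait-for-connection play follows). A minimal sketch of that pattern, using made-up task bodies rather than the actual OSISM playbook source, might look like:

```yaml
# Hypothetical sketch of a "reboot without waiting" play (not the OSISM source).
- name: Reboot systems
  hosts: testbed-nodes
  become: true
  tasks:
    - name: Exit playbook, if user did not mean to reboot systems
      ansible.builtin.fail:
        msg: "Pass -e ireallymeanit=yes to really reboot the systems."
      when: ireallymeanit | default('no') != 'yes'

    - name: Reboot system - do not wait for the reboot to complete
      ansible.builtin.shell: sleep 2 && /sbin/reboot
      async: 1   # fire and forget: return immediately, do not poll the task
      poll: 0
```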
2025-03-22 22:10:43.605975 | orchestrator | 2025-03-22 22:10:43.606578 | orchestrator | PLAY [Wait until remote systems are reachable] ********************************* 2025-03-22 22:10:43.606872 | orchestrator | 2025-03-22 22:10:43.607719 | orchestrator | TASK [Wait until remote system is reachable] *********************************** 2025-03-22 22:10:43.608395 | orchestrator | Saturday 22 March 2025 22:10:43 +0000 (0:00:00.245) 0:00:00.246 ******** 2025-03-22 22:10:55.698488 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:10:55.698617 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:10:55.698637 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:10:55.698652 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:10:55.698666 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:10:55.698680 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:10:55.698694 | orchestrator | 2025-03-22 22:10:55.698709 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 22:10:55.698740 | orchestrator | testbed-node-0 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 22:10:55.698762 | orchestrator | 2025-03-22 22:10:55 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-22 22:10:55.702159 | orchestrator | 2025-03-22 22:10:55 | INFO  | Please wait and do not abort execution. 2025-03-22 22:10:55.702197 | orchestrator | testbed-node-1 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 22:10:56.349612 | orchestrator | testbed-node-2 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 22:10:56.349712 | orchestrator | testbed-node-3 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 22:10:56.349728 | orchestrator | testbed-node-4 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 22:10:56.349743 | orchestrator | testbed-node-5 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 22:10:56.349757 | orchestrator | 2025-03-22 22:10:56.349772 | orchestrator | Saturday 22 March 2025 22:10:55 +0000 (0:00:12.085) 0:00:12.331 ******** 2025-03-22 22:10:56.349787 | orchestrator | =============================================================================== 2025-03-22 22:10:56.349801 | orchestrator | Wait until remote system is reachable ---------------------------------- 12.09s 2025-03-22 22:10:56.349836 | orchestrator | + osism apply hddtemp 2025-03-22 22:10:58.031378 | orchestrator | 2025-03-22 22:10:58 | INFO  | Task bcc81d41-97b4-45c4-8b0e-96bfa5617491 (hddtemp) was prepared for execution. 2025-03-22 22:11:01.969156 | orchestrator | 2025-03-22 22:10:58 | INFO  | It takes a moment until task bcc81d41-97b4-45c4-8b0e-96bfa5617491 (hddtemp) has been started and output is visible here. 
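The wait-for-connection play that runs right after the reboot is essentially a thin wrapper around Ansible's built-in connection probe; a minimal sketch (with hypothetical delay/timeout values, not the actual OSISM playbook) could look like:

```yaml
# Hypothetical sketch of the wait-for-connection pattern (not the OSISM source).
- name: Wait until remote systems are reachable
  hosts: testbed-nodes
  gather_facts: false
  tasks:
    - name: Wait until remote system is reachable
      ansible.builtin.wait_for_connection:
        delay: 5       # give the nodes a moment to actually go down first
        timeout: 600   # fail if a node is not back within 10 minutes
```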
2025-03-22 22:11:01.969292 | orchestrator | 2025-03-22 22:11:01.972617 | orchestrator | PLAY [Apply role hddtemp] ****************************************************** 2025-03-22 22:11:01.972646 | orchestrator | 2025-03-22 22:11:01.972937 | orchestrator | TASK [osism.services.hddtemp : Gather variables for each operating system] ***** 2025-03-22 22:11:01.972960 | orchestrator | Saturday 22 March 2025 22:11:01 +0000 (0:00:00.289) 0:00:00.289 ******** 2025-03-22 22:11:02.132143 | orchestrator | ok: [testbed-manager] 2025-03-22 22:11:02.217202 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:11:02.322962 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:11:02.419350 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:11:02.529554 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:11:02.800258 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:11:02.800664 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:11:02.801235 | orchestrator | 2025-03-22 22:11:02.801862 | orchestrator | TASK [osism.services.hddtemp : Include distribution specific install tasks] **** 2025-03-22 22:11:02.802486 | orchestrator | Saturday 22 March 2025 22:11:02 +0000 (0:00:00.831) 0:00:01.121 ******** 2025-03-22 22:11:04.194665 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/hddtemp/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-03-22 22:11:04.194842 | orchestrator | 2025-03-22 22:11:04.195536 | orchestrator | TASK [osism.services.hddtemp : Remove hddtemp package] ************************* 2025-03-22 22:11:04.195757 | orchestrator | Saturday 22 March 2025 22:11:04 +0000 (0:00:01.390) 0:00:02.512 ******** 2025-03-22 22:11:06.426569 | orchestrator | ok: [testbed-manager] 2025-03-22 22:11:06.426780 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:11:06.427608 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:11:06.428111 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:11:06.428671 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:11:06.430512 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:11:06.431037 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:11:06.431067 | orchestrator | 2025-03-22 22:11:06.431773 | orchestrator | TASK [osism.services.hddtemp : Enable Kernel Module drivetemp] ***************** 2025-03-22 22:11:06.432245 | orchestrator | Saturday 22 March 2025 22:11:06 +0000 (0:00:02.237) 0:00:04.749 ******** 2025-03-22 22:11:07.035973 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:11:07.145625 | orchestrator | changed: [testbed-manager] 2025-03-22 22:11:07.730132 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:11:07.730364 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:11:07.730903 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:11:07.731530 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:11:07.732122 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:11:07.734694 | orchestrator | 2025-03-22 22:11:09.213019 | orchestrator | TASK [osism.services.hddtemp : Check if drivetemp module is available] ********* 2025-03-22 22:11:09.213168 | orchestrator | Saturday 22 March 2025 22:11:07 +0000 (0:00:01.298) 0:00:06.048 ******** 2025-03-22 22:11:09.213246 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:11:09.213438 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:11:09.213469 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:11:09.214089 | orchestrator | ok: [testbed-node-3] 2025-03-22 
22:11:09.214568 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:11:09.215028 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:11:09.215563 | orchestrator | ok: [testbed-manager] 2025-03-22 22:11:09.215790 | orchestrator | 2025-03-22 22:11:09.216738 | orchestrator | TASK [osism.services.hddtemp : Load Kernel Module drivetemp] ******************* 2025-03-22 22:11:09.217857 | orchestrator | Saturday 22 March 2025 22:11:09 +0000 (0:00:01.483) 0:00:07.532 ******** 2025-03-22 22:11:09.514290 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:11:09.615073 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:11:09.708265 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:11:09.814979 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:11:09.973699 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:11:09.974075 | orchestrator | changed: [testbed-manager] 2025-03-22 22:11:09.974094 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:11:09.974106 | orchestrator | 2025-03-22 22:11:09.974362 | orchestrator | TASK [osism.services.hddtemp : Install lm-sensors] ***************************** 2025-03-22 22:11:09.974973 | orchestrator | Saturday 22 March 2025 22:11:09 +0000 (0:00:00.763) 0:00:08.295 ******** 2025-03-22 22:11:24.382647 | orchestrator | changed: [testbed-manager] 2025-03-22 22:11:24.384362 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:11:24.384393 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:11:24.384407 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:11:24.384429 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:11:24.384800 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:11:24.385315 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:11:24.385968 | orchestrator | 2025-03-22 22:11:24.386294 | orchestrator | TASK [osism.services.hddtemp : Include distribution specific service tasks] **** 2025-03-22 22:11:24.386734 | orchestrator | Saturday 22 March 2025 22:11:24 +0000 (0:00:14.401) 0:00:22.697 ******** 2025-03-22 22:11:25.846868 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/hddtemp/tasks/service-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-03-22 22:11:25.847549 | orchestrator | 2025-03-22 22:11:25.847591 | orchestrator | TASK [osism.services.hddtemp : Manage lm-sensors service] ********************** 2025-03-22 22:11:25.848033 | orchestrator | Saturday 22 March 2025 22:11:25 +0000 (0:00:01.468) 0:00:24.166 ******** 2025-03-22 22:11:28.082187 | orchestrator | changed: [testbed-manager] 2025-03-22 22:11:28.082696 | orchestrator | changed: [testbed-node-1] 2025-03-22 22:11:28.083979 | orchestrator | changed: [testbed-node-2] 2025-03-22 22:11:28.084811 | orchestrator | changed: [testbed-node-0] 2025-03-22 22:11:28.088814 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:11:28.089995 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:11:28.091442 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:11:28.092090 | orchestrator | 2025-03-22 22:11:28.093327 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 22:11:28.093391 | orchestrator | 2025-03-22 22:11:28 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-22 22:11:28.094391 | orchestrator | 2025-03-22 22:11:28 | INFO  | Please wait and do not abort execution. 
2025-03-22 22:11:28.094447 | orchestrator | testbed-manager : ok=9  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 22:11:28.095558 | orchestrator | testbed-node-0 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-22 22:11:28.095906 | orchestrator | testbed-node-1 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-22 22:11:28.097124 | orchestrator | testbed-node-2 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-22 22:11:28.097325 | orchestrator | testbed-node-3 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-22 22:11:28.098436 | orchestrator | testbed-node-4 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-22 22:11:28.098923 | orchestrator | testbed-node-5 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-22 22:11:28.099746 | orchestrator | 2025-03-22 22:11:28.100716 | orchestrator | Saturday 22 March 2025 22:11:28 +0000 (0:00:02.238) 0:00:26.404 ******** 2025-03-22 22:11:28.101293 | orchestrator | =============================================================================== 2025-03-22 22:11:28.101716 | orchestrator | osism.services.hddtemp : Install lm-sensors ---------------------------- 14.40s 2025-03-22 22:11:28.102574 | orchestrator | osism.services.hddtemp : Manage lm-sensors service ---------------------- 2.24s 2025-03-22 22:11:28.102840 | orchestrator | osism.services.hddtemp : Remove hddtemp package ------------------------- 2.24s 2025-03-22 22:11:28.103876 | orchestrator | osism.services.hddtemp : Check if drivetemp module is available --------- 1.48s 2025-03-22 22:11:28.104170 | orchestrator | osism.services.hddtemp : Include distribution specific service tasks ---- 1.47s 2025-03-22 22:11:28.104747 | orchestrator | osism.services.hddtemp : Include distribution specific install tasks ---- 1.39s 2025-03-22 22:11:28.105299 | orchestrator | osism.services.hddtemp : Enable Kernel Module drivetemp ----------------- 1.30s 2025-03-22 22:11:28.105590 | orchestrator | osism.services.hddtemp : Gather variables for each operating system ----- 0.83s 2025-03-22 22:11:28.106146 | orchestrator | osism.services.hddtemp : Load Kernel Module drivetemp ------------------- 0.76s 2025-03-22 22:11:28.828729 | orchestrator | + sudo systemctl restart docker-compose@manager 2025-03-22 22:11:30.090592 | orchestrator | + [[ ceph-ansible == \c\e\p\h\-\a\n\s\i\b\l\e ]] 2025-03-22 22:11:30.118626 | orchestrator | + wait_for_container_healthy 60 ceph-ansible 2025-03-22 22:11:30.118678 | orchestrator | + local max_attempts=60 2025-03-22 22:11:30.118695 | orchestrator | + local name=ceph-ansible 2025-03-22 22:11:30.118711 | orchestrator | + local attempt_num=1 2025-03-22 22:11:30.118726 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-03-22 22:11:30.118752 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2025-03-22 22:11:30.118813 | orchestrator | + wait_for_container_healthy 60 kolla-ansible 2025-03-22 22:11:30.118831 | orchestrator | + local max_attempts=60 2025-03-22 22:11:30.118845 | orchestrator | + local name=kolla-ansible 2025-03-22 22:11:30.118860 | orchestrator | + local attempt_num=1 2025-03-22 22:11:30.118878 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' kolla-ansible 2025-03-22 22:11:30.146375 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2025-03-22 22:11:30.176314 | orchestrator | + 
wait_for_container_healthy 60 osism-ansible 2025-03-22 22:11:30.176371 | orchestrator | + local max_attempts=60 2025-03-22 22:11:30.176390 | orchestrator | + local name=osism-ansible 2025-03-22 22:11:30.176405 | orchestrator | + local attempt_num=1 2025-03-22 22:11:30.176421 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' osism-ansible 2025-03-22 22:11:30.176447 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2025-03-22 22:11:30.562695 | orchestrator | + [[ true == \t\r\u\e ]] 2025-03-22 22:11:30.562787 | orchestrator | + sh -c /opt/configuration/scripts/disable-ara.sh 2025-03-22 22:11:30.562817 | orchestrator | ARA in ceph-ansible already disabled. 2025-03-22 22:11:30.909846 | orchestrator | ARA in kolla-ansible already disabled. 2025-03-22 22:11:31.269749 | orchestrator | ARA in osism-ansible already disabled. 2025-03-22 22:11:31.597674 | orchestrator | ARA in osism-kubernetes already disabled. 2025-03-22 22:11:31.598073 | orchestrator | + osism apply gather-facts 2025-03-22 22:11:33.365698 | orchestrator | 2025-03-22 22:11:33 | INFO  | Task a24a5425-7a07-49ae-893c-1399241d9f42 (gather-facts) was prepared for execution. 2025-03-22 22:11:37.305600 | orchestrator | 2025-03-22 22:11:33 | INFO  | It takes a moment until task a24a5425-7a07-49ae-893c-1399241d9f42 (gather-facts) has been started and output is visible here. 2025-03-22 22:11:37.305730 | orchestrator | 2025-03-22 22:11:37.305850 | orchestrator | PLAY [Gather facts for all hosts] ********************************************** 2025-03-22 22:11:37.307698 | orchestrator | 2025-03-22 22:11:37.313370 | orchestrator | TASK [Gathers facts about hosts] *********************************************** 2025-03-22 22:11:42.387750 | orchestrator | Saturday 22 March 2025 22:11:37 +0000 (0:00:00.179) 0:00:00.179 ******** 2025-03-22 22:11:42.387927 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:11:42.388004 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:11:42.388027 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:11:42.388288 | orchestrator | ok: [testbed-manager] 2025-03-22 22:11:42.388634 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:11:42.389462 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:11:42.390500 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:11:42.393563 | orchestrator | 2025-03-22 22:11:42.393597 | orchestrator | PLAY [Gather facts for all hosts if using --limit] ***************************** 2025-03-22 22:11:42.581281 | orchestrator | 2025-03-22 22:11:42.581387 | orchestrator | TASK [Gather facts for all hosts] ********************************************** 2025-03-22 22:11:42.581404 | orchestrator | Saturday 22 March 2025 22:11:42 +0000 (0:00:05.083) 0:00:05.263 ******** 2025-03-22 22:11:42.581433 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:11:42.680478 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:11:42.775010 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:11:42.867626 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:11:42.952422 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:11:42.989686 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:11:42.990107 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:11:42.990508 | orchestrator | 2025-03-22 22:11:42.991040 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 22:11:42.991322 | orchestrator | 2025-03-22 22:11:42 | INFO  | Play has been completed. 
There may now be a delay until all logs have been written. 2025-03-22 22:11:42.992028 | orchestrator | 2025-03-22 22:11:42 | INFO  | Please wait and do not abort execution. 2025-03-22 22:11:42.992067 | orchestrator | testbed-manager : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-22 22:11:42.993050 | orchestrator | testbed-node-0 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-22 22:11:42.993559 | orchestrator | testbed-node-1 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-22 22:11:42.993594 | orchestrator | testbed-node-2 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-22 22:11:42.993829 | orchestrator | testbed-node-3 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-22 22:11:42.993917 | orchestrator | testbed-node-4 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-22 22:11:42.994144 | orchestrator | testbed-node-5 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-03-22 22:11:42.994437 | orchestrator | 2025-03-22 22:11:42.994684 | orchestrator | Saturday 22 March 2025 22:11:42 +0000 (0:00:00.604) 0:00:05.867 ******** 2025-03-22 22:11:42.995239 | orchestrator | =============================================================================== 2025-03-22 22:11:42.995329 | orchestrator | Gathers facts about hosts ----------------------------------------------- 5.08s 2025-03-22 22:11:43.775155 | orchestrator | Gather facts for all hosts ---------------------------------------------- 0.60s 2025-03-22 22:11:43.775328 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/001-helpers.sh /usr/local/bin/deploy-helper 2025-03-22 22:11:43.789516 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/500-kubernetes.sh /usr/local/bin/deploy-kubernetes 2025-03-22 22:11:43.803344 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/510-clusterapi.sh /usr/local/bin/deploy-kubernetes-clusterapi 2025-03-22 22:11:43.819534 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/100-ceph-with-ansible.sh /usr/local/bin/deploy-ceph-with-ansible 2025-03-22 22:11:43.831975 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/100-ceph-with-rook.sh /usr/local/bin/deploy-ceph-with-rook 2025-03-22 22:11:43.846161 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/200-infrastructure.sh /usr/local/bin/deploy-infrastructure 2025-03-22 22:11:43.864453 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/300-openstack.sh /usr/local/bin/deploy-openstack 2025-03-22 22:11:43.881177 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/400-monitoring.sh /usr/local/bin/deploy-monitoring 2025-03-22 22:11:43.895415 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/500-kubernetes.sh /usr/local/bin/upgrade-kubernetes 2025-03-22 22:11:43.910107 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/510-clusterapi.sh /usr/local/bin/upgrade-kubernetes-clusterapi 2025-03-22 22:11:43.925732 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/100-ceph-with-ansible.sh /usr/local/bin/upgrade-ceph-with-ansible 2025-03-22 22:11:43.941271 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/100-ceph-with-rook.sh /usr/local/bin/upgrade-ceph-with-rook 2025-03-22 22:11:43.953466 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/200-infrastructure.sh 
/usr/local/bin/upgrade-infrastructure 2025-03-22 22:11:43.965293 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/300-openstack.sh /usr/local/bin/upgrade-openstack 2025-03-22 22:11:43.978316 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/400-monitoring.sh /usr/local/bin/upgrade-monitoring 2025-03-22 22:11:43.992105 | orchestrator | + sudo ln -sf /opt/configuration/scripts/bootstrap/300-openstack.sh /usr/local/bin/bootstrap-openstack 2025-03-22 22:11:44.011345 | orchestrator | + sudo ln -sf /opt/configuration/scripts/bootstrap/301-openstack-octavia-amhpora-image.sh /usr/local/bin/bootstrap-octavia 2025-03-22 22:11:44.027264 | orchestrator | + sudo ln -sf /opt/configuration/scripts/bootstrap/302-openstack-k8s-clusterapi-images.sh /usr/local/bin/bootstrap-clusterapi 2025-03-22 22:11:44.043377 | orchestrator | + sudo ln -sf /opt/configuration/scripts/disable-local-registry.sh /usr/local/bin/disable-local-registry 2025-03-22 22:11:44.059732 | orchestrator | + sudo ln -sf /opt/configuration/scripts/pull-images.sh /usr/local/bin/pull-images 2025-03-22 22:11:44.076457 | orchestrator | + [[ false == \t\r\u\e ]] 2025-03-22 22:11:44.264361 | orchestrator | changed 2025-03-22 22:11:44.328091 | 2025-03-22 22:11:44.328211 | TASK [Deploy services] 2025-03-22 22:11:44.424860 | orchestrator | skipping: Conditional result was False 2025-03-22 22:11:44.435272 | 2025-03-22 22:11:44.435383 | TASK [Deploy in a nutshell] 2025-03-22 22:11:45.136177 | orchestrator | + set -e 2025-03-22 22:11:45.136362 | orchestrator | + source /opt/configuration/scripts/include.sh 2025-03-22 22:11:45.136383 | orchestrator | ++ export INTERACTIVE=false 2025-03-22 22:11:45.136408 | orchestrator | ++ INTERACTIVE=false 2025-03-22 22:11:45.136435 | orchestrator | ++ export OSISM_APPLY_RETRY=1 2025-03-22 22:11:45.136445 | orchestrator | ++ OSISM_APPLY_RETRY=1 2025-03-22 22:11:45.136454 | orchestrator | + source /opt/manager-vars.sh 2025-03-22 22:11:45.136466 | orchestrator | ++ export NUMBER_OF_NODES=6 2025-03-22 22:11:45.136479 | orchestrator | ++ NUMBER_OF_NODES=6 2025-03-22 22:11:45.136488 | orchestrator | ++ export CEPH_VERSION=quincy 2025-03-22 22:11:45.136496 | orchestrator | ++ CEPH_VERSION=quincy 2025-03-22 22:11:45.136503 | orchestrator | ++ export CONFIGURATION_VERSION=main 2025-03-22 22:11:45.136512 | orchestrator | ++ CONFIGURATION_VERSION=main 2025-03-22 22:11:45.136520 | orchestrator | ++ export MANAGER_VERSION=8.1.0 2025-03-22 22:11:45.136528 | orchestrator | ++ MANAGER_VERSION=8.1.0 2025-03-22 22:11:45.136536 | orchestrator | ++ export OPENSTACK_VERSION=2024.1 2025-03-22 22:11:45.136544 | orchestrator | ++ OPENSTACK_VERSION=2024.1 2025-03-22 22:11:45.136552 | orchestrator | ++ export ARA=false 2025-03-22 22:11:45.136560 | orchestrator | ++ ARA=false 2025-03-22 22:11:45.136568 | orchestrator | ++ export TEMPEST=false 2025-03-22 22:11:45.136576 | orchestrator | ++ TEMPEST=false 2025-03-22 22:11:45.136584 | orchestrator | ++ export IS_ZUUL=true 2025-03-22 22:11:45.136592 | orchestrator | ++ IS_ZUUL=true 2025-03-22 22:11:45.136600 | orchestrator | ++ export MANAGER_PUBLIC_IP_ADDRESS=81.163.192.83 2025-03-22 22:11:45.136609 | orchestrator | ++ MANAGER_PUBLIC_IP_ADDRESS=81.163.192.83 2025-03-22 22:11:45.136617 | orchestrator | ++ export EXTERNAL_API=false 2025-03-22 22:11:45.136626 | orchestrator | ++ EXTERNAL_API=false 2025-03-22 22:11:45.136635 | orchestrator | ++ export IMAGE_USER=ubuntu 2025-03-22 22:11:45.136643 | orchestrator | ++ IMAGE_USER=ubuntu 2025-03-22 22:11:45.136652 | 
orchestrator | ++ export IMAGE_NODE_USER=ubuntu 2025-03-22 22:11:45.136661 | orchestrator | ++ IMAGE_NODE_USER=ubuntu 2025-03-22 22:11:45.136670 | orchestrator | ++ export CEPH_STACK=ceph-ansible 2025-03-22 22:11:45.136682 | orchestrator | ++ CEPH_STACK=ceph-ansible 2025-03-22 22:11:45.136694 | orchestrator | + echo 2025-03-22 22:11:45.137094 | orchestrator | 2025-03-22 22:11:45.137749 | orchestrator | # PULL IMAGES 2025-03-22 22:11:45.137779 | orchestrator | 2025-03-22 22:11:45.137787 | orchestrator | + echo '# PULL IMAGES' 2025-03-22 22:11:45.137796 | orchestrator | + echo 2025-03-22 22:11:45.137809 | orchestrator | ++ semver 8.1.0 7.0.0 2025-03-22 22:11:45.184729 | orchestrator | + [[ 1 -ge 0 ]] 2025-03-22 22:11:46.941935 | orchestrator | + osism apply -r 2 -e custom pull-images 2025-03-22 22:11:46.942100 | orchestrator | 2025-03-22 22:11:46 | INFO  | Trying to run play pull-images in environment custom 2025-03-22 22:11:46.997843 | orchestrator | 2025-03-22 22:11:46 | INFO  | Task ce1ab746-0a69-46ba-b1d2-c9d2e5ba2faf (pull-images) was prepared for execution. 2025-03-22 22:11:46.998057 | orchestrator | 2025-03-22 22:11:46 | INFO  | It takes a moment until task ce1ab746-0a69-46ba-b1d2-c9d2e5ba2faf (pull-images) has been started and output is visible here. 2025-03-22 22:11:50.848490 | orchestrator | 2025-03-22 22:11:50.849680 | orchestrator | PLAY [Pull images] ************************************************************* 2025-03-22 22:11:50.849725 | orchestrator | 2025-03-22 22:11:50.850102 | orchestrator | TASK [Pull keystone image] ***************************************************** 2025-03-22 22:11:50.851495 | orchestrator | Saturday 22 March 2025 22:11:50 +0000 (0:00:00.164) 0:00:00.164 ******** 2025-03-22 22:12:30.182492 | orchestrator | changed: [testbed-manager] 2025-03-22 22:13:29.382688 | orchestrator | 2025-03-22 22:13:29.382846 | orchestrator | TASK [Pull other images] ******************************************************* 2025-03-22 22:13:29.382884 | orchestrator | Saturday 22 March 2025 22:12:30 +0000 (0:00:39.330) 0:00:39.495 ******** 2025-03-22 22:13:29.382936 | orchestrator | changed: [testbed-manager] => (item=aodh) 2025-03-22 22:13:29.383114 | orchestrator | changed: [testbed-manager] => (item=barbican) 2025-03-22 22:13:29.383300 | orchestrator | changed: [testbed-manager] => (item=ceilometer) 2025-03-22 22:13:29.383321 | orchestrator | changed: [testbed-manager] => (item=cinder) 2025-03-22 22:13:29.383372 | orchestrator | changed: [testbed-manager] => (item=common) 2025-03-22 22:13:29.383444 | orchestrator | changed: [testbed-manager] => (item=designate) 2025-03-22 22:13:29.386505 | orchestrator | changed: [testbed-manager] => (item=glance) 2025-03-22 22:13:29.387160 | orchestrator | changed: [testbed-manager] => (item=grafana) 2025-03-22 22:13:29.387784 | orchestrator | changed: [testbed-manager] => (item=horizon) 2025-03-22 22:13:29.388867 | orchestrator | changed: [testbed-manager] => (item=ironic) 2025-03-22 22:13:29.389493 | orchestrator | changed: [testbed-manager] => (item=loadbalancer) 2025-03-22 22:13:29.390335 | orchestrator | changed: [testbed-manager] => (item=magnum) 2025-03-22 22:13:29.390861 | orchestrator | changed: [testbed-manager] => (item=mariadb) 2025-03-22 22:13:29.391898 | orchestrator | changed: [testbed-manager] => (item=memcached) 2025-03-22 22:13:29.392535 | orchestrator | changed: [testbed-manager] => (item=neutron) 2025-03-22 22:13:29.393007 | orchestrator | changed: [testbed-manager] => (item=nova) 2025-03-22 22:13:29.394099 | 
orchestrator | changed: [testbed-manager] => (item=octavia) 2025-03-22 22:13:29.394895 | orchestrator | changed: [testbed-manager] => (item=opensearch) 2025-03-22 22:13:29.395286 | orchestrator | changed: [testbed-manager] => (item=openvswitch) 2025-03-22 22:13:29.396331 | orchestrator | changed: [testbed-manager] => (item=ovn) 2025-03-22 22:13:29.396874 | orchestrator | changed: [testbed-manager] => (item=placement) 2025-03-22 22:13:29.396899 | orchestrator | changed: [testbed-manager] => (item=rabbitmq) 2025-03-22 22:13:29.397493 | orchestrator | changed: [testbed-manager] => (item=redis) 2025-03-22 22:13:29.398237 | orchestrator | changed: [testbed-manager] => (item=skyline) 2025-03-22 22:13:29.398707 | orchestrator | 2025-03-22 22:13:29.399901 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 22:13:29.400522 | orchestrator | 2025-03-22 22:13:29 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-22 22:13:29.400546 | orchestrator | 2025-03-22 22:13:29 | INFO  | Please wait and do not abort execution. 2025-03-22 22:13:29.400566 | orchestrator | testbed-manager : ok=2  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 22:13:29.400651 | orchestrator | 2025-03-22 22:13:29.401396 | orchestrator | Saturday 22 March 2025 22:13:29 +0000 (0:00:59.204) 0:01:38.699 ******** 2025-03-22 22:13:29.401938 | orchestrator | =============================================================================== 2025-03-22 22:13:29.402695 | orchestrator | Pull other images ------------------------------------------------------ 59.20s 2025-03-22 22:13:29.403059 | orchestrator | Pull keystone image ---------------------------------------------------- 39.33s 2025-03-22 22:13:31.708863 | orchestrator | 2025-03-22 22:13:31 | INFO  | Trying to run play wipe-partitions in environment custom 2025-03-22 22:13:31.760427 | orchestrator | 2025-03-22 22:13:31 | INFO  | Task 6b54552e-dcab-49d7-928b-2dffe59f47d2 (wipe-partitions) was prepared for execution. 2025-03-22 22:13:36.029615 | orchestrator | 2025-03-22 22:13:31 | INFO  | It takes a moment until task 6b54552e-dcab-49d7-928b-2dffe59f47d2 (wipe-partitions) has been started and output is visible here. 
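The custom pull-images play above pre-pulls the Kolla service images on the manager so the later deploy steps do not stall on registry downloads; in its simplest form this is a docker_image pull in a loop. A sketch of that idea follows; the registry, image names, and tag are placeholders, not the testbed's actual values:

```yaml
# Hypothetical sketch of pre-pulling Kolla images (registry/name/tag are placeholders).
- name: Pull images
  hosts: testbed-manager
  tasks:
    - name: Pull other images
      community.docker.docker_image:
        name: "registry.example.com/kolla/{{ item }}:{{ openstack_version | default('2024.1') }}"
        source: pull
      loop:
        - aodh
        - barbican
        - cinder
        - glance
        - neutron
        - nova
```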
2025-03-22 22:13:36.029756 | orchestrator | 2025-03-22 22:13:36.030957 | orchestrator | PLAY [Wipe partitions] ********************************************************* 2025-03-22 22:13:36.033848 | orchestrator | 2025-03-22 22:13:36.035853 | orchestrator | TASK [Find all logical devices owned by UID 167] ******************************* 2025-03-22 22:13:36.035901 | orchestrator | Saturday 22 March 2025 22:13:36 +0000 (0:00:00.185) 0:00:00.185 ******** 2025-03-22 22:13:36.881431 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:13:36.881659 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:13:36.881685 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:13:36.881700 | orchestrator | 2025-03-22 22:13:36.881723 | orchestrator | TASK [Remove all rook related logical devices] ********************************* 2025-03-22 22:13:36.881996 | orchestrator | Saturday 22 March 2025 22:13:36 +0000 (0:00:00.848) 0:00:01.034 ******** 2025-03-22 22:13:37.053068 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:13:37.156862 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:13:37.157425 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:13:37.157463 | orchestrator | 2025-03-22 22:13:37.158794 | orchestrator | TASK [Find all logical devices with prefix ceph] ******************************* 2025-03-22 22:13:37.159234 | orchestrator | Saturday 22 March 2025 22:13:37 +0000 (0:00:00.280) 0:00:01.314 ******** 2025-03-22 22:13:37.979241 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:13:37.981922 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:13:37.982122 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:13:37.982426 | orchestrator | 2025-03-22 22:13:37.985679 | orchestrator | TASK [Remove all ceph related logical devices] ********************************* 2025-03-22 22:13:37.985773 | orchestrator | Saturday 22 March 2025 22:13:37 +0000 (0:00:00.818) 0:00:02.133 ******** 2025-03-22 22:13:38.206217 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:13:38.361055 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:13:38.363917 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:13:38.363951 | orchestrator | 2025-03-22 22:13:39.763361 | orchestrator | TASK [Check device availability] *********************************************** 2025-03-22 22:13:39.764133 | orchestrator | Saturday 22 March 2025 22:13:38 +0000 (0:00:00.384) 0:00:02.518 ******** 2025-03-22 22:13:39.764235 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdb) 2025-03-22 22:13:39.764308 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdb) 2025-03-22 22:13:39.764330 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdb) 2025-03-22 22:13:39.764533 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdc) 2025-03-22 22:13:39.764965 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdc) 2025-03-22 22:13:39.765316 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdc) 2025-03-22 22:13:39.766253 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdd) 2025-03-22 22:13:39.766492 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdd) 2025-03-22 22:13:39.767042 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdd) 2025-03-22 22:13:39.767752 | orchestrator | 2025-03-22 22:13:39.769522 | orchestrator | TASK [Wipe partitions with wipefs] ********************************************* 2025-03-22 22:13:39.769765 | orchestrator | Saturday 22 March 2025 22:13:39 +0000 (0:00:01.403) 0:00:03.921 ******** 2025-03-22 22:13:41.225531 | 
orchestrator | ok: [testbed-node-3] => (item=/dev/sdb) 2025-03-22 22:13:41.225692 | orchestrator | ok: [testbed-node-4] => (item=/dev/sdb) 2025-03-22 22:13:41.226827 | orchestrator | ok: [testbed-node-5] => (item=/dev/sdb) 2025-03-22 22:13:41.228729 | orchestrator | ok: [testbed-node-3] => (item=/dev/sdc) 2025-03-22 22:13:41.229044 | orchestrator | ok: [testbed-node-4] => (item=/dev/sdc) 2025-03-22 22:13:41.229075 | orchestrator | ok: [testbed-node-5] => (item=/dev/sdc) 2025-03-22 22:13:41.229558 | orchestrator | ok: [testbed-node-3] => (item=/dev/sdd) 2025-03-22 22:13:41.230216 | orchestrator | ok: [testbed-node-4] => (item=/dev/sdd) 2025-03-22 22:13:41.230399 | orchestrator | ok: [testbed-node-5] => (item=/dev/sdd) 2025-03-22 22:13:41.231068 | orchestrator | 2025-03-22 22:13:41.231384 | orchestrator | TASK [Overwrite first 32M with zeros] ****************************************** 2025-03-22 22:13:41.231730 | orchestrator | Saturday 22 March 2025 22:13:41 +0000 (0:00:01.462) 0:00:05.384 ******** 2025-03-22 22:13:43.702905 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdb) 2025-03-22 22:13:43.703405 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdb) 2025-03-22 22:13:43.703444 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdb) 2025-03-22 22:13:43.703903 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdc) 2025-03-22 22:13:43.704557 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdc) 2025-03-22 22:13:43.705857 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdc) 2025-03-22 22:13:43.709558 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdd) 2025-03-22 22:13:44.456137 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdd) 2025-03-22 22:13:44.456292 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdd) 2025-03-22 22:13:44.456310 | orchestrator | 2025-03-22 22:13:44.456325 | orchestrator | TASK [Reload udev rules] ******************************************************* 2025-03-22 22:13:44.456383 | orchestrator | Saturday 22 March 2025 22:13:43 +0000 (0:00:02.472) 0:00:07.857 ******** 2025-03-22 22:13:44.456443 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:13:44.456523 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:13:44.457215 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:13:44.457803 | orchestrator | 2025-03-22 22:13:44.458860 | orchestrator | TASK [Request device events from the kernel] *********************************** 2025-03-22 22:13:44.462987 | orchestrator | Saturday 22 March 2025 22:13:44 +0000 (0:00:00.757) 0:00:08.614 ******** 2025-03-22 22:13:45.169222 | orchestrator | changed: [testbed-node-3] 2025-03-22 22:13:45.169985 | orchestrator | changed: [testbed-node-4] 2025-03-22 22:13:45.170637 | orchestrator | changed: [testbed-node-5] 2025-03-22 22:13:45.171918 | orchestrator | 2025-03-22 22:13:45.172531 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 22:13:45.173412 | orchestrator | 2025-03-22 22:13:45 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-22 22:13:45.174109 | orchestrator | 2025-03-22 22:13:45 | INFO  | Please wait and do not abort execution. 
2025-03-22 22:13:45.175723 | orchestrator | testbed-node-3 : ok=7  changed=5  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:13:45.177338 | orchestrator | testbed-node-4 : ok=7  changed=5  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:13:45.178317 | orchestrator | testbed-node-5 : ok=7  changed=5  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:13:45.179651 | orchestrator | 2025-03-22 22:13:45.180440 | orchestrator | Saturday 22 March 2025 22:13:45 +0000 (0:00:00.712) 0:00:09.326 ******** 2025-03-22 22:13:45.182055 | orchestrator | =============================================================================== 2025-03-22 22:13:45.183259 | orchestrator | Overwrite first 32M with zeros ------------------------------------------ 2.47s 2025-03-22 22:13:45.183695 | orchestrator | Wipe partitions with wipefs --------------------------------------------- 1.46s 2025-03-22 22:13:45.184170 | orchestrator | Check device availability ----------------------------------------------- 1.40s 2025-03-22 22:13:45.185530 | orchestrator | Find all logical devices owned by UID 167 ------------------------------- 0.85s 2025-03-22 22:13:45.186733 | orchestrator | Find all logical devices with prefix ceph ------------------------------- 0.82s 2025-03-22 22:13:45.187107 | orchestrator | Reload udev rules ------------------------------------------------------- 0.76s 2025-03-22 22:13:45.187493 | orchestrator | Request device events from the kernel ----------------------------------- 0.71s 2025-03-22 22:13:45.188049 | orchestrator | Remove all ceph related logical devices --------------------------------- 0.38s 2025-03-22 22:13:45.188852 | orchestrator | Remove all rook related logical devices --------------------------------- 0.28s 2025-03-22 22:13:48.198830 | orchestrator | 2025-03-22 22:13:48 | INFO  | Task 59501b26-db91-4a2a-9853-364e00d51dd2 (facts) was prepared for execution. 2025-03-22 22:13:53.016637 | orchestrator | 2025-03-22 22:13:48 | INFO  | It takes a moment until task 59501b26-db91-4a2a-9853-364e00d51dd2 (facts) has been started and output is visible here. 
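
For reference, the wipe-partitions play that just completed boils down to roughly the following per storage node. The device list and the 32M figure come straight from the task output; the udev invocations are an assumption about what the last two tasks execute, and the play's actual module arguments are not shown in the log:

# Approximation of the wipe-partitions play, not its actual module arguments.
for dev in /dev/sdb /dev/sdc /dev/sdd; do
    test -b "$dev"                             # "Check device availability"
    wipefs -a "$dev"                           # "Wipe partitions with wipefs"
    dd if=/dev/zero of="$dev" bs=1M count=32   # "Overwrite first 32M with zeros"
done
udevadm control --reload-rules                 # "Reload udev rules" (assumed invocation)
udevadm trigger                                # "Request device events from the kernel" (assumed invocation)
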
2025-03-22 22:13:53.016769 | orchestrator | 2025-03-22 22:13:53.017679 | orchestrator | PLAY [Apply role facts] ******************************************************** 2025-03-22 22:13:53.017705 | orchestrator | 2025-03-22 22:13:53.017724 | orchestrator | TASK [osism.commons.facts : Create custom facts directory] ********************* 2025-03-22 22:13:53.020220 | orchestrator | Saturday 22 March 2025 22:13:53 +0000 (0:00:00.263) 0:00:00.263 ******** 2025-03-22 22:13:54.205465 | orchestrator | ok: [testbed-manager] 2025-03-22 22:13:54.205639 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:13:54.205664 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:13:54.206303 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:13:54.206660 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:13:54.206988 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:13:54.207665 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:13:54.208443 | orchestrator | 2025-03-22 22:13:54.210342 | orchestrator | TASK [osism.commons.facts : Copy fact files] *********************************** 2025-03-22 22:13:54.210450 | orchestrator | Saturday 22 March 2025 22:13:54 +0000 (0:00:01.191) 0:00:01.454 ******** 2025-03-22 22:13:54.381224 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:13:54.464607 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:13:54.537137 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:13:54.643639 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:13:54.717749 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:13:55.513527 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:13:55.513875 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:13:55.513919 | orchestrator | 2025-03-22 22:13:55.514271 | orchestrator | PLAY [Gather facts for all hosts] ********************************************** 2025-03-22 22:13:55.514593 | orchestrator | 2025-03-22 22:13:55.514977 | orchestrator | TASK [Gathers facts about hosts] *********************************************** 2025-03-22 22:13:55.515601 | orchestrator | Saturday 22 March 2025 22:13:55 +0000 (0:00:01.310) 0:00:02.764 ******** 2025-03-22 22:14:00.294061 | orchestrator | ok: [testbed-node-2] 2025-03-22 22:14:00.295397 | orchestrator | ok: [testbed-node-0] 2025-03-22 22:14:00.296757 | orchestrator | ok: [testbed-node-1] 2025-03-22 22:14:00.299499 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:14:00.299587 | orchestrator | ok: [testbed-manager] 2025-03-22 22:14:00.300728 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:14:00.301860 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:14:00.303291 | orchestrator | 2025-03-22 22:14:00.304327 | orchestrator | PLAY [Gather facts for all hosts if using --limit] ***************************** 2025-03-22 22:14:00.305342 | orchestrator | 2025-03-22 22:14:00.306662 | orchestrator | TASK [Gather facts for all hosts] ********************************************** 2025-03-22 22:14:00.675634 | orchestrator | Saturday 22 March 2025 22:14:00 +0000 (0:00:04.780) 0:00:07.545 ******** 2025-03-22 22:14:00.675692 | orchestrator | skipping: [testbed-manager] 2025-03-22 22:14:00.770190 | orchestrator | skipping: [testbed-node-0] 2025-03-22 22:14:00.888600 | orchestrator | skipping: [testbed-node-1] 2025-03-22 22:14:00.979984 | orchestrator | skipping: [testbed-node-2] 2025-03-22 22:14:01.070121 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:01.106171 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:01.107322 | orchestrator | skipping: 
[testbed-node-5] 2025-03-22 22:14:01.108692 | orchestrator | 2025-03-22 22:14:01.110593 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 22:14:01.112285 | orchestrator | 2025-03-22 22:14:01 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-22 22:14:01.112301 | orchestrator | 2025-03-22 22:14:01 | INFO  | Please wait and do not abort execution. 2025-03-22 22:14:01.112311 | orchestrator | testbed-manager : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:14:01.113424 | orchestrator | testbed-node-0 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:14:01.114765 | orchestrator | testbed-node-1 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:14:01.115763 | orchestrator | testbed-node-2 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:14:01.117151 | orchestrator | testbed-node-3 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:14:01.117901 | orchestrator | testbed-node-4 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:14:01.118770 | orchestrator | testbed-node-5 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 22:14:01.119245 | orchestrator | 2025-03-22 22:14:01.119684 | orchestrator | Saturday 22 March 2025 22:14:01 +0000 (0:00:00.813) 0:00:08.358 ******** 2025-03-22 22:14:01.120158 | orchestrator | =============================================================================== 2025-03-22 22:14:01.120654 | orchestrator | Gathers facts about hosts ----------------------------------------------- 4.78s 2025-03-22 22:14:01.121114 | orchestrator | osism.commons.facts : Copy fact files ----------------------------------- 1.31s 2025-03-22 22:14:01.121579 | orchestrator | osism.commons.facts : Create custom facts directory --------------------- 1.19s 2025-03-22 22:14:01.122473 | orchestrator | Gather facts for all hosts ---------------------------------------------- 0.81s 2025-03-22 22:14:03.723320 | orchestrator | 2025-03-22 22:14:03 | INFO  | Task d27f5c59-f954-47de-ae54-9f6020f47ce8 (ceph-configure-lvm-volumes) was prepared for execution. 2025-03-22 22:14:08.446349 | orchestrator | 2025-03-22 22:14:03 | INFO  | It takes a moment until task d27f5c59-f954-47de-ae54-9f6020f47ce8 (ceph-configure-lvm-volumes) has been started and output is visible here. 
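
The ceph-configure-lvm-volumes task announced above generates, per OSD disk, an osd_lvm_uuid and compiles a block-only lvm_volumes entry of the form data: osd-block-<uuid> / data_vg: ceph-<uuid> (the concrete values for testbed-node-3 are printed further down). As an illustration of the on-disk layout those names imply, assuming a later step (or ceph-volume itself) creates one VG and one LV per device rather than this play doing so:

# Illustrative sketch using the osd_lvm_uuid generated for /dev/sdb on testbed-node-3.
UUID="43a113f0-cd75-588c-85b3-7699e063bb3b"
vgcreate "ceph-${UUID}" /dev/sdb                              # one volume group per OSD device
lvcreate -l 100%FREE -n "osd-block-${UUID}" "ceph-${UUID}"    # single block LV, no separate DB/WAL
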
2025-03-22 22:14:08.446457 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.12 2025-03-22 22:14:09.452248 | orchestrator | 2025-03-22 22:14:09.454621 | orchestrator | PLAY [Ceph configure LVM] ****************************************************** 2025-03-22 22:14:09.459590 | orchestrator | 2025-03-22 22:14:09.961654 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2025-03-22 22:14:09.961758 | orchestrator | Saturday 22 March 2025 22:14:09 +0000 (0:00:00.839) 0:00:00.839 ******** 2025-03-22 22:14:09.961791 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] 2025-03-22 22:14:09.961865 | orchestrator | 2025-03-22 22:14:09.967731 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2025-03-22 22:14:10.353117 | orchestrator | Saturday 22 March 2025 22:14:09 +0000 (0:00:00.512) 0:00:01.352 ******** 2025-03-22 22:14:10.353923 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:14:10.355467 | orchestrator | 2025-03-22 22:14:10.358519 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:11.132400 | orchestrator | Saturday 22 March 2025 22:14:10 +0000 (0:00:00.392) 0:00:01.745 ******** 2025-03-22 22:14:11.132512 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop0) 2025-03-22 22:14:11.133135 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop1) 2025-03-22 22:14:11.134247 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop2) 2025-03-22 22:14:11.135278 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop3) 2025-03-22 22:14:11.136025 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop4) 2025-03-22 22:14:11.141116 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop5) 2025-03-22 22:14:11.142127 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop6) 2025-03-22 22:14:11.142155 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop7) 2025-03-22 22:14:11.142196 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sda) 2025-03-22 22:14:11.143302 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdb) 2025-03-22 22:14:11.143766 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdc) 2025-03-22 22:14:11.146347 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdd) 2025-03-22 22:14:11.147430 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sr0) 2025-03-22 22:14:11.147519 | orchestrator | 2025-03-22 22:14:11.147640 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:11.149587 | orchestrator | Saturday 22 March 2025 22:14:11 +0000 (0:00:00.778) 0:00:02.523 ******** 2025-03-22 22:14:11.367022 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:11.367212 | orchestrator | 2025-03-22 22:14:11.367239 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:11.368220 | orchestrator | Saturday 22 March 2025 22:14:11 +0000 
(0:00:00.237) 0:00:02.761 ******** 2025-03-22 22:14:11.587129 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:11.588342 | orchestrator | 2025-03-22 22:14:11.589294 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:11.590524 | orchestrator | Saturday 22 March 2025 22:14:11 +0000 (0:00:00.219) 0:00:02.980 ******** 2025-03-22 22:14:11.861113 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:11.863008 | orchestrator | 2025-03-22 22:14:11.865421 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:11.866161 | orchestrator | Saturday 22 March 2025 22:14:11 +0000 (0:00:00.272) 0:00:03.253 ******** 2025-03-22 22:14:12.118257 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:12.118770 | orchestrator | 2025-03-22 22:14:12.120100 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:12.120491 | orchestrator | Saturday 22 March 2025 22:14:12 +0000 (0:00:00.258) 0:00:03.511 ******** 2025-03-22 22:14:12.346981 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:12.347356 | orchestrator | 2025-03-22 22:14:12.348154 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:12.348447 | orchestrator | Saturday 22 March 2025 22:14:12 +0000 (0:00:00.223) 0:00:03.735 ******** 2025-03-22 22:14:12.609267 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:12.609630 | orchestrator | 2025-03-22 22:14:12.612295 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:12.612329 | orchestrator | Saturday 22 March 2025 22:14:12 +0000 (0:00:00.266) 0:00:04.001 ******** 2025-03-22 22:14:12.825029 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:12.825698 | orchestrator | 2025-03-22 22:14:12.827308 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:12.828240 | orchestrator | Saturday 22 March 2025 22:14:12 +0000 (0:00:00.217) 0:00:04.219 ******** 2025-03-22 22:14:13.091909 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:13.092504 | orchestrator | 2025-03-22 22:14:13.092901 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:13.094121 | orchestrator | Saturday 22 March 2025 22:14:13 +0000 (0:00:00.266) 0:00:04.485 ******** 2025-03-22 22:14:13.805751 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_f0064d2e-b937-487e-87eb-c5dccc0148b3) 2025-03-22 22:14:13.808341 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_f0064d2e-b937-487e-87eb-c5dccc0148b3) 2025-03-22 22:14:13.808675 | orchestrator | 2025-03-22 22:14:13.809403 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:13.812395 | orchestrator | Saturday 22 March 2025 22:14:13 +0000 (0:00:00.713) 0:00:05.198 ******** 2025-03-22 22:14:14.789986 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_637f4fb2-465d-4aa6-a08d-716b8ef59fde) 2025-03-22 22:14:14.790403 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_637f4fb2-465d-4aa6-a08d-716b8ef59fde) 2025-03-22 22:14:14.790670 | orchestrator | 2025-03-22 22:14:14.790698 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 
22:14:14.791010 | orchestrator | Saturday 22 March 2025 22:14:14 +0000 (0:00:00.984) 0:00:06.183 ******** 2025-03-22 22:14:15.350389 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_961657b8-7922-4be7-b7ea-8a6546d88057) 2025-03-22 22:14:15.351386 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_961657b8-7922-4be7-b7ea-8a6546d88057) 2025-03-22 22:14:15.353550 | orchestrator | 2025-03-22 22:14:15.353809 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:15.353914 | orchestrator | Saturday 22 March 2025 22:14:15 +0000 (0:00:00.559) 0:00:06.742 ******** 2025-03-22 22:14:15.899574 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_ffd44099-c386-47c0-8dc0-30cf9a71e0b5) 2025-03-22 22:14:15.900280 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_ffd44099-c386-47c0-8dc0-30cf9a71e0b5) 2025-03-22 22:14:15.900318 | orchestrator | 2025-03-22 22:14:15.900378 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:15.900445 | orchestrator | Saturday 22 March 2025 22:14:15 +0000 (0:00:00.551) 0:00:07.293 ******** 2025-03-22 22:14:16.288150 | orchestrator | ok: [testbed-node-3] => (item=ata-QEMU_DVD-ROM_QM00001) 2025-03-22 22:14:16.289344 | orchestrator | 2025-03-22 22:14:16.292967 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:16.293872 | orchestrator | Saturday 22 March 2025 22:14:16 +0000 (0:00:00.385) 0:00:07.679 ******** 2025-03-22 22:14:16.785254 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop0) 2025-03-22 22:14:16.786128 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop1) 2025-03-22 22:14:16.787651 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop2) 2025-03-22 22:14:16.790094 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop3) 2025-03-22 22:14:16.791369 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop4) 2025-03-22 22:14:16.793251 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop5) 2025-03-22 22:14:16.796337 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop6) 2025-03-22 22:14:16.798608 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop7) 2025-03-22 22:14:16.799860 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sda) 2025-03-22 22:14:16.800917 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdb) 2025-03-22 22:14:16.801642 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdc) 2025-03-22 22:14:16.802387 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdd) 2025-03-22 22:14:16.804987 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sr0) 2025-03-22 22:14:16.805686 | orchestrator | 2025-03-22 22:14:16.806409 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:16.807058 | orchestrator | Saturday 22 March 2025 22:14:16 
+0000 (0:00:00.498) 0:00:08.177 ******** 2025-03-22 22:14:17.163991 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:17.164550 | orchestrator | 2025-03-22 22:14:17.164585 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:17.165711 | orchestrator | Saturday 22 March 2025 22:14:17 +0000 (0:00:00.378) 0:00:08.556 ******** 2025-03-22 22:14:17.448266 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:17.448627 | orchestrator | 2025-03-22 22:14:17.452427 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:17.454119 | orchestrator | Saturday 22 March 2025 22:14:17 +0000 (0:00:00.280) 0:00:08.836 ******** 2025-03-22 22:14:17.696750 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:17.698464 | orchestrator | 2025-03-22 22:14:17.698499 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:17.698721 | orchestrator | Saturday 22 March 2025 22:14:17 +0000 (0:00:00.251) 0:00:09.087 ******** 2025-03-22 22:14:17.944867 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:17.945147 | orchestrator | 2025-03-22 22:14:17.945199 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:17.945235 | orchestrator | Saturday 22 March 2025 22:14:17 +0000 (0:00:00.247) 0:00:09.334 ******** 2025-03-22 22:14:18.690489 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:18.691801 | orchestrator | 2025-03-22 22:14:18.691864 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:18.696075 | orchestrator | Saturday 22 March 2025 22:14:18 +0000 (0:00:00.749) 0:00:10.084 ******** 2025-03-22 22:14:18.973627 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:19.240060 | orchestrator | 2025-03-22 22:14:19.240124 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:19.240141 | orchestrator | Saturday 22 March 2025 22:14:18 +0000 (0:00:00.280) 0:00:10.364 ******** 2025-03-22 22:14:19.240165 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:19.243080 | orchestrator | 2025-03-22 22:14:19.245302 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:19.245621 | orchestrator | Saturday 22 March 2025 22:14:19 +0000 (0:00:00.267) 0:00:10.632 ******** 2025-03-22 22:14:19.493283 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:19.493983 | orchestrator | 2025-03-22 22:14:19.494521 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:19.495059 | orchestrator | Saturday 22 March 2025 22:14:19 +0000 (0:00:00.254) 0:00:10.886 ******** 2025-03-22 22:14:20.328260 | orchestrator | ok: [testbed-node-3] => (item=sda1) 2025-03-22 22:14:20.328689 | orchestrator | ok: [testbed-node-3] => (item=sda14) 2025-03-22 22:14:20.330055 | orchestrator | ok: [testbed-node-3] => (item=sda15) 2025-03-22 22:14:20.330374 | orchestrator | ok: [testbed-node-3] => (item=sda16) 2025-03-22 22:14:20.330942 | orchestrator | 2025-03-22 22:14:20.331707 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:20.332105 | orchestrator | Saturday 22 March 2025 22:14:20 +0000 (0:00:00.834) 0:00:11.721 ******** 2025-03-22 22:14:20.584057 | orchestrator | 
skipping: [testbed-node-3] 2025-03-22 22:14:20.584836 | orchestrator | 2025-03-22 22:14:20.585133 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:20.585421 | orchestrator | Saturday 22 March 2025 22:14:20 +0000 (0:00:00.256) 0:00:11.977 ******** 2025-03-22 22:14:20.820563 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:20.821413 | orchestrator | 2025-03-22 22:14:20.823140 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:20.823879 | orchestrator | Saturday 22 March 2025 22:14:20 +0000 (0:00:00.236) 0:00:12.213 ******** 2025-03-22 22:14:21.080601 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:21.081082 | orchestrator | 2025-03-22 22:14:21.082529 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:21.083780 | orchestrator | Saturday 22 March 2025 22:14:21 +0000 (0:00:00.261) 0:00:12.474 ******** 2025-03-22 22:14:21.338155 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:21.339230 | orchestrator | 2025-03-22 22:14:21.340252 | orchestrator | TASK [Set UUIDs for OSD VGs/LVs] *********************************************** 2025-03-22 22:14:21.341350 | orchestrator | Saturday 22 March 2025 22:14:21 +0000 (0:00:00.254) 0:00:12.729 ******** 2025-03-22 22:14:21.562640 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdb', 'value': None}) 2025-03-22 22:14:21.566356 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdc', 'value': None}) 2025-03-22 22:14:21.569603 | orchestrator | 2025-03-22 22:14:21.569644 | orchestrator | TASK [Generate WAL VG names] *************************************************** 2025-03-22 22:14:21.569665 | orchestrator | Saturday 22 March 2025 22:14:21 +0000 (0:00:00.226) 0:00:12.956 ******** 2025-03-22 22:14:21.960665 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:21.960875 | orchestrator | 2025-03-22 22:14:21.963534 | orchestrator | TASK [Generate DB VG names] **************************************************** 2025-03-22 22:14:21.963764 | orchestrator | Saturday 22 March 2025 22:14:21 +0000 (0:00:00.395) 0:00:13.352 ******** 2025-03-22 22:14:22.129643 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:22.132153 | orchestrator | 2025-03-22 22:14:22.134203 | orchestrator | TASK [Generate shared DB/WAL VG names] ***************************************** 2025-03-22 22:14:22.136203 | orchestrator | Saturday 22 March 2025 22:14:22 +0000 (0:00:00.165) 0:00:13.517 ******** 2025-03-22 22:14:22.312213 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:22.315319 | orchestrator | 2025-03-22 22:14:22.316527 | orchestrator | TASK [Define lvm_volumes structures] ******************************************* 2025-03-22 22:14:22.317570 | orchestrator | Saturday 22 March 2025 22:14:22 +0000 (0:00:00.186) 0:00:13.703 ******** 2025-03-22 22:14:22.551499 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:14:22.553130 | orchestrator | 2025-03-22 22:14:22.556544 | orchestrator | TASK [Generate lvm_volumes structure (block only)] ***************************** 2025-03-22 22:14:22.556797 | orchestrator | Saturday 22 March 2025 22:14:22 +0000 (0:00:00.240) 0:00:13.944 ******** 2025-03-22 22:14:22.875815 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '43a113f0-cd75-588c-85b3-7699e063bb3b'}}) 2025-03-22 22:14:22.876521 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdc', 
'value': {'osd_lvm_uuid': 'e04c708b-d456-5156-8bd3-75c09a375fc5'}}) 2025-03-22 22:14:22.883572 | orchestrator | 2025-03-22 22:14:22.889156 | orchestrator | TASK [Generate lvm_volumes structure (block + db)] ***************************** 2025-03-22 22:14:22.889635 | orchestrator | Saturday 22 March 2025 22:14:22 +0000 (0:00:00.325) 0:00:14.269 ******** 2025-03-22 22:14:23.180822 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '43a113f0-cd75-588c-85b3-7699e063bb3b'}})  2025-03-22 22:14:23.181202 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'e04c708b-d456-5156-8bd3-75c09a375fc5'}})  2025-03-22 22:14:23.181768 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:23.182677 | orchestrator | 2025-03-22 22:14:23.189069 | orchestrator | TASK [Generate lvm_volumes structure (block + wal)] **************************** 2025-03-22 22:14:23.189233 | orchestrator | Saturday 22 March 2025 22:14:23 +0000 (0:00:00.304) 0:00:14.574 ******** 2025-03-22 22:14:23.400513 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '43a113f0-cd75-588c-85b3-7699e063bb3b'}})  2025-03-22 22:14:23.403769 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'e04c708b-d456-5156-8bd3-75c09a375fc5'}})  2025-03-22 22:14:23.404107 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:23.404138 | orchestrator | 2025-03-22 22:14:23.404782 | orchestrator | TASK [Generate lvm_volumes structure (block + db + wal)] *********************** 2025-03-22 22:14:23.404838 | orchestrator | Saturday 22 March 2025 22:14:23 +0000 (0:00:00.214) 0:00:14.789 ******** 2025-03-22 22:14:23.591112 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '43a113f0-cd75-588c-85b3-7699e063bb3b'}})  2025-03-22 22:14:23.592959 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'e04c708b-d456-5156-8bd3-75c09a375fc5'}})  2025-03-22 22:14:23.593243 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:23.593587 | orchestrator | 2025-03-22 22:14:23.593911 | orchestrator | TASK [Compile lvm_volumes] ***************************************************** 2025-03-22 22:14:23.597350 | orchestrator | Saturday 22 March 2025 22:14:23 +0000 (0:00:00.196) 0:00:14.985 ******** 2025-03-22 22:14:23.781147 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:14:23.782989 | orchestrator | 2025-03-22 22:14:23.785351 | orchestrator | TASK [Set OSD devices config data] ********************************************* 2025-03-22 22:14:23.785442 | orchestrator | Saturday 22 March 2025 22:14:23 +0000 (0:00:00.189) 0:00:15.175 ******** 2025-03-22 22:14:23.952039 | orchestrator | ok: [testbed-node-3] 2025-03-22 22:14:23.952532 | orchestrator | 2025-03-22 22:14:23.952571 | orchestrator | TASK [Set DB devices config data] ********************************************** 2025-03-22 22:14:23.953649 | orchestrator | Saturday 22 March 2025 22:14:23 +0000 (0:00:00.168) 0:00:15.344 ******** 2025-03-22 22:14:24.134066 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:24.135482 | orchestrator | 2025-03-22 22:14:24.136332 | orchestrator | TASK [Set WAL devices config data] ********************************************* 2025-03-22 22:14:24.136615 | orchestrator | Saturday 22 March 2025 22:14:24 +0000 (0:00:00.181) 0:00:15.526 ******** 2025-03-22 22:14:24.289652 | orchestrator | skipping: [testbed-node-3] 2025-03-22 
22:14:24.290336 | orchestrator | 2025-03-22 22:14:24.290934 | orchestrator | TASK [Set DB+WAL devices config data] ****************************************** 2025-03-22 22:14:24.291720 | orchestrator | Saturday 22 March 2025 22:14:24 +0000 (0:00:00.157) 0:00:15.684 ******** 2025-03-22 22:14:24.686693 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:24.688770 | orchestrator | 2025-03-22 22:14:24.851965 | orchestrator | TASK [Print ceph_osd_devices] ************************************************** 2025-03-22 22:14:24.852051 | orchestrator | Saturday 22 March 2025 22:14:24 +0000 (0:00:00.396) 0:00:16.080 ******** 2025-03-22 22:14:24.852080 | orchestrator | ok: [testbed-node-3] => { 2025-03-22 22:14:24.853680 | orchestrator |  "ceph_osd_devices": { 2025-03-22 22:14:24.854326 | orchestrator |  "sdb": { 2025-03-22 22:14:24.855534 | orchestrator |  "osd_lvm_uuid": "43a113f0-cd75-588c-85b3-7699e063bb3b" 2025-03-22 22:14:24.857450 | orchestrator |  }, 2025-03-22 22:14:24.858300 | orchestrator |  "sdc": { 2025-03-22 22:14:24.859335 | orchestrator |  "osd_lvm_uuid": "e04c708b-d456-5156-8bd3-75c09a375fc5" 2025-03-22 22:14:24.859380 | orchestrator |  } 2025-03-22 22:14:24.860380 | orchestrator |  } 2025-03-22 22:14:24.861331 | orchestrator | } 2025-03-22 22:14:24.862791 | orchestrator | 2025-03-22 22:14:24.863713 | orchestrator | TASK [Print WAL devices] ******************************************************* 2025-03-22 22:14:24.863936 | orchestrator | Saturday 22 March 2025 22:14:24 +0000 (0:00:00.165) 0:00:16.245 ******** 2025-03-22 22:14:25.009028 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:25.011112 | orchestrator | 2025-03-22 22:14:25.011231 | orchestrator | TASK [Print DB devices] ******************************************************** 2025-03-22 22:14:25.011582 | orchestrator | Saturday 22 March 2025 22:14:25 +0000 (0:00:00.156) 0:00:16.402 ******** 2025-03-22 22:14:25.162286 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:25.162681 | orchestrator | 2025-03-22 22:14:25.163650 | orchestrator | TASK [Print shared DB/WAL devices] ********************************************* 2025-03-22 22:14:25.165429 | orchestrator | Saturday 22 March 2025 22:14:25 +0000 (0:00:00.153) 0:00:16.556 ******** 2025-03-22 22:14:25.317726 | orchestrator | skipping: [testbed-node-3] 2025-03-22 22:14:25.318439 | orchestrator | 2025-03-22 22:14:25.319084 | orchestrator | TASK [Print configuration data] ************************************************ 2025-03-22 22:14:25.319381 | orchestrator | Saturday 22 March 2025 22:14:25 +0000 (0:00:00.154) 0:00:16.710 ******** 2025-03-22 22:14:25.650937 | orchestrator | changed: [testbed-node-3] => { 2025-03-22 22:14:25.652322 | orchestrator |  "_ceph_configure_lvm_config_data": { 2025-03-22 22:14:25.654445 | orchestrator |  "ceph_osd_devices": { 2025-03-22 22:14:25.657415 | orchestrator |  "sdb": { 2025-03-22 22:14:25.658773 | orchestrator |  "osd_lvm_uuid": "43a113f0-cd75-588c-85b3-7699e063bb3b" 2025-03-22 22:14:25.660996 | orchestrator |  }, 2025-03-22 22:14:25.661774 | orchestrator |  "sdc": { 2025-03-22 22:14:25.662827 | orchestrator |  "osd_lvm_uuid": "e04c708b-d456-5156-8bd3-75c09a375fc5" 2025-03-22 22:14:25.663390 | orchestrator |  } 2025-03-22 22:14:25.667111 | orchestrator |  }, 2025-03-22 22:14:25.669635 | orchestrator |  "lvm_volumes": [ 2025-03-22 22:14:25.669972 | orchestrator |  { 2025-03-22 22:14:25.670683 | orchestrator |  "data": "osd-block-43a113f0-cd75-588c-85b3-7699e063bb3b", 2025-03-22 22:14:25.672286 | orchestrator |  
"data_vg": "ceph-43a113f0-cd75-588c-85b3-7699e063bb3b" 2025-03-22 22:14:25.674352 | orchestrator |  }, 2025-03-22 22:14:25.677398 | orchestrator |  { 2025-03-22 22:14:25.678148 | orchestrator |  "data": "osd-block-e04c708b-d456-5156-8bd3-75c09a375fc5", 2025-03-22 22:14:25.679675 | orchestrator |  "data_vg": "ceph-e04c708b-d456-5156-8bd3-75c09a375fc5" 2025-03-22 22:14:25.680110 | orchestrator |  } 2025-03-22 22:14:25.681044 | orchestrator |  ] 2025-03-22 22:14:25.682263 | orchestrator |  } 2025-03-22 22:14:25.683162 | orchestrator | } 2025-03-22 22:14:25.683212 | orchestrator | 2025-03-22 22:14:25.683233 | orchestrator | RUNNING HANDLER [Write configuration file] ************************************* 2025-03-22 22:14:25.683715 | orchestrator | Saturday 22 March 2025 22:14:25 +0000 (0:00:00.332) 0:00:17.043 ******** 2025-03-22 22:14:28.088013 | orchestrator | changed: [testbed-node-3 -> testbed-manager(192.168.16.5)] 2025-03-22 22:14:28.088202 | orchestrator | 2025-03-22 22:14:28.088715 | orchestrator | PLAY [Ceph configure LVM] ****************************************************** 2025-03-22 22:14:28.089124 | orchestrator | 2025-03-22 22:14:28.096704 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2025-03-22 22:14:28.367722 | orchestrator | Saturday 22 March 2025 22:14:28 +0000 (0:00:02.436) 0:00:19.479 ******** 2025-03-22 22:14:28.367846 | orchestrator | ok: [testbed-node-4 -> testbed-manager(192.168.16.5)] 2025-03-22 22:14:28.368679 | orchestrator | 2025-03-22 22:14:28.368718 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2025-03-22 22:14:28.369906 | orchestrator | Saturday 22 March 2025 22:14:28 +0000 (0:00:00.280) 0:00:19.759 ******** 2025-03-22 22:14:28.647614 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:14:28.649013 | orchestrator | 2025-03-22 22:14:28.650307 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:28.651939 | orchestrator | Saturday 22 March 2025 22:14:28 +0000 (0:00:00.279) 0:00:20.039 ******** 2025-03-22 22:14:29.114692 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop0) 2025-03-22 22:14:29.115396 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop1) 2025-03-22 22:14:29.116143 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop2) 2025-03-22 22:14:29.116358 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop3) 2025-03-22 22:14:29.117347 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop4) 2025-03-22 22:14:29.117957 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop5) 2025-03-22 22:14:29.118597 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop6) 2025-03-22 22:14:29.118754 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop7) 2025-03-22 22:14:29.122400 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sda) 2025-03-22 22:14:29.125793 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdb) 2025-03-22 22:14:29.129548 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdc) 2025-03-22 22:14:29.130737 | orchestrator | included: 
/ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdd) 2025-03-22 22:14:29.132359 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sr0) 2025-03-22 22:14:29.135631 | orchestrator | 2025-03-22 22:14:29.136527 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:29.137110 | orchestrator | Saturday 22 March 2025 22:14:29 +0000 (0:00:00.467) 0:00:20.506 ******** 2025-03-22 22:14:29.353534 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:29.353675 | orchestrator | 2025-03-22 22:14:29.353697 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:29.353719 | orchestrator | Saturday 22 March 2025 22:14:29 +0000 (0:00:00.238) 0:00:20.745 ******** 2025-03-22 22:14:29.555782 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:29.557469 | orchestrator | 2025-03-22 22:14:29.558817 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:29.560040 | orchestrator | Saturday 22 March 2025 22:14:29 +0000 (0:00:00.201) 0:00:20.947 ******** 2025-03-22 22:14:29.979818 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:29.981335 | orchestrator | 2025-03-22 22:14:29.982267 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:29.983206 | orchestrator | Saturday 22 March 2025 22:14:29 +0000 (0:00:00.426) 0:00:21.373 ******** 2025-03-22 22:14:30.211675 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:30.211826 | orchestrator | 2025-03-22 22:14:30.212228 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:30.214110 | orchestrator | Saturday 22 March 2025 22:14:30 +0000 (0:00:00.231) 0:00:21.604 ******** 2025-03-22 22:14:30.432898 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:30.433562 | orchestrator | 2025-03-22 22:14:30.434367 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:30.434844 | orchestrator | Saturday 22 March 2025 22:14:30 +0000 (0:00:00.221) 0:00:21.826 ******** 2025-03-22 22:14:30.664478 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:30.669856 | orchestrator | 2025-03-22 22:14:30.670589 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:30.670621 | orchestrator | Saturday 22 March 2025 22:14:30 +0000 (0:00:00.229) 0:00:22.056 ******** 2025-03-22 22:14:30.894433 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:30.894871 | orchestrator | 2025-03-22 22:14:30.895800 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:30.897985 | orchestrator | Saturday 22 March 2025 22:14:30 +0000 (0:00:00.231) 0:00:22.287 ******** 2025-03-22 22:14:31.156547 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:31.158267 | orchestrator | 2025-03-22 22:14:31.159443 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:31.161636 | orchestrator | Saturday 22 March 2025 22:14:31 +0000 (0:00:00.258) 0:00:22.545 ******** 2025-03-22 22:14:31.622448 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_d5017ffd-d0d5-431d-84a2-17c0a06b39b8) 2025-03-22 22:14:31.622710 | orchestrator | ok: [testbed-node-4] => 
(item=scsi-SQEMU_QEMU_HARDDISK_d5017ffd-d0d5-431d-84a2-17c0a06b39b8) 2025-03-22 22:14:31.623856 | orchestrator | 2025-03-22 22:14:31.624097 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:31.624955 | orchestrator | Saturday 22 March 2025 22:14:31 +0000 (0:00:00.469) 0:00:23.015 ******** 2025-03-22 22:14:32.074688 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_d6bc4934-4c34-4893-bb71-8a867393eb36) 2025-03-22 22:14:32.075076 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_d6bc4934-4c34-4893-bb71-8a867393eb36) 2025-03-22 22:14:32.076733 | orchestrator | 2025-03-22 22:14:32.077347 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:32.078129 | orchestrator | Saturday 22 March 2025 22:14:32 +0000 (0:00:00.449) 0:00:23.465 ******** 2025-03-22 22:14:32.505421 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_6454730d-d769-486b-8e2e-775b81470741) 2025-03-22 22:14:32.507248 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_6454730d-d769-486b-8e2e-775b81470741) 2025-03-22 22:14:32.508668 | orchestrator | 2025-03-22 22:14:32.510939 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:32.511222 | orchestrator | Saturday 22 March 2025 22:14:32 +0000 (0:00:00.432) 0:00:23.897 ******** 2025-03-22 22:14:33.193968 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_65c63fa5-5e6f-4c5d-b367-79c528cb404f) 2025-03-22 22:14:33.194308 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_65c63fa5-5e6f-4c5d-b367-79c528cb404f) 2025-03-22 22:14:33.194346 | orchestrator | 2025-03-22 22:14:33.194955 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:33.196382 | orchestrator | Saturday 22 March 2025 22:14:33 +0000 (0:00:00.688) 0:00:24.586 ******** 2025-03-22 22:14:34.045471 | orchestrator | ok: [testbed-node-4] => (item=ata-QEMU_DVD-ROM_QM00001) 2025-03-22 22:14:34.046472 | orchestrator | 2025-03-22 22:14:34.046662 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:34.047382 | orchestrator | Saturday 22 March 2025 22:14:34 +0000 (0:00:00.852) 0:00:25.438 ******** 2025-03-22 22:14:34.542071 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop0) 2025-03-22 22:14:34.542263 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop1) 2025-03-22 22:14:34.542717 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop2) 2025-03-22 22:14:34.543285 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop3) 2025-03-22 22:14:34.543945 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop4) 2025-03-22 22:14:34.544619 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop5) 2025-03-22 22:14:34.545093 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop6) 2025-03-22 22:14:34.545850 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop7) 2025-03-22 22:14:34.549275 | orchestrator | included: 
/ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sda) 2025-03-22 22:14:34.550110 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdb) 2025-03-22 22:14:34.550375 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdc) 2025-03-22 22:14:34.551313 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdd) 2025-03-22 22:14:34.551703 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sr0) 2025-03-22 22:14:34.552114 | orchestrator | 2025-03-22 22:14:34.552540 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:34.555385 | orchestrator | Saturday 22 March 2025 22:14:34 +0000 (0:00:00.494) 0:00:25.932 ******** 2025-03-22 22:14:34.782562 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:34.783139 | orchestrator | 2025-03-22 22:14:34.783625 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:34.784599 | orchestrator | Saturday 22 March 2025 22:14:34 +0000 (0:00:00.243) 0:00:26.176 ******** 2025-03-22 22:14:35.025960 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:35.026281 | orchestrator | 2025-03-22 22:14:35.026319 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:35.027005 | orchestrator | Saturday 22 March 2025 22:14:35 +0000 (0:00:00.242) 0:00:26.419 ******** 2025-03-22 22:14:35.307481 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:35.307622 | orchestrator | 2025-03-22 22:14:35.308294 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:35.308803 | orchestrator | Saturday 22 March 2025 22:14:35 +0000 (0:00:00.282) 0:00:26.701 ******** 2025-03-22 22:14:35.552316 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:35.552824 | orchestrator | 2025-03-22 22:14:35.552855 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:35.553263 | orchestrator | Saturday 22 March 2025 22:14:35 +0000 (0:00:00.243) 0:00:26.944 ******** 2025-03-22 22:14:35.765727 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:35.768333 | orchestrator | 2025-03-22 22:14:35.768441 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:35.768477 | orchestrator | Saturday 22 March 2025 22:14:35 +0000 (0:00:00.211) 0:00:27.156 ******** 2025-03-22 22:14:35.994461 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:35.995820 | orchestrator | 2025-03-22 22:14:35.997106 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:35.999246 | orchestrator | Saturday 22 March 2025 22:14:35 +0000 (0:00:00.231) 0:00:27.387 ******** 2025-03-22 22:14:36.222484 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:36.223240 | orchestrator | 2025-03-22 22:14:36.223287 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:36.451879 | orchestrator | Saturday 22 March 2025 22:14:36 +0000 (0:00:00.228) 0:00:27.616 ******** 2025-03-22 22:14:36.452013 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:36.452109 | orchestrator | 2025-03-22 22:14:36.452915 | orchestrator | TASK [Add known 
partitions to the list of available block devices] ************* 2025-03-22 22:14:36.452945 | orchestrator | Saturday 22 March 2025 22:14:36 +0000 (0:00:00.227) 0:00:27.843 ******** 2025-03-22 22:14:37.682714 | orchestrator | ok: [testbed-node-4] => (item=sda1) 2025-03-22 22:14:37.682876 | orchestrator | ok: [testbed-node-4] => (item=sda14) 2025-03-22 22:14:37.684062 | orchestrator | ok: [testbed-node-4] => (item=sda15) 2025-03-22 22:14:37.684946 | orchestrator | ok: [testbed-node-4] => (item=sda16) 2025-03-22 22:14:37.685939 | orchestrator | 2025-03-22 22:14:37.686464 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:37.687376 | orchestrator | Saturday 22 March 2025 22:14:37 +0000 (0:00:01.231) 0:00:29.074 ******** 2025-03-22 22:14:37.948306 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:37.949100 | orchestrator | 2025-03-22 22:14:37.949142 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:37.949518 | orchestrator | Saturday 22 March 2025 22:14:37 +0000 (0:00:00.266) 0:00:29.341 ******** 2025-03-22 22:14:38.203659 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:38.204305 | orchestrator | 2025-03-22 22:14:38.204379 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:38.205419 | orchestrator | Saturday 22 March 2025 22:14:38 +0000 (0:00:00.255) 0:00:29.597 ******** 2025-03-22 22:14:38.436034 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:38.436435 | orchestrator | 2025-03-22 22:14:38.436692 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:38.437313 | orchestrator | Saturday 22 March 2025 22:14:38 +0000 (0:00:00.231) 0:00:29.829 ******** 2025-03-22 22:14:38.665209 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:38.666109 | orchestrator | 2025-03-22 22:14:38.667419 | orchestrator | TASK [Set UUIDs for OSD VGs/LVs] *********************************************** 2025-03-22 22:14:38.891586 | orchestrator | Saturday 22 March 2025 22:14:38 +0000 (0:00:00.228) 0:00:30.057 ******** 2025-03-22 22:14:38.891703 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdb', 'value': None}) 2025-03-22 22:14:38.892006 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdc', 'value': None}) 2025-03-22 22:14:38.892489 | orchestrator | 2025-03-22 22:14:38.893263 | orchestrator | TASK [Generate WAL VG names] *************************************************** 2025-03-22 22:14:38.894650 | orchestrator | Saturday 22 March 2025 22:14:38 +0000 (0:00:00.226) 0:00:30.284 ******** 2025-03-22 22:14:39.038489 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:39.038880 | orchestrator | 2025-03-22 22:14:39.039875 | orchestrator | TASK [Generate DB VG names] **************************************************** 2025-03-22 22:14:39.040710 | orchestrator | Saturday 22 March 2025 22:14:39 +0000 (0:00:00.146) 0:00:30.430 ******** 2025-03-22 22:14:39.197406 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:39.197855 | orchestrator | 2025-03-22 22:14:39.198074 | orchestrator | TASK [Generate shared DB/WAL VG names] ***************************************** 2025-03-22 22:14:39.198413 | orchestrator | Saturday 22 March 2025 22:14:39 +0000 (0:00:00.159) 0:00:30.590 ******** 2025-03-22 22:14:39.357439 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:39.357921 | orchestrator | 2025-03-22 
22:14:39.358337 | orchestrator | TASK [Define lvm_volumes structures] ******************************************* 2025-03-22 22:14:39.359044 | orchestrator | Saturday 22 March 2025 22:14:39 +0000 (0:00:00.159) 0:00:30.750 ******** 2025-03-22 22:14:39.504553 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:14:39.505524 | orchestrator | 2025-03-22 22:14:39.505630 | orchestrator | TASK [Generate lvm_volumes structure (block only)] ***************************** 2025-03-22 22:14:39.505911 | orchestrator | Saturday 22 March 2025 22:14:39 +0000 (0:00:00.147) 0:00:30.898 ******** 2025-03-22 22:14:39.698739 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'c38313fe-ae28-50de-b682-b60e7793463e'}}) 2025-03-22 22:14:39.698975 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'b5bb1ca0-c565-576e-8272-e005b053e8a9'}}) 2025-03-22 22:14:39.700111 | orchestrator | 2025-03-22 22:14:39.700748 | orchestrator | TASK [Generate lvm_volumes structure (block + db)] ***************************** 2025-03-22 22:14:39.701429 | orchestrator | Saturday 22 March 2025 22:14:39 +0000 (0:00:00.194) 0:00:31.092 ******** 2025-03-22 22:14:40.132116 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'c38313fe-ae28-50de-b682-b60e7793463e'}})  2025-03-22 22:14:40.132972 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'b5bb1ca0-c565-576e-8272-e005b053e8a9'}})  2025-03-22 22:14:40.133013 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:40.133775 | orchestrator | 2025-03-22 22:14:40.134247 | orchestrator | TASK [Generate lvm_volumes structure (block + wal)] **************************** 2025-03-22 22:14:40.134278 | orchestrator | Saturday 22 March 2025 22:14:40 +0000 (0:00:00.432) 0:00:31.524 ******** 2025-03-22 22:14:40.319641 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'c38313fe-ae28-50de-b682-b60e7793463e'}})  2025-03-22 22:14:40.320622 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'b5bb1ca0-c565-576e-8272-e005b053e8a9'}})  2025-03-22 22:14:40.321776 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:40.323621 | orchestrator | 2025-03-22 22:14:40.323814 | orchestrator | TASK [Generate lvm_volumes structure (block + db + wal)] *********************** 2025-03-22 22:14:40.324326 | orchestrator | Saturday 22 March 2025 22:14:40 +0000 (0:00:00.185) 0:00:31.710 ******** 2025-03-22 22:14:40.510811 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'c38313fe-ae28-50de-b682-b60e7793463e'}})  2025-03-22 22:14:40.512003 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'b5bb1ca0-c565-576e-8272-e005b053e8a9'}})  2025-03-22 22:14:40.513059 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:40.513402 | orchestrator | 2025-03-22 22:14:40.513430 | orchestrator | TASK [Compile lvm_volumes] ***************************************************** 2025-03-22 22:14:40.514134 | orchestrator | Saturday 22 March 2025 22:14:40 +0000 (0:00:00.194) 0:00:31.904 ******** 2025-03-22 22:14:40.655133 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:14:40.655911 | orchestrator | 2025-03-22 22:14:40.656019 | orchestrator | TASK [Set OSD devices config data] ********************************************* 2025-03-22 22:14:40.656329 | orchestrator | Saturday 22 March 2025 22:14:40 +0000 
(0:00:00.144) 0:00:32.048 ******** 2025-03-22 22:14:40.827276 | orchestrator | ok: [testbed-node-4] 2025-03-22 22:14:40.827406 | orchestrator | 2025-03-22 22:14:40.828449 | orchestrator | TASK [Set DB devices config data] ********************************************** 2025-03-22 22:14:40.830984 | orchestrator | Saturday 22 March 2025 22:14:40 +0000 (0:00:00.170) 0:00:32.219 ******** 2025-03-22 22:14:40.971999 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:40.972658 | orchestrator | 2025-03-22 22:14:40.972699 | orchestrator | TASK [Set WAL devices config data] ********************************************* 2025-03-22 22:14:40.973290 | orchestrator | Saturday 22 March 2025 22:14:40 +0000 (0:00:00.146) 0:00:32.365 ******** 2025-03-22 22:14:41.125717 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:41.127117 | orchestrator | 2025-03-22 22:14:41.127609 | orchestrator | TASK [Set DB+WAL devices config data] ****************************************** 2025-03-22 22:14:41.128791 | orchestrator | Saturday 22 March 2025 22:14:41 +0000 (0:00:00.153) 0:00:32.518 ******** 2025-03-22 22:14:41.295415 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:41.295870 | orchestrator | 2025-03-22 22:14:41.297462 | orchestrator | TASK [Print ceph_osd_devices] ************************************************** 2025-03-22 22:14:41.298914 | orchestrator | Saturday 22 March 2025 22:14:41 +0000 (0:00:00.169) 0:00:32.687 ******** 2025-03-22 22:14:41.445414 | orchestrator | ok: [testbed-node-4] => { 2025-03-22 22:14:41.446128 | orchestrator |  "ceph_osd_devices": { 2025-03-22 22:14:41.447839 | orchestrator |  "sdb": { 2025-03-22 22:14:41.448137 | orchestrator |  "osd_lvm_uuid": "c38313fe-ae28-50de-b682-b60e7793463e" 2025-03-22 22:14:41.448931 | orchestrator |  }, 2025-03-22 22:14:41.449263 | orchestrator |  "sdc": { 2025-03-22 22:14:41.449371 | orchestrator |  "osd_lvm_uuid": "b5bb1ca0-c565-576e-8272-e005b053e8a9" 2025-03-22 22:14:41.450325 | orchestrator |  } 2025-03-22 22:14:41.451094 | orchestrator |  } 2025-03-22 22:14:41.451393 | orchestrator | } 2025-03-22 22:14:41.452108 | orchestrator | 2025-03-22 22:14:41.452518 | orchestrator | TASK [Print WAL devices] ******************************************************* 2025-03-22 22:14:41.453258 | orchestrator | Saturday 22 March 2025 22:14:41 +0000 (0:00:00.149) 0:00:32.836 ******** 2025-03-22 22:14:41.593462 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:41.593687 | orchestrator | 2025-03-22 22:14:41.594496 | orchestrator | TASK [Print DB devices] ******************************************************** 2025-03-22 22:14:41.595012 | orchestrator | Saturday 22 March 2025 22:14:41 +0000 (0:00:00.150) 0:00:32.987 ******** 2025-03-22 22:14:41.744745 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:41.747779 | orchestrator | 2025-03-22 22:14:41.920034 | orchestrator | TASK [Print shared DB/WAL devices] ********************************************* 2025-03-22 22:14:41.920097 | orchestrator | Saturday 22 March 2025 22:14:41 +0000 (0:00:00.148) 0:00:33.135 ******** 2025-03-22 22:14:41.920146 | orchestrator | skipping: [testbed-node-4] 2025-03-22 22:14:41.920841 | orchestrator | 2025-03-22 22:14:41.921137 | orchestrator | TASK [Print configuration data] ************************************************ 2025-03-22 22:14:41.921840 | orchestrator | Saturday 22 March 2025 22:14:41 +0000 (0:00:00.177) 0:00:33.313 ******** 2025-03-22 22:14:42.479538 | orchestrator | changed: [testbed-node-4] => { 2025-03-22 22:14:42.479869 | 
orchestrator |  "_ceph_configure_lvm_config_data": { 2025-03-22 22:14:42.480933 | orchestrator |  "ceph_osd_devices": { 2025-03-22 22:14:42.481265 | orchestrator |  "sdb": { 2025-03-22 22:14:42.482814 | orchestrator |  "osd_lvm_uuid": "c38313fe-ae28-50de-b682-b60e7793463e" 2025-03-22 22:14:42.483267 | orchestrator |  }, 2025-03-22 22:14:42.483298 | orchestrator |  "sdc": { 2025-03-22 22:14:42.483725 | orchestrator |  "osd_lvm_uuid": "b5bb1ca0-c565-576e-8272-e005b053e8a9" 2025-03-22 22:14:42.484704 | orchestrator |  } 2025-03-22 22:14:42.485211 | orchestrator |  }, 2025-03-22 22:14:42.485973 | orchestrator |  "lvm_volumes": [ 2025-03-22 22:14:42.486212 | orchestrator |  { 2025-03-22 22:14:42.486893 | orchestrator |  "data": "osd-block-c38313fe-ae28-50de-b682-b60e7793463e", 2025-03-22 22:14:42.487505 | orchestrator |  "data_vg": "ceph-c38313fe-ae28-50de-b682-b60e7793463e" 2025-03-22 22:14:42.487904 | orchestrator |  }, 2025-03-22 22:14:42.488294 | orchestrator |  { 2025-03-22 22:14:42.488626 | orchestrator |  "data": "osd-block-b5bb1ca0-c565-576e-8272-e005b053e8a9", 2025-03-22 22:14:42.489201 | orchestrator |  "data_vg": "ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9" 2025-03-22 22:14:42.490085 | orchestrator |  } 2025-03-22 22:14:42.490233 | orchestrator |  ] 2025-03-22 22:14:42.490714 | orchestrator |  } 2025-03-22 22:14:42.491450 | orchestrator | } 2025-03-22 22:14:42.492152 | orchestrator | 2025-03-22 22:14:42.492787 | orchestrator | RUNNING HANDLER [Write configuration file] ************************************* 2025-03-22 22:14:42.493010 | orchestrator | Saturday 22 March 2025 22:14:42 +0000 (0:00:00.558) 0:00:33.871 ******** 2025-03-22 22:14:44.167132 | orchestrator | changed: [testbed-node-4 -> testbed-manager(192.168.16.5)] 2025-03-22 22:14:44.167565 | orchestrator | 2025-03-22 22:14:44.168066 | orchestrator | PLAY [Ceph configure LVM] ****************************************************** 2025-03-22 22:14:44.168855 | orchestrator | 2025-03-22 22:14:44.169375 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2025-03-22 22:14:44.169877 | orchestrator | Saturday 22 March 2025 22:14:44 +0000 (0:00:01.688) 0:00:35.560 ******** 2025-03-22 22:14:44.953701 | orchestrator | ok: [testbed-node-5 -> testbed-manager(192.168.16.5)] 2025-03-22 22:14:44.954297 | orchestrator | 2025-03-22 22:14:44.955526 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2025-03-22 22:14:44.956374 | orchestrator | Saturday 22 March 2025 22:14:44 +0000 (0:00:00.786) 0:00:36.346 ******** 2025-03-22 22:14:45.217351 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:14:45.218413 | orchestrator | 2025-03-22 22:14:45.219671 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:45.221099 | orchestrator | Saturday 22 March 2025 22:14:45 +0000 (0:00:00.261) 0:00:36.608 ******** 2025-03-22 22:14:45.674063 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop0) 2025-03-22 22:14:45.674907 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop1) 2025-03-22 22:14:45.675411 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop2) 2025-03-22 22:14:45.676627 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop3) 2025-03-22 22:14:45.677369 | orchestrator | included: /ansible/tasks/_add-device-links.yml for 
testbed-node-5 => (item=loop4) 2025-03-22 22:14:45.678154 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop5) 2025-03-22 22:14:45.679969 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop6) 2025-03-22 22:14:45.680044 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop7) 2025-03-22 22:14:45.680064 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sda) 2025-03-22 22:14:45.680085 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdb) 2025-03-22 22:14:45.680942 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdc) 2025-03-22 22:14:45.681796 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdd) 2025-03-22 22:14:45.682591 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sr0) 2025-03-22 22:14:45.682676 | orchestrator | 2025-03-22 22:14:45.683659 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:45.684460 | orchestrator | Saturday 22 March 2025 22:14:45 +0000 (0:00:00.457) 0:00:37.066 ******** 2025-03-22 22:14:45.895510 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:45.896370 | orchestrator | 2025-03-22 22:14:45.896497 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:45.896524 | orchestrator | Saturday 22 March 2025 22:14:45 +0000 (0:00:00.223) 0:00:37.289 ******** 2025-03-22 22:14:46.111576 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:46.115004 | orchestrator | 2025-03-22 22:14:46.353347 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:46.353438 | orchestrator | Saturday 22 March 2025 22:14:46 +0000 (0:00:00.212) 0:00:37.502 ******** 2025-03-22 22:14:46.353466 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:46.353817 | orchestrator | 2025-03-22 22:14:46.354010 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:46.354908 | orchestrator | Saturday 22 March 2025 22:14:46 +0000 (0:00:00.244) 0:00:37.746 ******** 2025-03-22 22:14:46.573074 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:46.573284 | orchestrator | 2025-03-22 22:14:46.573336 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:46.573386 | orchestrator | Saturday 22 March 2025 22:14:46 +0000 (0:00:00.218) 0:00:37.965 ******** 2025-03-22 22:14:46.789091 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:46.789718 | orchestrator | 2025-03-22 22:14:46.790602 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:46.790941 | orchestrator | Saturday 22 March 2025 22:14:46 +0000 (0:00:00.217) 0:00:38.182 ******** 2025-03-22 22:14:47.032422 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:47.033197 | orchestrator | 2025-03-22 22:14:47.033945 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:47.034670 | orchestrator | Saturday 22 March 2025 22:14:47 +0000 (0:00:00.243) 0:00:38.425 ******** 2025-03-22 22:14:47.242905 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:47.243279 
| orchestrator | 2025-03-22 22:14:47.244087 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:47.245080 | orchestrator | Saturday 22 March 2025 22:14:47 +0000 (0:00:00.209) 0:00:38.635 ******** 2025-03-22 22:14:47.717235 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:47.718007 | orchestrator | 2025-03-22 22:14:47.718287 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:47.718948 | orchestrator | Saturday 22 March 2025 22:14:47 +0000 (0:00:00.473) 0:00:39.109 ******** 2025-03-22 22:14:48.194807 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_b23a3080-52c6-488b-8c5e-d7619a688699) 2025-03-22 22:14:48.195979 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_b23a3080-52c6-488b-8c5e-d7619a688699) 2025-03-22 22:14:48.196363 | orchestrator | 2025-03-22 22:14:48.196876 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:48.197328 | orchestrator | Saturday 22 March 2025 22:14:48 +0000 (0:00:00.479) 0:00:39.588 ******** 2025-03-22 22:14:48.675925 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_681c10dc-f1f8-4703-92fb-54cdfa604000) 2025-03-22 22:14:48.676615 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_681c10dc-f1f8-4703-92fb-54cdfa604000) 2025-03-22 22:14:48.677991 | orchestrator | 2025-03-22 22:14:48.680094 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:49.175740 | orchestrator | Saturday 22 March 2025 22:14:48 +0000 (0:00:00.479) 0:00:40.068 ******** 2025-03-22 22:14:49.175880 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_0f58cf45-cc6c-41c9-84ae-96e36ead1340) 2025-03-22 22:14:49.176481 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_0f58cf45-cc6c-41c9-84ae-96e36ead1340) 2025-03-22 22:14:49.176618 | orchestrator | 2025-03-22 22:14:49.177455 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:49.177882 | orchestrator | Saturday 22 March 2025 22:14:49 +0000 (0:00:00.501) 0:00:40.569 ******** 2025-03-22 22:14:49.649255 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_9647d6dd-7a49-4e0c-bf6a-16b92e91fe66) 2025-03-22 22:14:49.650499 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_9647d6dd-7a49-4e0c-bf6a-16b92e91fe66) 2025-03-22 22:14:49.651523 | orchestrator | 2025-03-22 22:14:49.652217 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 22:14:49.655031 | orchestrator | Saturday 22 March 2025 22:14:49 +0000 (0:00:00.472) 0:00:41.041 ******** 2025-03-22 22:14:50.023279 | orchestrator | ok: [testbed-node-5] => (item=ata-QEMU_DVD-ROM_QM00001) 2025-03-22 22:14:50.023490 | orchestrator | 2025-03-22 22:14:50.024364 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:50.024802 | orchestrator | Saturday 22 March 2025 22:14:50 +0000 (0:00:00.374) 0:00:41.416 ******** 2025-03-22 22:14:50.504883 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop0) 2025-03-22 22:14:50.505524 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop1) 2025-03-22 22:14:50.505563 | orchestrator | 
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop2) 2025-03-22 22:14:50.506228 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop3) 2025-03-22 22:14:50.507164 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop4) 2025-03-22 22:14:50.509861 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop5) 2025-03-22 22:14:50.509947 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop6) 2025-03-22 22:14:50.510116 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop7) 2025-03-22 22:14:50.510143 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sda) 2025-03-22 22:14:50.510226 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdb) 2025-03-22 22:14:50.510243 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdc) 2025-03-22 22:14:50.510263 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdd) 2025-03-22 22:14:50.510944 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sr0) 2025-03-22 22:14:50.511411 | orchestrator | 2025-03-22 22:14:50.512067 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:50.512484 | orchestrator | Saturday 22 March 2025 22:14:50 +0000 (0:00:00.481) 0:00:41.897 ******** 2025-03-22 22:14:50.768622 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:50.769354 | orchestrator | 2025-03-22 22:14:50.770346 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:50.771078 | orchestrator | Saturday 22 March 2025 22:14:50 +0000 (0:00:00.263) 0:00:42.161 ******** 2025-03-22 22:14:51.003139 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:51.004136 | orchestrator | 2025-03-22 22:14:51.004391 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:51.421153 | orchestrator | Saturday 22 March 2025 22:14:50 +0000 (0:00:00.235) 0:00:42.396 ******** 2025-03-22 22:14:51.421308 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:51.421388 | orchestrator | 2025-03-22 22:14:51.421668 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:51.422322 | orchestrator | Saturday 22 March 2025 22:14:51 +0000 (0:00:00.418) 0:00:42.815 ******** 2025-03-22 22:14:51.675552 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:51.676237 | orchestrator | 2025-03-22 22:14:51.676659 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:51.677473 | orchestrator | Saturday 22 March 2025 22:14:51 +0000 (0:00:00.253) 0:00:43.068 ******** 2025-03-22 22:14:51.916853 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:51.916987 | orchestrator | 2025-03-22 22:14:51.918156 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:51.918297 | orchestrator | Saturday 22 March 2025 22:14:51 +0000 (0:00:00.242) 0:00:43.310 ******** 2025-03-22 22:14:52.130409 | orchestrator | skipping: [testbed-node-5] 2025-03-22 
22:14:52.130545 | orchestrator | 2025-03-22 22:14:52.131336 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:52.131370 | orchestrator | Saturday 22 March 2025 22:14:52 +0000 (0:00:00.213) 0:00:43.523 ******** 2025-03-22 22:14:52.352616 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:52.353680 | orchestrator | 2025-03-22 22:14:52.354078 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:52.354241 | orchestrator | Saturday 22 March 2025 22:14:52 +0000 (0:00:00.222) 0:00:43.746 ******** 2025-03-22 22:14:52.562618 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:52.563018 | orchestrator | 2025-03-22 22:14:52.563057 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:52.563368 | orchestrator | Saturday 22 March 2025 22:14:52 +0000 (0:00:00.208) 0:00:43.955 ******** 2025-03-22 22:14:53.280622 | orchestrator | ok: [testbed-node-5] => (item=sda1) 2025-03-22 22:14:53.281815 | orchestrator | ok: [testbed-node-5] => (item=sda14) 2025-03-22 22:14:53.281848 | orchestrator | ok: [testbed-node-5] => (item=sda15) 2025-03-22 22:14:53.282326 | orchestrator | ok: [testbed-node-5] => (item=sda16) 2025-03-22 22:14:53.282944 | orchestrator | 2025-03-22 22:14:53.282973 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:53.511557 | orchestrator | Saturday 22 March 2025 22:14:53 +0000 (0:00:00.718) 0:00:44.673 ******** 2025-03-22 22:14:53.511632 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:53.512622 | orchestrator | 2025-03-22 22:14:53.513495 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:53.514725 | orchestrator | Saturday 22 March 2025 22:14:53 +0000 (0:00:00.230) 0:00:44.904 ******** 2025-03-22 22:14:53.746936 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:53.747958 | orchestrator | 2025-03-22 22:14:53.748627 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:53.749366 | orchestrator | Saturday 22 March 2025 22:14:53 +0000 (0:00:00.235) 0:00:45.140 ******** 2025-03-22 22:14:53.962236 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:53.963483 | orchestrator | 2025-03-22 22:14:53.964475 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 22:14:53.965284 | orchestrator | Saturday 22 March 2025 22:14:53 +0000 (0:00:00.213) 0:00:45.354 ******** 2025-03-22 22:14:54.190712 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:54.190877 | orchestrator | 2025-03-22 22:14:54.191211 | orchestrator | TASK [Set UUIDs for OSD VGs/LVs] *********************************************** 2025-03-22 22:14:54.191395 | orchestrator | Saturday 22 March 2025 22:14:54 +0000 (0:00:00.230) 0:00:45.584 ******** 2025-03-22 22:14:54.637329 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdb', 'value': None}) 2025-03-22 22:14:54.637690 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdc', 'value': None}) 2025-03-22 22:14:54.637720 | orchestrator | 2025-03-22 22:14:54.637735 | orchestrator | TASK [Generate WAL VG names] *************************************************** 2025-03-22 22:14:54.637756 | orchestrator | Saturday 22 March 2025 22:14:54 +0000 (0:00:00.443) 0:00:46.028 ******** 2025-03-22 22:14:54.777747 | 
orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:54.929396 | orchestrator | 2025-03-22 22:14:54.929471 | orchestrator | TASK [Generate DB VG names] **************************************************** 2025-03-22 22:14:54.929488 | orchestrator | Saturday 22 March 2025 22:14:54 +0000 (0:00:00.141) 0:00:46.170 ******** 2025-03-22 22:14:54.929514 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:54.929579 | orchestrator | 2025-03-22 22:14:54.930638 | orchestrator | TASK [Generate shared DB/WAL VG names] ***************************************** 2025-03-22 22:14:54.931502 | orchestrator | Saturday 22 March 2025 22:14:54 +0000 (0:00:00.150) 0:00:46.320 ******** 2025-03-22 22:14:55.092704 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:55.093379 | orchestrator | 2025-03-22 22:14:55.094277 | orchestrator | TASK [Define lvm_volumes structures] ******************************************* 2025-03-22 22:14:55.095296 | orchestrator | Saturday 22 March 2025 22:14:55 +0000 (0:00:00.165) 0:00:46.486 ******** 2025-03-22 22:14:55.274243 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:14:55.275412 | orchestrator | 2025-03-22 22:14:55.276228 | orchestrator | TASK [Generate lvm_volumes structure (block only)] ***************************** 2025-03-22 22:14:55.276753 | orchestrator | Saturday 22 March 2025 22:14:55 +0000 (0:00:00.181) 0:00:46.668 ******** 2025-03-22 22:14:55.482897 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'b06d94ac-41ed-5dd0-ab65-3af10e523950'}}) 2025-03-22 22:14:55.483590 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'f9e87843-61b1-54ad-82c0-8d76e40ccf36'}}) 2025-03-22 22:14:55.484668 | orchestrator | 2025-03-22 22:14:55.485504 | orchestrator | TASK [Generate lvm_volumes structure (block + db)] ***************************** 2025-03-22 22:14:55.486343 | orchestrator | Saturday 22 March 2025 22:14:55 +0000 (0:00:00.208) 0:00:46.876 ******** 2025-03-22 22:14:55.653569 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'b06d94ac-41ed-5dd0-ab65-3af10e523950'}})  2025-03-22 22:14:55.655387 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'f9e87843-61b1-54ad-82c0-8d76e40ccf36'}})  2025-03-22 22:14:55.655821 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:55.658427 | orchestrator | 2025-03-22 22:14:55.829615 | orchestrator | TASK [Generate lvm_volumes structure (block + wal)] **************************** 2025-03-22 22:14:55.829719 | orchestrator | Saturday 22 March 2025 22:14:55 +0000 (0:00:00.169) 0:00:47.046 ******** 2025-03-22 22:14:55.829761 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'b06d94ac-41ed-5dd0-ab65-3af10e523950'}})  2025-03-22 22:14:55.830553 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'f9e87843-61b1-54ad-82c0-8d76e40ccf36'}})  2025-03-22 22:14:55.830607 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:55.830630 | orchestrator | 2025-03-22 22:14:55.830665 | orchestrator | TASK [Generate lvm_volumes structure (block + db + wal)] *********************** 2025-03-22 22:14:55.831447 | orchestrator | Saturday 22 March 2025 22:14:55 +0000 (0:00:00.174) 0:00:47.220 ******** 2025-03-22 22:14:56.031295 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'b06d94ac-41ed-5dd0-ab65-3af10e523950'}})  2025-03-22 22:14:56.167815 
| orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'f9e87843-61b1-54ad-82c0-8d76e40ccf36'}})  2025-03-22 22:14:56.167923 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:56.167943 | orchestrator | 2025-03-22 22:14:56.167958 | orchestrator | TASK [Compile lvm_volumes] ***************************************************** 2025-03-22 22:14:56.167974 | orchestrator | Saturday 22 March 2025 22:14:56 +0000 (0:00:00.202) 0:00:47.423 ******** 2025-03-22 22:14:56.168003 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:14:56.169152 | orchestrator | 2025-03-22 22:14:56.169608 | orchestrator | TASK [Set OSD devices config data] ********************************************* 2025-03-22 22:14:56.172822 | orchestrator | Saturday 22 March 2025 22:14:56 +0000 (0:00:00.137) 0:00:47.560 ******** 2025-03-22 22:14:56.325777 | orchestrator | ok: [testbed-node-5] 2025-03-22 22:14:56.326522 | orchestrator | 2025-03-22 22:14:56.326972 | orchestrator | TASK [Set DB devices config data] ********************************************** 2025-03-22 22:14:56.327603 | orchestrator | Saturday 22 March 2025 22:14:56 +0000 (0:00:00.159) 0:00:47.719 ******** 2025-03-22 22:14:56.470198 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:56.470953 | orchestrator | 2025-03-22 22:14:56.470989 | orchestrator | TASK [Set WAL devices config data] ********************************************* 2025-03-22 22:14:56.471318 | orchestrator | Saturday 22 March 2025 22:14:56 +0000 (0:00:00.143) 0:00:47.863 ******** 2025-03-22 22:14:56.837604 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:56.838550 | orchestrator | 2025-03-22 22:14:56.840115 | orchestrator | TASK [Set DB+WAL devices config data] ****************************************** 2025-03-22 22:14:56.841231 | orchestrator | Saturday 22 March 2025 22:14:56 +0000 (0:00:00.364) 0:00:48.228 ******** 2025-03-22 22:14:56.996381 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:56.999915 | orchestrator | 2025-03-22 22:14:57.145053 | orchestrator | TASK [Print ceph_osd_devices] ************************************************** 2025-03-22 22:14:57.145095 | orchestrator | Saturday 22 March 2025 22:14:56 +0000 (0:00:00.158) 0:00:48.386 ******** 2025-03-22 22:14:57.145119 | orchestrator | ok: [testbed-node-5] => { 2025-03-22 22:14:57.149994 | orchestrator |  "ceph_osd_devices": { 2025-03-22 22:14:57.150076 | orchestrator |  "sdb": { 2025-03-22 22:14:57.152104 | orchestrator |  "osd_lvm_uuid": "b06d94ac-41ed-5dd0-ab65-3af10e523950" 2025-03-22 22:14:57.152131 | orchestrator |  }, 2025-03-22 22:14:57.152153 | orchestrator |  "sdc": { 2025-03-22 22:14:57.153827 | orchestrator |  "osd_lvm_uuid": "f9e87843-61b1-54ad-82c0-8d76e40ccf36" 2025-03-22 22:14:57.154791 | orchestrator |  } 2025-03-22 22:14:57.155398 | orchestrator |  } 2025-03-22 22:14:57.156206 | orchestrator | } 2025-03-22 22:14:57.156798 | orchestrator | 2025-03-22 22:14:57.157977 | orchestrator | TASK [Print WAL devices] ******************************************************* 2025-03-22 22:14:57.158914 | orchestrator | Saturday 22 March 2025 22:14:57 +0000 (0:00:00.150) 0:00:48.537 ******** 2025-03-22 22:14:57.293975 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:57.294646 | orchestrator | 2025-03-22 22:14:57.295284 | orchestrator | TASK [Print DB devices] ******************************************************** 2025-03-22 22:14:57.296930 | orchestrator | Saturday 22 March 2025 22:14:57 +0000 (0:00:00.151) 0:00:48.688 ******** 2025-03-22 
22:14:57.445647 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:57.446508 | orchestrator | 2025-03-22 22:14:57.447898 | orchestrator | TASK [Print shared DB/WAL devices] ********************************************* 2025-03-22 22:14:57.447956 | orchestrator | Saturday 22 March 2025 22:14:57 +0000 (0:00:00.150) 0:00:48.839 ******** 2025-03-22 22:14:57.594077 | orchestrator | skipping: [testbed-node-5] 2025-03-22 22:14:57.595025 | orchestrator | 2025-03-22 22:14:57.596989 | orchestrator | TASK [Print configuration data] ************************************************ 2025-03-22 22:14:57.597196 | orchestrator | Saturday 22 March 2025 22:14:57 +0000 (0:00:00.147) 0:00:48.986 ******** 2025-03-22 22:14:57.893814 | orchestrator | changed: [testbed-node-5] => { 2025-03-22 22:14:57.894561 | orchestrator |  "_ceph_configure_lvm_config_data": { 2025-03-22 22:14:57.894596 | orchestrator |  "ceph_osd_devices": { 2025-03-22 22:14:57.894901 | orchestrator |  "sdb": { 2025-03-22 22:14:57.895293 | orchestrator |  "osd_lvm_uuid": "b06d94ac-41ed-5dd0-ab65-3af10e523950" 2025-03-22 22:14:57.895583 | orchestrator |  }, 2025-03-22 22:14:57.896332 | orchestrator |  "sdc": { 2025-03-22 22:14:57.896723 | orchestrator |  "osd_lvm_uuid": "f9e87843-61b1-54ad-82c0-8d76e40ccf36" 2025-03-22 22:14:57.897575 | orchestrator |  } 2025-03-22 22:14:57.897903 | orchestrator |  }, 2025-03-22 22:14:57.898652 | orchestrator |  "lvm_volumes": [ 2025-03-22 22:14:57.898721 | orchestrator |  { 2025-03-22 22:14:57.899914 | orchestrator |  "data": "osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950", 2025-03-22 22:14:57.900259 | orchestrator |  "data_vg": "ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950" 2025-03-22 22:14:57.901107 | orchestrator |  }, 2025-03-22 22:14:57.901738 | orchestrator |  { 2025-03-22 22:14:57.902158 | orchestrator |  "data": "osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36", 2025-03-22 22:14:57.904411 | orchestrator |  "data_vg": "ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36" 2025-03-22 22:14:57.904759 | orchestrator |  } 2025-03-22 22:14:57.905215 | orchestrator |  ] 2025-03-22 22:14:57.905832 | orchestrator |  } 2025-03-22 22:14:57.906283 | orchestrator | } 2025-03-22 22:14:57.906952 | orchestrator | 2025-03-22 22:14:57.907318 | orchestrator | RUNNING HANDLER [Write configuration file] ************************************* 2025-03-22 22:14:57.907961 | orchestrator | Saturday 22 March 2025 22:14:57 +0000 (0:00:00.299) 0:00:49.286 ******** 2025-03-22 22:14:59.450609 | orchestrator | changed: [testbed-node-5 -> testbed-manager(192.168.16.5)] 2025-03-22 22:14:59.451403 | orchestrator | 2025-03-22 22:14:59.452068 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 22:14:59.452114 | orchestrator | 2025-03-22 22:14:59 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-22 22:14:59.452872 | orchestrator | 2025-03-22 22:14:59 | INFO  | Please wait and do not abort execution. 
2025-03-22 22:14:59.452907 | orchestrator | testbed-node-3 : ok=42  changed=2  unreachable=0 failed=0 skipped=32  rescued=0 ignored=0 2025-03-22 22:14:59.453347 | orchestrator | testbed-node-4 : ok=42  changed=2  unreachable=0 failed=0 skipped=32  rescued=0 ignored=0 2025-03-22 22:14:59.455432 | orchestrator | testbed-node-5 : ok=42  changed=2  unreachable=0 failed=0 skipped=32  rescued=0 ignored=0 2025-03-22 22:14:59.459833 | orchestrator | 2025-03-22 22:14:59.461204 | orchestrator | 2025-03-22 22:14:59.461236 | orchestrator | 2025-03-22 22:14:59.461257 | orchestrator | TASKS RECAP ******************************************************************** 2025-03-22 22:14:59.465758 | orchestrator | Saturday 22 March 2025 22:14:59 +0000 (0:00:01.557) 0:00:50.843 ******** 2025-03-22 22:14:59.467670 | orchestrator | =============================================================================== 2025-03-22 22:14:59.469470 | orchestrator | Write configuration file ------------------------------------------------ 5.68s 2025-03-22 22:14:59.469520 | orchestrator | Add known links to the list of available block devices ------------------ 1.70s 2025-03-22 22:14:59.474758 | orchestrator | Get extra vars for Ceph configuration ----------------------------------- 1.58s 2025-03-22 22:14:59.476040 | orchestrator | Add known partitions to the list of available block devices ------------- 1.47s 2025-03-22 22:14:59.476069 | orchestrator | Add known partitions to the list of available block devices ------------- 1.23s 2025-03-22 22:14:59.476904 | orchestrator | Print configuration data ------------------------------------------------ 1.19s 2025-03-22 22:14:59.479260 | orchestrator | Add known links to the list of available block devices ------------------ 0.98s 2025-03-22 22:14:59.480303 | orchestrator | Get initial list of available block devices ----------------------------- 0.93s 2025-03-22 22:14:59.483344 | orchestrator | Generate lvm_volumes structure (block + db) ----------------------------- 0.91s 2025-03-22 22:14:59.484262 | orchestrator | Set UUIDs for OSD VGs/LVs ----------------------------------------------- 0.90s 2025-03-22 22:14:59.484761 | orchestrator | Add known links to the list of available block devices ------------------ 0.85s 2025-03-22 22:14:59.485340 | orchestrator | Add known partitions to the list of available block devices ------------- 0.83s 2025-03-22 22:14:59.486088 | orchestrator | Add known partitions to the list of available block devices ------------- 0.75s 2025-03-22 22:14:59.486463 | orchestrator | Generate lvm_volumes structure (block only) ----------------------------- 0.73s 2025-03-22 22:14:59.486911 | orchestrator | Set DB+WAL devices config data ------------------------------------------ 0.72s 2025-03-22 22:14:59.487360 | orchestrator | Add known partitions to the list of available block devices ------------- 0.72s 2025-03-22 22:14:59.487846 | orchestrator | Add known links to the list of available block devices ------------------ 0.71s 2025-03-22 22:14:59.488300 | orchestrator | Add known links to the list of available block devices ------------------ 0.69s 2025-03-22 22:14:59.488747 | orchestrator | Generate WAL VG names --------------------------------------------------- 0.68s 2025-03-22 22:14:59.489248 | orchestrator | Set WAL devices config data --------------------------------------------- 0.68s 2025-03-22 22:15:11.818402 | orchestrator | 2025-03-22 22:15:11 | INFO  | Task a4ef5871-bcde-468e-8f19-0dfbb819c45b is running in background. Output coming soon. 
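At this point the "Ceph configure LVM" play has assigned a stable osd_lvm_uuid to each data disk (sdb, sdc) on every node and derived the matching lvm_volumes entries, which the "Write configuration file" handler then persisted via testbed-manager. A minimal sketch of the resulting per-host data, using the values printed above for testbed-node-5; the file path and any surrounding keys are assumptions, since the log only shows the rendered variables, not the written file:

# Sketch of the per-host configuration presumably written by the handler
# (values taken from the "Print configuration data" output for testbed-node-5;
# file location and top-level layout are assumptions, not shown in the log)
ceph_osd_devices:
  sdb:
    osd_lvm_uuid: b06d94ac-41ed-5dd0-ab65-3af10e523950
  sdc:
    osd_lvm_uuid: f9e87843-61b1-54ad-82c0-8d76e40ccf36
lvm_volumes:
  - data: osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950
    data_vg: ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950
  - data: osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36
    data_vg: ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36

Each lvm_volumes entry names the logical volume (data) and its volume group (data_vg), the shape ceph-ansible expects for pre-created LVM OSDs; the ceph-create-lvm-devices task that follows creates exactly these VGs and LVs.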
2025-03-22 23:15:14.598179 | orchestrator | 2025-03-22 23:15:14 | INFO  | Task 0c7467b5-6d56-429e-92f6-9de54e2ae969 (ceph-create-lvm-devices) was prepared for execution. 2025-03-22 23:15:18.229925 | orchestrator | 2025-03-22 23:15:14 | INFO  | It takes a moment until task 0c7467b5-6d56-429e-92f6-9de54e2ae969 (ceph-create-lvm-devices) has been started and output is visible here. 2025-03-22 23:15:18.230076 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.12 2025-03-22 23:15:18.795268 | orchestrator | 2025-03-22 23:15:18.796238 | orchestrator | PLAY [Ceph create LVM devices] ************************************************* 2025-03-22 23:15:18.797230 | orchestrator | 2025-03-22 23:15:18.798617 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2025-03-22 23:15:18.799559 | orchestrator | Saturday 22 March 2025 23:15:18 +0000 (0:00:00.490) 0:00:00.490 ******** 2025-03-22 23:15:19.044168 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] 2025-03-22 23:15:19.044744 | orchestrator | 2025-03-22 23:15:19.044781 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2025-03-22 23:15:19.045038 | orchestrator | Saturday 22 March 2025 23:15:19 +0000 (0:00:00.250) 0:00:00.740 ******** 2025-03-22 23:15:19.317822 | orchestrator | ok: [testbed-node-3] 2025-03-22 23:15:20.154745 | orchestrator | 2025-03-22 23:15:20.154845 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:20.154863 | orchestrator | Saturday 22 March 2025 23:15:19 +0000 (0:00:00.271) 0:00:01.012 ******** 2025-03-22 23:15:20.154890 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop0) 2025-03-22 23:15:20.155214 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop1) 2025-03-22 23:15:20.156646 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop2) 2025-03-22 23:15:20.157297 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop3) 2025-03-22 23:15:20.158959 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop4) 2025-03-22 23:15:20.160434 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop5) 2025-03-22 23:15:20.160594 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop6) 2025-03-22 23:15:20.161768 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop7) 2025-03-22 23:15:20.162747 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sda) 2025-03-22 23:15:20.166927 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdb) 2025-03-22 23:15:20.169311 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdc) 2025-03-22 23:15:20.169376 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdd) 2025-03-22 23:15:20.170459 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sr0) 2025-03-22 23:15:20.172749 | orchestrator | 2025-03-22 23:15:20.173363 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:20.173916 | orchestrator | Saturday 22 March 
2025 23:15:20 +0000 (0:00:00.838) 0:00:01.851 ******** 2025-03-22 23:15:20.385048 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:20.385178 | orchestrator | 2025-03-22 23:15:20.385992 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:20.388461 | orchestrator | Saturday 22 March 2025 23:15:20 +0000 (0:00:00.227) 0:00:02.079 ******** 2025-03-22 23:15:20.635775 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:20.636449 | orchestrator | 2025-03-22 23:15:20.636765 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:20.637518 | orchestrator | Saturday 22 March 2025 23:15:20 +0000 (0:00:00.253) 0:00:02.332 ******** 2025-03-22 23:15:20.844636 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:20.846197 | orchestrator | 2025-03-22 23:15:20.846231 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:20.847250 | orchestrator | Saturday 22 March 2025 23:15:20 +0000 (0:00:00.207) 0:00:02.540 ******** 2025-03-22 23:15:21.042200 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:21.043031 | orchestrator | 2025-03-22 23:15:21.044583 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:21.047490 | orchestrator | Saturday 22 March 2025 23:15:21 +0000 (0:00:00.198) 0:00:02.739 ******** 2025-03-22 23:15:21.270080 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:21.271109 | orchestrator | 2025-03-22 23:15:21.271752 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:21.274670 | orchestrator | Saturday 22 March 2025 23:15:21 +0000 (0:00:00.226) 0:00:02.965 ******** 2025-03-22 23:15:21.491779 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:21.492256 | orchestrator | 2025-03-22 23:15:21.493074 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:21.493401 | orchestrator | Saturday 22 March 2025 23:15:21 +0000 (0:00:00.221) 0:00:03.187 ******** 2025-03-22 23:15:21.711893 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:21.714965 | orchestrator | 2025-03-22 23:15:21.715510 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:21.715810 | orchestrator | Saturday 22 March 2025 23:15:21 +0000 (0:00:00.220) 0:00:03.408 ******** 2025-03-22 23:15:21.915678 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:21.916271 | orchestrator | 2025-03-22 23:15:21.917907 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:21.918236 | orchestrator | Saturday 22 March 2025 23:15:21 +0000 (0:00:00.203) 0:00:03.611 ******** 2025-03-22 23:15:22.612629 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_f0064d2e-b937-487e-87eb-c5dccc0148b3) 2025-03-22 23:15:22.613086 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_f0064d2e-b937-487e-87eb-c5dccc0148b3) 2025-03-22 23:15:22.615154 | orchestrator | 2025-03-22 23:15:22.617209 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:22.618438 | orchestrator | Saturday 22 March 2025 23:15:22 +0000 (0:00:00.697) 0:00:04.308 ******** 2025-03-22 23:15:23.323883 | orchestrator | ok: [testbed-node-3] => 
(item=scsi-0QEMU_QEMU_HARDDISK_637f4fb2-465d-4aa6-a08d-716b8ef59fde) 2025-03-22 23:15:23.325772 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_637f4fb2-465d-4aa6-a08d-716b8ef59fde) 2025-03-22 23:15:23.328548 | orchestrator | 2025-03-22 23:15:23.781588 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:23.781688 | orchestrator | Saturday 22 March 2025 23:15:23 +0000 (0:00:00.712) 0:00:05.020 ******** 2025-03-22 23:15:23.781719 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_961657b8-7922-4be7-b7ea-8a6546d88057) 2025-03-22 23:15:23.782980 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_961657b8-7922-4be7-b7ea-8a6546d88057) 2025-03-22 23:15:23.784137 | orchestrator | 2025-03-22 23:15:23.784917 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:23.785573 | orchestrator | Saturday 22 March 2025 23:15:23 +0000 (0:00:00.457) 0:00:05.478 ******** 2025-03-22 23:15:24.289021 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_ffd44099-c386-47c0-8dc0-30cf9a71e0b5) 2025-03-22 23:15:24.289464 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_ffd44099-c386-47c0-8dc0-30cf9a71e0b5) 2025-03-22 23:15:24.289543 | orchestrator | 2025-03-22 23:15:24.290269 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:24.290575 | orchestrator | Saturday 22 March 2025 23:15:24 +0000 (0:00:00.508) 0:00:05.986 ******** 2025-03-22 23:15:24.657294 | orchestrator | ok: [testbed-node-3] => (item=ata-QEMU_DVD-ROM_QM00001) 2025-03-22 23:15:24.657744 | orchestrator | 2025-03-22 23:15:24.658881 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:24.659296 | orchestrator | Saturday 22 March 2025 23:15:24 +0000 (0:00:00.365) 0:00:06.352 ******** 2025-03-22 23:15:25.166090 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop0) 2025-03-22 23:15:25.166766 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop1) 2025-03-22 23:15:25.166990 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop2) 2025-03-22 23:15:25.168603 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop3) 2025-03-22 23:15:25.169417 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop4) 2025-03-22 23:15:25.170101 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop5) 2025-03-22 23:15:25.170925 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop6) 2025-03-22 23:15:25.171630 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop7) 2025-03-22 23:15:25.172136 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sda) 2025-03-22 23:15:25.172776 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdb) 2025-03-22 23:15:25.174379 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdc) 2025-03-22 23:15:25.175024 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => 
(item=sdd) 2025-03-22 23:15:25.175994 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sr0) 2025-03-22 23:15:25.176215 | orchestrator | 2025-03-22 23:15:25.177166 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:25.408818 | orchestrator | Saturday 22 March 2025 23:15:25 +0000 (0:00:00.510) 0:00:06.862 ******** 2025-03-22 23:15:25.408962 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:25.409840 | orchestrator | 2025-03-22 23:15:25.410582 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:25.411352 | orchestrator | Saturday 22 March 2025 23:15:25 +0000 (0:00:00.242) 0:00:07.104 ******** 2025-03-22 23:15:25.626241 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:25.626588 | orchestrator | 2025-03-22 23:15:25.626624 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:25.627363 | orchestrator | Saturday 22 March 2025 23:15:25 +0000 (0:00:00.218) 0:00:07.323 ******** 2025-03-22 23:15:25.836225 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:25.836338 | orchestrator | 2025-03-22 23:15:25.837287 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:25.838431 | orchestrator | Saturday 22 March 2025 23:15:25 +0000 (0:00:00.209) 0:00:07.532 ******** 2025-03-22 23:15:26.081044 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:26.081794 | orchestrator | 2025-03-22 23:15:26.082704 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:26.083394 | orchestrator | Saturday 22 March 2025 23:15:26 +0000 (0:00:00.244) 0:00:07.777 ******** 2025-03-22 23:15:26.718080 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:26.719142 | orchestrator | 2025-03-22 23:15:26.721575 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:26.919362 | orchestrator | Saturday 22 March 2025 23:15:26 +0000 (0:00:00.636) 0:00:08.413 ******** 2025-03-22 23:15:26.919520 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:26.920504 | orchestrator | 2025-03-22 23:15:26.922382 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:26.922948 | orchestrator | Saturday 22 March 2025 23:15:26 +0000 (0:00:00.200) 0:00:08.614 ******** 2025-03-22 23:15:27.135726 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:27.136566 | orchestrator | 2025-03-22 23:15:27.137402 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:27.138573 | orchestrator | Saturday 22 March 2025 23:15:27 +0000 (0:00:00.218) 0:00:08.832 ******** 2025-03-22 23:15:27.343870 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:27.343985 | orchestrator | 2025-03-22 23:15:27.344387 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:27.344414 | orchestrator | Saturday 22 March 2025 23:15:27 +0000 (0:00:00.208) 0:00:09.040 ******** 2025-03-22 23:15:28.125934 | orchestrator | ok: [testbed-node-3] => (item=sda1) 2025-03-22 23:15:28.126887 | orchestrator | ok: [testbed-node-3] => (item=sda14) 2025-03-22 23:15:28.128399 | orchestrator | ok: [testbed-node-3] => (item=sda15) 2025-03-22 23:15:28.128566 | 
orchestrator | ok: [testbed-node-3] => (item=sda16) 2025-03-22 23:15:28.128818 | orchestrator | 2025-03-22 23:15:28.128849 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:28.129697 | orchestrator | Saturday 22 March 2025 23:15:28 +0000 (0:00:00.781) 0:00:09.822 ******** 2025-03-22 23:15:28.351877 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:28.352752 | orchestrator | 2025-03-22 23:15:28.353375 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:28.353433 | orchestrator | Saturday 22 March 2025 23:15:28 +0000 (0:00:00.227) 0:00:10.049 ******** 2025-03-22 23:15:28.554190 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:28.554700 | orchestrator | 2025-03-22 23:15:28.555126 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:28.555595 | orchestrator | Saturday 22 March 2025 23:15:28 +0000 (0:00:00.200) 0:00:10.249 ******** 2025-03-22 23:15:28.778112 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:28.780214 | orchestrator | 2025-03-22 23:15:28.780844 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:28.781716 | orchestrator | Saturday 22 March 2025 23:15:28 +0000 (0:00:00.223) 0:00:10.473 ******** 2025-03-22 23:15:29.033892 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:29.035233 | orchestrator | 2025-03-22 23:15:29.036432 | orchestrator | TASK [Check whether ceph_db_wal_devices is used exclusively] ******************* 2025-03-22 23:15:29.037506 | orchestrator | Saturday 22 March 2025 23:15:29 +0000 (0:00:00.256) 0:00:10.729 ******** 2025-03-22 23:15:29.197446 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:29.197634 | orchestrator | 2025-03-22 23:15:29.198457 | orchestrator | TASK [Create dict of block VGs -> PVs from ceph_osd_devices] ******************* 2025-03-22 23:15:29.198790 | orchestrator | Saturday 22 March 2025 23:15:29 +0000 (0:00:00.164) 0:00:10.894 ******** 2025-03-22 23:15:29.639516 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '43a113f0-cd75-588c-85b3-7699e063bb3b'}}) 2025-03-22 23:15:29.641005 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'e04c708b-d456-5156-8bd3-75c09a375fc5'}}) 2025-03-22 23:15:29.641035 | orchestrator | 2025-03-22 23:15:29.641995 | orchestrator | TASK [Create block VGs] ******************************************************** 2025-03-22 23:15:29.643276 | orchestrator | Saturday 22 March 2025 23:15:29 +0000 (0:00:00.437) 0:00:11.331 ******** 2025-03-22 23:15:32.062374 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-43a113f0-cd75-588c-85b3-7699e063bb3b', 'data_vg': 'ceph-43a113f0-cd75-588c-85b3-7699e063bb3b'}) 2025-03-22 23:15:32.063503 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-e04c708b-d456-5156-8bd3-75c09a375fc5', 'data_vg': 'ceph-e04c708b-d456-5156-8bd3-75c09a375fc5'}) 2025-03-22 23:15:32.064337 | orchestrator | 2025-03-22 23:15:32.064370 | orchestrator | TASK [Print 'Create block VGs'] ************************************************ 2025-03-22 23:15:32.064701 | orchestrator | Saturday 22 March 2025 23:15:32 +0000 (0:00:02.426) 0:00:13.758 ******** 2025-03-22 23:15:32.247144 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-43a113f0-cd75-588c-85b3-7699e063bb3b', 'data_vg': 
'ceph-43a113f0-cd75-588c-85b3-7699e063bb3b'})  2025-03-22 23:15:32.249069 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e04c708b-d456-5156-8bd3-75c09a375fc5', 'data_vg': 'ceph-e04c708b-d456-5156-8bd3-75c09a375fc5'})  2025-03-22 23:15:32.249458 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:32.250710 | orchestrator | 2025-03-22 23:15:32.251603 | orchestrator | TASK [Create block LVs] ******************************************************** 2025-03-22 23:15:32.252381 | orchestrator | Saturday 22 March 2025 23:15:32 +0000 (0:00:00.183) 0:00:13.942 ******** 2025-03-22 23:15:33.806707 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-43a113f0-cd75-588c-85b3-7699e063bb3b', 'data_vg': 'ceph-43a113f0-cd75-588c-85b3-7699e063bb3b'}) 2025-03-22 23:15:33.807059 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-e04c708b-d456-5156-8bd3-75c09a375fc5', 'data_vg': 'ceph-e04c708b-d456-5156-8bd3-75c09a375fc5'}) 2025-03-22 23:15:33.807828 | orchestrator | 2025-03-22 23:15:33.808837 | orchestrator | TASK [Print 'Create block LVs'] ************************************************ 2025-03-22 23:15:33.810624 | orchestrator | Saturday 22 March 2025 23:15:33 +0000 (0:00:01.559) 0:00:15.501 ******** 2025-03-22 23:15:33.994935 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-43a113f0-cd75-588c-85b3-7699e063bb3b', 'data_vg': 'ceph-43a113f0-cd75-588c-85b3-7699e063bb3b'})  2025-03-22 23:15:33.995898 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e04c708b-d456-5156-8bd3-75c09a375fc5', 'data_vg': 'ceph-e04c708b-d456-5156-8bd3-75c09a375fc5'})  2025-03-22 23:15:33.996246 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:33.997246 | orchestrator | 2025-03-22 23:15:33.998089 | orchestrator | TASK [Create DB VGs] *********************************************************** 2025-03-22 23:15:33.998560 | orchestrator | Saturday 22 March 2025 23:15:33 +0000 (0:00:00.190) 0:00:15.692 ******** 2025-03-22 23:15:34.168751 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:34.169118 | orchestrator | 2025-03-22 23:15:34.169683 | orchestrator | TASK [Print 'Create DB VGs'] *************************************************** 2025-03-22 23:15:34.170148 | orchestrator | Saturday 22 March 2025 23:15:34 +0000 (0:00:00.172) 0:00:15.864 ******** 2025-03-22 23:15:34.348210 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-43a113f0-cd75-588c-85b3-7699e063bb3b', 'data_vg': 'ceph-43a113f0-cd75-588c-85b3-7699e063bb3b'})  2025-03-22 23:15:34.350227 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e04c708b-d456-5156-8bd3-75c09a375fc5', 'data_vg': 'ceph-e04c708b-d456-5156-8bd3-75c09a375fc5'})  2025-03-22 23:15:34.350310 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:34.350329 | orchestrator | 2025-03-22 23:15:34.350345 | orchestrator | TASK [Create WAL VGs] ********************************************************** 2025-03-22 23:15:34.350362 | orchestrator | Saturday 22 March 2025 23:15:34 +0000 (0:00:00.181) 0:00:16.045 ******** 2025-03-22 23:15:34.490315 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:34.491094 | orchestrator | 2025-03-22 23:15:34.491124 | orchestrator | TASK [Print 'Create WAL VGs'] ************************************************** 2025-03-22 23:15:34.658722 | orchestrator | Saturday 22 March 2025 23:15:34 +0000 (0:00:00.140) 0:00:16.186 ******** 2025-03-22 23:15:34.658769 | orchestrator | skipping: 
[testbed-node-3] => (item={'data': 'osd-block-43a113f0-cd75-588c-85b3-7699e063bb3b', 'data_vg': 'ceph-43a113f0-cd75-588c-85b3-7699e063bb3b'})  2025-03-22 23:15:34.659756 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e04c708b-d456-5156-8bd3-75c09a375fc5', 'data_vg': 'ceph-e04c708b-d456-5156-8bd3-75c09a375fc5'})  2025-03-22 23:15:34.659780 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:34.659799 | orchestrator | 2025-03-22 23:15:34.660152 | orchestrator | TASK [Create DB+WAL VGs] ******************************************************* 2025-03-22 23:15:34.660634 | orchestrator | Saturday 22 March 2025 23:15:34 +0000 (0:00:00.169) 0:00:16.355 ******** 2025-03-22 23:15:35.009829 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:35.010721 | orchestrator | 2025-03-22 23:15:35.011704 | orchestrator | TASK [Print 'Create DB+WAL VGs'] *********************************************** 2025-03-22 23:15:35.012699 | orchestrator | Saturday 22 March 2025 23:15:35 +0000 (0:00:00.350) 0:00:16.706 ******** 2025-03-22 23:15:35.202078 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-43a113f0-cd75-588c-85b3-7699e063bb3b', 'data_vg': 'ceph-43a113f0-cd75-588c-85b3-7699e063bb3b'})  2025-03-22 23:15:35.204345 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e04c708b-d456-5156-8bd3-75c09a375fc5', 'data_vg': 'ceph-e04c708b-d456-5156-8bd3-75c09a375fc5'})  2025-03-22 23:15:35.204812 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:35.206125 | orchestrator | 2025-03-22 23:15:35.206921 | orchestrator | TASK [Prepare variables for OSD count check] *********************************** 2025-03-22 23:15:35.207603 | orchestrator | Saturday 22 March 2025 23:15:35 +0000 (0:00:00.190) 0:00:16.897 ******** 2025-03-22 23:15:35.366460 | orchestrator | ok: [testbed-node-3] 2025-03-22 23:15:35.366917 | orchestrator | 2025-03-22 23:15:35.367925 | orchestrator | TASK [Count OSDs put on ceph_db_devices defined in lvm_volumes] **************** 2025-03-22 23:15:35.368540 | orchestrator | Saturday 22 March 2025 23:15:35 +0000 (0:00:00.164) 0:00:17.061 ******** 2025-03-22 23:15:35.554173 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-43a113f0-cd75-588c-85b3-7699e063bb3b', 'data_vg': 'ceph-43a113f0-cd75-588c-85b3-7699e063bb3b'})  2025-03-22 23:15:35.555377 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e04c708b-d456-5156-8bd3-75c09a375fc5', 'data_vg': 'ceph-e04c708b-d456-5156-8bd3-75c09a375fc5'})  2025-03-22 23:15:35.556437 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:35.556789 | orchestrator | 2025-03-22 23:15:35.557825 | orchestrator | TASK [Count OSDs put on ceph_wal_devices defined in lvm_volumes] *************** 2025-03-22 23:15:35.558535 | orchestrator | Saturday 22 March 2025 23:15:35 +0000 (0:00:00.186) 0:00:17.248 ******** 2025-03-22 23:15:35.717257 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-43a113f0-cd75-588c-85b3-7699e063bb3b', 'data_vg': 'ceph-43a113f0-cd75-588c-85b3-7699e063bb3b'})  2025-03-22 23:15:35.718706 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e04c708b-d456-5156-8bd3-75c09a375fc5', 'data_vg': 'ceph-e04c708b-d456-5156-8bd3-75c09a375fc5'})  2025-03-22 23:15:35.719453 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:35.720384 | orchestrator | 2025-03-22 23:15:35.721588 | orchestrator | TASK [Count OSDs put on ceph_db_wal_devices defined in lvm_volumes] ************ 2025-03-22 
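The three "Count OSDs put on ..." tasks are skipped in this run because the generated lvm_volumes entries only carry data and data_vg, i.e. the OSDs have neither a separate DB nor a separate WAL device; accordingly, the dictionaries printed a little further down (_num_osds_wanted_per_db_vg and friends) stay empty. With dedicated DB devices each entry would also reference a DB VG, and the role tallies how many OSDs land on each one before comparing against num_osds. A rough sketch of that tally, with the db_vg key being hypothetical for this deployment:

# Sketch only: count lvm_volumes entries per DB VG.
- name: Count OSDs put on ceph_db_devices defined in lvm_volumes
  ansible.builtin.set_fact:
    _num_osds_wanted_per_db_vg: >-
      {{ _num_osds_wanted_per_db_vg | default({})
         | combine({item.db_vg: 1 + (_num_osds_wanted_per_db_vg | default({})).get(item.db_vg, 0)}) }}
  loop: "{{ lvm_volumes }}"
  when: item.db_vg is defined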
23:15:35.722416 | orchestrator | Saturday 22 March 2025 23:15:35 +0000 (0:00:00.165) 0:00:17.413 ******** 2025-03-22 23:15:35.891622 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-43a113f0-cd75-588c-85b3-7699e063bb3b', 'data_vg': 'ceph-43a113f0-cd75-588c-85b3-7699e063bb3b'})  2025-03-22 23:15:35.891765 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e04c708b-d456-5156-8bd3-75c09a375fc5', 'data_vg': 'ceph-e04c708b-d456-5156-8bd3-75c09a375fc5'})  2025-03-22 23:15:35.892049 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:35.892574 | orchestrator | 2025-03-22 23:15:35.893140 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB VG] ********************* 2025-03-22 23:15:35.893791 | orchestrator | Saturday 22 March 2025 23:15:35 +0000 (0:00:00.174) 0:00:17.588 ******** 2025-03-22 23:15:36.038916 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:36.039573 | orchestrator | 2025-03-22 23:15:36.039609 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a WAL VG] ******************** 2025-03-22 23:15:36.040345 | orchestrator | Saturday 22 March 2025 23:15:36 +0000 (0:00:00.147) 0:00:17.735 ******** 2025-03-22 23:15:36.189713 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:36.190387 | orchestrator | 2025-03-22 23:15:36.191843 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB+WAL VG] ***************** 2025-03-22 23:15:36.196955 | orchestrator | Saturday 22 March 2025 23:15:36 +0000 (0:00:00.149) 0:00:17.885 ******** 2025-03-22 23:15:36.357013 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:36.357260 | orchestrator | 2025-03-22 23:15:36.357293 | orchestrator | TASK [Print number of OSDs wanted per DB VG] *********************************** 2025-03-22 23:15:36.357583 | orchestrator | Saturday 22 March 2025 23:15:36 +0000 (0:00:00.167) 0:00:18.052 ******** 2025-03-22 23:15:36.513958 | orchestrator | ok: [testbed-node-3] => { 2025-03-22 23:15:36.514494 | orchestrator |  "_num_osds_wanted_per_db_vg": {} 2025-03-22 23:15:36.514724 | orchestrator | } 2025-03-22 23:15:36.515350 | orchestrator | 2025-03-22 23:15:36.515994 | orchestrator | TASK [Print number of OSDs wanted per WAL VG] ********************************** 2025-03-22 23:15:36.516259 | orchestrator | Saturday 22 March 2025 23:15:36 +0000 (0:00:00.158) 0:00:18.211 ******** 2025-03-22 23:15:36.655780 | orchestrator | ok: [testbed-node-3] => { 2025-03-22 23:15:36.655904 | orchestrator |  "_num_osds_wanted_per_wal_vg": {} 2025-03-22 23:15:36.656697 | orchestrator | } 2025-03-22 23:15:36.657117 | orchestrator | 2025-03-22 23:15:36.657619 | orchestrator | TASK [Print number of OSDs wanted per DB+WAL VG] ******************************* 2025-03-22 23:15:36.657882 | orchestrator | Saturday 22 March 2025 23:15:36 +0000 (0:00:00.141) 0:00:18.352 ******** 2025-03-22 23:15:37.059794 | orchestrator | ok: [testbed-node-3] => { 2025-03-22 23:15:37.060230 | orchestrator |  "_num_osds_wanted_per_db_wal_vg": {} 2025-03-22 23:15:37.061141 | orchestrator | } 2025-03-22 23:15:37.061806 | orchestrator | 2025-03-22 23:15:37.062097 | orchestrator | TASK [Gather DB VGs with total and available size in bytes] ******************** 2025-03-22 23:15:37.062729 | orchestrator | Saturday 22 March 2025 23:15:37 +0000 (0:00:00.404) 0:00:18.756 ******** 2025-03-22 23:15:37.825691 | orchestrator | ok: [testbed-node-3] 2025-03-22 23:15:37.826462 | orchestrator | 2025-03-22 23:15:37.829594 | orchestrator | TASK [Gather WAL VGs with 
total and available size in bytes] ******************* 2025-03-22 23:15:38.393345 | orchestrator | Saturday 22 March 2025 23:15:37 +0000 (0:00:00.764) 0:00:19.521 ******** 2025-03-22 23:15:38.393442 | orchestrator | ok: [testbed-node-3] 2025-03-22 23:15:38.394005 | orchestrator | 2025-03-22 23:15:38.952638 | orchestrator | TASK [Gather DB+WAL VGs with total and available size in bytes] **************** 2025-03-22 23:15:38.953467 | orchestrator | Saturday 22 March 2025 23:15:38 +0000 (0:00:00.567) 0:00:20.088 ******** 2025-03-22 23:15:38.953547 | orchestrator | ok: [testbed-node-3] 2025-03-22 23:15:38.953783 | orchestrator | 2025-03-22 23:15:38.954194 | orchestrator | TASK [Combine JSON from _db/wal/db_wal_vgs_cmd_output] ************************* 2025-03-22 23:15:38.955321 | orchestrator | Saturday 22 March 2025 23:15:38 +0000 (0:00:00.560) 0:00:20.649 ******** 2025-03-22 23:15:39.146121 | orchestrator | ok: [testbed-node-3] 2025-03-22 23:15:39.147076 | orchestrator | 2025-03-22 23:15:39.147162 | orchestrator | TASK [Calculate VG sizes (without buffer)] ************************************* 2025-03-22 23:15:39.147393 | orchestrator | Saturday 22 March 2025 23:15:39 +0000 (0:00:00.194) 0:00:20.843 ******** 2025-03-22 23:15:39.270219 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:39.270380 | orchestrator | 2025-03-22 23:15:39.271025 | orchestrator | TASK [Calculate VG sizes (with buffer)] **************************************** 2025-03-22 23:15:39.271530 | orchestrator | Saturday 22 March 2025 23:15:39 +0000 (0:00:00.122) 0:00:20.966 ******** 2025-03-22 23:15:39.377452 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:39.377666 | orchestrator | 2025-03-22 23:15:39.378694 | orchestrator | TASK [Print LVM VGs report data] *********************************************** 2025-03-22 23:15:39.379146 | orchestrator | Saturday 22 March 2025 23:15:39 +0000 (0:00:00.108) 0:00:21.074 ******** 2025-03-22 23:15:39.541585 | orchestrator | ok: [testbed-node-3] => { 2025-03-22 23:15:39.541986 | orchestrator |  "vgs_report": { 2025-03-22 23:15:39.543151 | orchestrator |  "vg": [] 2025-03-22 23:15:39.543837 | orchestrator |  } 2025-03-22 23:15:39.544728 | orchestrator | } 2025-03-22 23:15:39.545218 | orchestrator | 2025-03-22 23:15:39.548091 | orchestrator | TASK [Print LVM VG sizes] ****************************************************** 2025-03-22 23:15:39.673814 | orchestrator | Saturday 22 March 2025 23:15:39 +0000 (0:00:00.164) 0:00:21.239 ******** 2025-03-22 23:15:39.673887 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:39.674605 | orchestrator | 2025-03-22 23:15:39.675889 | orchestrator | TASK [Calculate size needed for LVs on ceph_db_devices] ************************ 2025-03-22 23:15:39.676949 | orchestrator | Saturday 22 March 2025 23:15:39 +0000 (0:00:00.130) 0:00:21.369 ******** 2025-03-22 23:15:39.820338 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:39.820718 | orchestrator | 2025-03-22 23:15:39.821600 | orchestrator | TASK [Print size needed for LVs on ceph_db_devices] **************************** 2025-03-22 23:15:39.823645 | orchestrator | Saturday 22 March 2025 23:15:39 +0000 (0:00:00.147) 0:00:21.516 ******** 2025-03-22 23:15:39.960104 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:39.960266 | orchestrator | 2025-03-22 23:15:39.961594 | orchestrator | TASK [Fail if size of DB LVs on ceph_db_devices > available] ******************* 2025-03-22 23:15:39.961996 | orchestrator | Saturday 22 March 2025 23:15:39 +0000 (0:00:00.140) 
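The "Gather ... VGs with total and available size in bytes" tasks and the combine step feed the vgs_report printed above, whose vg list is empty here because no dedicated DB/WAL volume groups exist on this node. One plausible way to collect that data is the JSON report of vgs; the exact command line and the filtering down to DB/WAL VGs are not visible in the log, so treat this as a sketch:

# Sketch only: query LVM for per-VG total and free bytes as JSON.
- name: Gather DB VGs with total and available size in bytes
  ansible.builtin.command:
    cmd: vgs --units b --nosuffix --reportformat json -o vg_name,vg_size,vg_free
  register: _db_vgs_cmd_output
  changed_when: false

- name: Combine JSON from _db/wal/db_wal_vgs_cmd_output
  ansible.builtin.set_fact:
    # vgs prints {"report": [{"vg": [...]}]}; keeping the inner object gives
    # the {"vg": [...]} shape that the log shows as vgs_report.
    vgs_report: "{{ (_db_vgs_cmd_output.stdout | from_json).report[0] }}"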
0:00:21.656 ******** 2025-03-22 23:15:40.352620 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:40.354156 | orchestrator | 2025-03-22 23:15:40.354845 | orchestrator | TASK [Calculate size needed for LVs on ceph_wal_devices] *********************** 2025-03-22 23:15:40.358167 | orchestrator | Saturday 22 March 2025 23:15:40 +0000 (0:00:00.390) 0:00:22.047 ******** 2025-03-22 23:15:40.536447 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:40.539029 | orchestrator | 2025-03-22 23:15:40.675607 | orchestrator | TASK [Print size needed for LVs on ceph_wal_devices] *************************** 2025-03-22 23:15:40.675644 | orchestrator | Saturday 22 March 2025 23:15:40 +0000 (0:00:00.183) 0:00:22.231 ******** 2025-03-22 23:15:40.675664 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:40.677075 | orchestrator | 2025-03-22 23:15:40.678282 | orchestrator | TASK [Fail if size of WAL LVs on ceph_wal_devices > available] ***************** 2025-03-22 23:15:40.679050 | orchestrator | Saturday 22 March 2025 23:15:40 +0000 (0:00:00.141) 0:00:22.372 ******** 2025-03-22 23:15:40.831844 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:40.834220 | orchestrator | 2025-03-22 23:15:40.835186 | orchestrator | TASK [Calculate size needed for WAL LVs on ceph_db_wal_devices] **************** 2025-03-22 23:15:40.835991 | orchestrator | Saturday 22 March 2025 23:15:40 +0000 (0:00:00.156) 0:00:22.528 ******** 2025-03-22 23:15:40.997671 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:40.997758 | orchestrator | 2025-03-22 23:15:40.998722 | orchestrator | TASK [Print size needed for WAL LVs on ceph_db_wal_devices] ******************** 2025-03-22 23:15:40.999123 | orchestrator | Saturday 22 March 2025 23:15:40 +0000 (0:00:00.166) 0:00:22.695 ******** 2025-03-22 23:15:41.138841 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:41.139032 | orchestrator | 2025-03-22 23:15:41.140054 | orchestrator | TASK [Calculate size needed for DB LVs on ceph_db_wal_devices] ***************** 2025-03-22 23:15:41.140086 | orchestrator | Saturday 22 March 2025 23:15:41 +0000 (0:00:00.140) 0:00:22.835 ******** 2025-03-22 23:15:41.316029 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:41.316895 | orchestrator | 2025-03-22 23:15:41.318263 | orchestrator | TASK [Print size needed for DB LVs on ceph_db_wal_devices] ********************* 2025-03-22 23:15:41.319037 | orchestrator | Saturday 22 March 2025 23:15:41 +0000 (0:00:00.175) 0:00:23.011 ******** 2025-03-22 23:15:41.464673 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:41.464777 | orchestrator | 2025-03-22 23:15:41.465380 | orchestrator | TASK [Fail if size of DB+WAL LVs on ceph_db_wal_devices > available] *********** 2025-03-22 23:15:41.466615 | orchestrator | Saturday 22 March 2025 23:15:41 +0000 (0:00:00.150) 0:00:23.161 ******** 2025-03-22 23:15:41.603972 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:41.604429 | orchestrator | 2025-03-22 23:15:41.605238 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_devices] ************************* 2025-03-22 23:15:41.605791 | orchestrator | Saturday 22 March 2025 23:15:41 +0000 (0:00:00.139) 0:00:23.300 ******** 2025-03-22 23:15:41.759111 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:41.759944 | orchestrator | 2025-03-22 23:15:41.760808 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_wal_devices] ********************* 2025-03-22 23:15:41.761380 | orchestrator | Saturday 22 March 2025 23:15:41 
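The "< 30 GiB" guards around this point are also skipped, but they document a lower bound the role enforces for BlueStore DB logical volumes before creating them. A hedged sketch of such a check; _db_lv_size_bytes is an illustrative variable, not one taken from the log:

# Sketch only: refuse DB LVs smaller than 30 GiB.
- name: Fail if DB LV size < 30 GiB for ceph_db_devices
  ansible.builtin.assert:
    that:
      - _db_lv_size_bytes | int >= 30 * 1024 * 1024 * 1024
    fail_msg: "DB LVs must be at least 30 GiB"
  when: _db_lv_size_bytes is defined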
+0000 (0:00:00.153) 0:00:23.454 ******** 2025-03-22 23:15:41.907956 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:41.908157 | orchestrator | 2025-03-22 23:15:41.909382 | orchestrator | TASK [Create DB LVs for ceph_db_devices] *************************************** 2025-03-22 23:15:41.909599 | orchestrator | Saturday 22 March 2025 23:15:41 +0000 (0:00:00.150) 0:00:23.604 ******** 2025-03-22 23:15:42.344733 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-43a113f0-cd75-588c-85b3-7699e063bb3b', 'data_vg': 'ceph-43a113f0-cd75-588c-85b3-7699e063bb3b'})  2025-03-22 23:15:42.345727 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e04c708b-d456-5156-8bd3-75c09a375fc5', 'data_vg': 'ceph-e04c708b-d456-5156-8bd3-75c09a375fc5'})  2025-03-22 23:15:42.347627 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:42.348645 | orchestrator | 2025-03-22 23:15:42.350204 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_devices'] ******************************* 2025-03-22 23:15:42.540118 | orchestrator | Saturday 22 March 2025 23:15:42 +0000 (0:00:00.435) 0:00:24.040 ******** 2025-03-22 23:15:42.540198 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-43a113f0-cd75-588c-85b3-7699e063bb3b', 'data_vg': 'ceph-43a113f0-cd75-588c-85b3-7699e063bb3b'})  2025-03-22 23:15:42.540833 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e04c708b-d456-5156-8bd3-75c09a375fc5', 'data_vg': 'ceph-e04c708b-d456-5156-8bd3-75c09a375fc5'})  2025-03-22 23:15:42.540866 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:42.541741 | orchestrator | 2025-03-22 23:15:42.541772 | orchestrator | TASK [Create WAL LVs for ceph_wal_devices] ************************************* 2025-03-22 23:15:42.542437 | orchestrator | Saturday 22 March 2025 23:15:42 +0000 (0:00:00.195) 0:00:24.236 ******** 2025-03-22 23:15:42.724140 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-43a113f0-cd75-588c-85b3-7699e063bb3b', 'data_vg': 'ceph-43a113f0-cd75-588c-85b3-7699e063bb3b'})  2025-03-22 23:15:42.725008 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e04c708b-d456-5156-8bd3-75c09a375fc5', 'data_vg': 'ceph-e04c708b-d456-5156-8bd3-75c09a375fc5'})  2025-03-22 23:15:42.725748 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:42.726629 | orchestrator | 2025-03-22 23:15:42.726949 | orchestrator | TASK [Print 'Create WAL LVs for ceph_wal_devices'] ***************************** 2025-03-22 23:15:42.727587 | orchestrator | Saturday 22 March 2025 23:15:42 +0000 (0:00:00.184) 0:00:24.421 ******** 2025-03-22 23:15:42.920694 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-43a113f0-cd75-588c-85b3-7699e063bb3b', 'data_vg': 'ceph-43a113f0-cd75-588c-85b3-7699e063bb3b'})  2025-03-22 23:15:42.921120 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e04c708b-d456-5156-8bd3-75c09a375fc5', 'data_vg': 'ceph-e04c708b-d456-5156-8bd3-75c09a375fc5'})  2025-03-22 23:15:42.921155 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:42.921588 | orchestrator | 2025-03-22 23:15:42.922650 | orchestrator | TASK [Create WAL LVs for ceph_db_wal_devices] ********************************** 2025-03-22 23:15:42.922847 | orchestrator | Saturday 22 March 2025 23:15:42 +0000 (0:00:00.194) 0:00:24.616 ******** 2025-03-22 23:15:43.142718 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-43a113f0-cd75-588c-85b3-7699e063bb3b', 'data_vg': 
'ceph-43a113f0-cd75-588c-85b3-7699e063bb3b'})  2025-03-22 23:15:43.143918 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e04c708b-d456-5156-8bd3-75c09a375fc5', 'data_vg': 'ceph-e04c708b-d456-5156-8bd3-75c09a375fc5'})  2025-03-22 23:15:43.145559 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:43.145693 | orchestrator | 2025-03-22 23:15:43.146537 | orchestrator | TASK [Print 'Create WAL LVs for ceph_db_wal_devices'] ************************** 2025-03-22 23:15:43.147004 | orchestrator | Saturday 22 March 2025 23:15:43 +0000 (0:00:00.223) 0:00:24.839 ******** 2025-03-22 23:15:43.324654 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-43a113f0-cd75-588c-85b3-7699e063bb3b', 'data_vg': 'ceph-43a113f0-cd75-588c-85b3-7699e063bb3b'})  2025-03-22 23:15:43.325085 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e04c708b-d456-5156-8bd3-75c09a375fc5', 'data_vg': 'ceph-e04c708b-d456-5156-8bd3-75c09a375fc5'})  2025-03-22 23:15:43.326431 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:43.326909 | orchestrator | 2025-03-22 23:15:43.327260 | orchestrator | TASK [Create DB LVs for ceph_db_wal_devices] *********************************** 2025-03-22 23:15:43.327706 | orchestrator | Saturday 22 March 2025 23:15:43 +0000 (0:00:00.180) 0:00:25.020 ******** 2025-03-22 23:15:43.505007 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-43a113f0-cd75-588c-85b3-7699e063bb3b', 'data_vg': 'ceph-43a113f0-cd75-588c-85b3-7699e063bb3b'})  2025-03-22 23:15:43.505529 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e04c708b-d456-5156-8bd3-75c09a375fc5', 'data_vg': 'ceph-e04c708b-d456-5156-8bd3-75c09a375fc5'})  2025-03-22 23:15:43.506105 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:43.506836 | orchestrator | 2025-03-22 23:15:43.507585 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_wal_devices'] *************************** 2025-03-22 23:15:43.678820 | orchestrator | Saturday 22 March 2025 23:15:43 +0000 (0:00:00.181) 0:00:25.201 ******** 2025-03-22 23:15:43.678895 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-43a113f0-cd75-588c-85b3-7699e063bb3b', 'data_vg': 'ceph-43a113f0-cd75-588c-85b3-7699e063bb3b'})  2025-03-22 23:15:43.679596 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e04c708b-d456-5156-8bd3-75c09a375fc5', 'data_vg': 'ceph-e04c708b-d456-5156-8bd3-75c09a375fc5'})  2025-03-22 23:15:43.680194 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:43.680754 | orchestrator | 2025-03-22 23:15:43.681309 | orchestrator | TASK [Get list of Ceph LVs with associated VGs] ******************************** 2025-03-22 23:15:43.681697 | orchestrator | Saturday 22 March 2025 23:15:43 +0000 (0:00:00.169) 0:00:25.371 ******** 2025-03-22 23:15:44.240218 | orchestrator | ok: [testbed-node-3] 2025-03-22 23:15:44.242699 | orchestrator | 2025-03-22 23:15:44.774441 | orchestrator | TASK [Get list of Ceph PVs with associated VGs] ******************************** 2025-03-22 23:15:44.774596 | orchestrator | Saturday 22 March 2025 23:15:44 +0000 (0:00:00.560) 0:00:25.932 ******** 2025-03-22 23:15:44.774630 | orchestrator | ok: [testbed-node-3] 2025-03-22 23:15:44.775296 | orchestrator | 2025-03-22 23:15:44.775967 | orchestrator | TASK [Combine JSON from _lvs_cmd_output/_pvs_cmd_output] *********************** 2025-03-22 23:15:44.776848 | orchestrator | Saturday 22 March 2025 23:15:44 +0000 (0:00:00.537) 0:00:26.470 
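"Get list of Ceph LVs/PVs with associated VGs" records what actually exists on the node so it can be cross-checked against lvm_volumes and printed as lvm_report below. A sketch using the JSON report of lvs and pvs; the selection on the ceph- VG prefix is an assumption:

# Sketch only: discover existing Ceph LVs and PVs as JSON.
- name: Get list of Ceph LVs with associated VGs
  ansible.builtin.command:
    cmd: lvs --reportformat json -o lv_name,vg_name -S 'vg_name=~^ceph-'
  register: _lvs_cmd_output
  changed_when: false

- name: Get list of Ceph PVs with associated VGs
  ansible.builtin.command:
    cmd: pvs --reportformat json -o pv_name,vg_name -S 'vg_name=~^ceph-'
  register: _pvs_cmd_output
  changed_when: false

- name: Combine JSON from _lvs_cmd_output/_pvs_cmd_output
  ansible.builtin.set_fact:
    lvm_report:
      lv: "{{ (_lvs_cmd_output.stdout | from_json).report[0].lv }}"
      pv: "{{ (_pvs_cmd_output.stdout | from_json).report[0].pv }}"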
******** 2025-03-22 23:15:44.945872 | orchestrator | ok: [testbed-node-3] 2025-03-22 23:15:44.947012 | orchestrator | 2025-03-22 23:15:44.947840 | orchestrator | TASK [Create list of VG/LV names] ********************************************** 2025-03-22 23:15:44.948144 | orchestrator | Saturday 22 March 2025 23:15:44 +0000 (0:00:00.171) 0:00:26.641 ******** 2025-03-22 23:15:45.383243 | orchestrator | ok: [testbed-node-3] => (item={'lv_name': 'osd-block-43a113f0-cd75-588c-85b3-7699e063bb3b', 'vg_name': 'ceph-43a113f0-cd75-588c-85b3-7699e063bb3b'}) 2025-03-22 23:15:45.383393 | orchestrator | ok: [testbed-node-3] => (item={'lv_name': 'osd-block-e04c708b-d456-5156-8bd3-75c09a375fc5', 'vg_name': 'ceph-e04c708b-d456-5156-8bd3-75c09a375fc5'}) 2025-03-22 23:15:45.383418 | orchestrator | 2025-03-22 23:15:45.383938 | orchestrator | TASK [Fail if block LV defined in lvm_volumes is missing] ********************** 2025-03-22 23:15:45.385619 | orchestrator | Saturday 22 March 2025 23:15:45 +0000 (0:00:00.436) 0:00:27.077 ******** 2025-03-22 23:15:45.580638 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-43a113f0-cd75-588c-85b3-7699e063bb3b', 'data_vg': 'ceph-43a113f0-cd75-588c-85b3-7699e063bb3b'})  2025-03-22 23:15:45.582344 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e04c708b-d456-5156-8bd3-75c09a375fc5', 'data_vg': 'ceph-e04c708b-d456-5156-8bd3-75c09a375fc5'})  2025-03-22 23:15:45.583758 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:45.585333 | orchestrator | 2025-03-22 23:15:45.585643 | orchestrator | TASK [Fail if DB LV defined in lvm_volumes is missing] ************************* 2025-03-22 23:15:45.585674 | orchestrator | Saturday 22 March 2025 23:15:45 +0000 (0:00:00.200) 0:00:27.277 ******** 2025-03-22 23:15:45.805842 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-43a113f0-cd75-588c-85b3-7699e063bb3b', 'data_vg': 'ceph-43a113f0-cd75-588c-85b3-7699e063bb3b'})  2025-03-22 23:15:45.807001 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e04c708b-d456-5156-8bd3-75c09a375fc5', 'data_vg': 'ceph-e04c708b-d456-5156-8bd3-75c09a375fc5'})  2025-03-22 23:15:45.807654 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:45.808354 | orchestrator | 2025-03-22 23:15:45.809018 | orchestrator | TASK [Fail if WAL LV defined in lvm_volumes is missing] ************************ 2025-03-22 23:15:45.809416 | orchestrator | Saturday 22 March 2025 23:15:45 +0000 (0:00:00.224) 0:00:27.502 ******** 2025-03-22 23:15:45.992740 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-43a113f0-cd75-588c-85b3-7699e063bb3b', 'data_vg': 'ceph-43a113f0-cd75-588c-85b3-7699e063bb3b'})  2025-03-22 23:15:45.993823 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-e04c708b-d456-5156-8bd3-75c09a375fc5', 'data_vg': 'ceph-e04c708b-d456-5156-8bd3-75c09a375fc5'})  2025-03-22 23:15:45.994859 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:15:45.995268 | orchestrator | 2025-03-22 23:15:45.996155 | orchestrator | TASK [Print LVM report data] *************************************************** 2025-03-22 23:15:45.996761 | orchestrator | Saturday 22 March 2025 23:15:45 +0000 (0:00:00.185) 0:00:27.688 ******** 2025-03-22 23:15:47.007183 | orchestrator | ok: [testbed-node-3] => { 2025-03-22 23:15:47.007418 | orchestrator |  "lvm_report": { 2025-03-22 23:15:47.007726 | orchestrator |  "lv": [ 2025-03-22 23:15:47.007761 | orchestrator |  { 2025-03-22 23:15:47.007889 | 
orchestrator |  "lv_name": "osd-block-43a113f0-cd75-588c-85b3-7699e063bb3b", 2025-03-22 23:15:47.008147 | orchestrator |  "vg_name": "ceph-43a113f0-cd75-588c-85b3-7699e063bb3b" 2025-03-22 23:15:47.008508 | orchestrator |  }, 2025-03-22 23:15:47.008793 | orchestrator |  { 2025-03-22 23:15:47.008988 | orchestrator |  "lv_name": "osd-block-e04c708b-d456-5156-8bd3-75c09a375fc5", 2025-03-22 23:15:47.009563 | orchestrator |  "vg_name": "ceph-e04c708b-d456-5156-8bd3-75c09a375fc5" 2025-03-22 23:15:47.009813 | orchestrator |  } 2025-03-22 23:15:47.010175 | orchestrator |  ], 2025-03-22 23:15:47.011713 | orchestrator |  "pv": [ 2025-03-22 23:15:47.012252 | orchestrator |  { 2025-03-22 23:15:47.012591 | orchestrator |  "pv_name": "/dev/sdb", 2025-03-22 23:15:47.013129 | orchestrator |  "vg_name": "ceph-43a113f0-cd75-588c-85b3-7699e063bb3b" 2025-03-22 23:15:47.013329 | orchestrator |  }, 2025-03-22 23:15:47.016391 | orchestrator |  { 2025-03-22 23:15:47.017034 | orchestrator |  "pv_name": "/dev/sdc", 2025-03-22 23:15:47.017058 | orchestrator |  "vg_name": "ceph-e04c708b-d456-5156-8bd3-75c09a375fc5" 2025-03-22 23:15:47.017072 | orchestrator |  } 2025-03-22 23:15:47.017087 | orchestrator |  ] 2025-03-22 23:15:47.017102 | orchestrator |  } 2025-03-22 23:15:47.017116 | orchestrator | } 2025-03-22 23:15:47.017135 | orchestrator | 2025-03-22 23:15:47.017334 | orchestrator | PLAY [Ceph create LVM devices] ************************************************* 2025-03-22 23:15:47.017694 | orchestrator | 2025-03-22 23:15:47.017894 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2025-03-22 23:15:47.017922 | orchestrator | Saturday 22 March 2025 23:15:47 +0000 (0:00:01.015) 0:00:28.703 ******** 2025-03-22 23:15:47.279328 | orchestrator | ok: [testbed-node-4 -> testbed-manager(192.168.16.5)] 2025-03-22 23:15:47.280256 | orchestrator | 2025-03-22 23:15:47.280644 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2025-03-22 23:15:47.281380 | orchestrator | Saturday 22 March 2025 23:15:47 +0000 (0:00:00.272) 0:00:28.976 ******** 2025-03-22 23:15:47.523456 | orchestrator | ok: [testbed-node-4] 2025-03-22 23:15:47.524299 | orchestrator | 2025-03-22 23:15:47.524891 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:47.527940 | orchestrator | Saturday 22 March 2025 23:15:47 +0000 (0:00:00.243) 0:00:29.219 ******** 2025-03-22 23:15:48.045661 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop0) 2025-03-22 23:15:48.046142 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop1) 2025-03-22 23:15:48.050001 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop2) 2025-03-22 23:15:48.050118 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop3) 2025-03-22 23:15:48.050160 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop4) 2025-03-22 23:15:48.050181 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop5) 2025-03-22 23:15:48.050464 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop6) 2025-03-22 23:15:48.050828 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop7) 2025-03-22 23:15:48.051338 | orchestrator | included: 
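The lvm_report printed for testbed-node-3 confirms the end state of that play: two block LVs, one per ceph-<uuid> VG, backed by /dev/sdb and /dev/sdc, and no DB or WAL volumes. The "Create list of VG/LV names" and "Fail if ... missing" tasks that precede it cross-check lvm_volumes against that discovered state; a sketch of the idea, with every name other than lvm_volumes and lvm_report being illustrative:

# Sketch only: verify that every block LV requested in lvm_volumes exists.
- name: Create list of VG/LV names
  ansible.builtin.set_fact:
    _existing_vg_lv: "{{ _existing_vg_lv | default([]) + [item.vg_name ~ '/' ~ item.lv_name] }}"
  loop: "{{ lvm_report.lv }}"

- name: Fail if block LV defined in lvm_volumes is missing
  ansible.builtin.fail:
    msg: "Block LV {{ item.data_vg }}/{{ item.data }} does not exist"
  loop: "{{ lvm_volumes }}"
  when: (item.data_vg ~ '/' ~ item.data) not in (_existing_vg_lv | default([]))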
/ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sda) 2025-03-22 23:15:48.051769 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdb) 2025-03-22 23:15:48.052271 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdc) 2025-03-22 23:15:48.052563 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdd) 2025-03-22 23:15:48.053590 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sr0) 2025-03-22 23:15:48.056299 | orchestrator | 2025-03-22 23:15:48.056899 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:48.057365 | orchestrator | Saturday 22 March 2025 23:15:48 +0000 (0:00:00.521) 0:00:29.740 ******** 2025-03-22 23:15:48.280950 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:15:48.281174 | orchestrator | 2025-03-22 23:15:48.281680 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:48.282129 | orchestrator | Saturday 22 March 2025 23:15:48 +0000 (0:00:00.237) 0:00:29.978 ******** 2025-03-22 23:15:48.501167 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:15:48.501517 | orchestrator | 2025-03-22 23:15:48.501559 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:48.501740 | orchestrator | Saturday 22 March 2025 23:15:48 +0000 (0:00:00.219) 0:00:30.197 ******** 2025-03-22 23:15:48.725860 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:15:48.726829 | orchestrator | 2025-03-22 23:15:48.726864 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:48.728060 | orchestrator | Saturday 22 March 2025 23:15:48 +0000 (0:00:00.224) 0:00:30.421 ******** 2025-03-22 23:15:48.936917 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:15:48.937282 | orchestrator | 2025-03-22 23:15:48.937553 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:48.937861 | orchestrator | Saturday 22 March 2025 23:15:48 +0000 (0:00:00.212) 0:00:30.634 ******** 2025-03-22 23:15:49.173084 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:15:49.173541 | orchestrator | 2025-03-22 23:15:49.173993 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:49.174245 | orchestrator | Saturday 22 March 2025 23:15:49 +0000 (0:00:00.235) 0:00:30.869 ******** 2025-03-22 23:15:49.894292 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:15:49.894736 | orchestrator | 2025-03-22 23:15:49.894768 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:49.894789 | orchestrator | Saturday 22 March 2025 23:15:49 +0000 (0:00:00.720) 0:00:31.590 ******** 2025-03-22 23:15:50.138757 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:15:50.140231 | orchestrator | 2025-03-22 23:15:50.141099 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:50.142226 | orchestrator | Saturday 22 March 2025 23:15:50 +0000 (0:00:00.241) 0:00:31.832 ******** 2025-03-22 23:15:50.387159 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:15:50.387613 | orchestrator | 2025-03-22 23:15:50.388915 | orchestrator | TASK [Add known links to the list of 
available block devices] ****************** 2025-03-22 23:15:50.391534 | orchestrator | Saturday 22 March 2025 23:15:50 +0000 (0:00:00.251) 0:00:32.084 ******** 2025-03-22 23:15:50.816889 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_d5017ffd-d0d5-431d-84a2-17c0a06b39b8) 2025-03-22 23:15:50.818319 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_d5017ffd-d0d5-431d-84a2-17c0a06b39b8) 2025-03-22 23:15:50.818365 | orchestrator | 2025-03-22 23:15:50.819256 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:50.820125 | orchestrator | Saturday 22 March 2025 23:15:50 +0000 (0:00:00.429) 0:00:32.513 ******** 2025-03-22 23:15:51.315499 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_d6bc4934-4c34-4893-bb71-8a867393eb36) 2025-03-22 23:15:51.316628 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_d6bc4934-4c34-4893-bb71-8a867393eb36) 2025-03-22 23:15:51.319449 | orchestrator | 2025-03-22 23:15:51.319640 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:51.320578 | orchestrator | Saturday 22 March 2025 23:15:51 +0000 (0:00:00.498) 0:00:33.012 ******** 2025-03-22 23:15:51.777562 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_6454730d-d769-486b-8e2e-775b81470741) 2025-03-22 23:15:51.779039 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_6454730d-d769-486b-8e2e-775b81470741) 2025-03-22 23:15:51.779898 | orchestrator | 2025-03-22 23:15:51.780258 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:51.781429 | orchestrator | Saturday 22 March 2025 23:15:51 +0000 (0:00:00.460) 0:00:33.472 ******** 2025-03-22 23:15:52.280635 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_65c63fa5-5e6f-4c5d-b367-79c528cb404f) 2025-03-22 23:15:52.281021 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_65c63fa5-5e6f-4c5d-b367-79c528cb404f) 2025-03-22 23:15:52.281540 | orchestrator | 2025-03-22 23:15:52.282278 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:15:52.282715 | orchestrator | Saturday 22 March 2025 23:15:52 +0000 (0:00:00.505) 0:00:33.978 ******** 2025-03-22 23:15:52.650727 | orchestrator | ok: [testbed-node-4] => (item=ata-QEMU_DVD-ROM_QM00001) 2025-03-22 23:15:52.651507 | orchestrator | 2025-03-22 23:15:52.652540 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:52.653738 | orchestrator | Saturday 22 March 2025 23:15:52 +0000 (0:00:00.368) 0:00:34.346 ******** 2025-03-22 23:15:53.222222 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop0) 2025-03-22 23:15:53.223678 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop1) 2025-03-22 23:15:53.224021 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop2) 2025-03-22 23:15:53.225073 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop3) 2025-03-22 23:15:53.226646 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop4) 2025-03-22 23:15:53.228229 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for 
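The block of "Add known links to the list of available block devices" tasks comes from the included _add-device-links.yml: for every disk on testbed-node-4 it adds the stable /dev/disk/by-id aliases shown above (scsi-0QEMU_..., scsi-SQEMU_..., ata-QEMU_DVD-ROM_...) to the pool of candidate devices, so ceph_osd_devices may reference either the kernel name or a by-id link; _add-device-partitions.yml then does the same for partitions. A sketch of the link step based on Ansible's hardware facts, with ceph_available_devices and _device (the disk name handed in by the include) as illustrative names:

# Sketch only: append the by-id aliases of one disk to the candidate list.
- name: Add known links to the list of available block devices
  ansible.builtin.set_fact:
    ceph_available_devices: "{{ ceph_available_devices | default([]) + [item] }}"
  loop: "{{ ansible_facts.devices[_device].links.ids | default([]) }}"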
testbed-node-4 => (item=loop5) 2025-03-22 23:15:53.229695 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop6) 2025-03-22 23:15:53.230203 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop7) 2025-03-22 23:15:53.230969 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sda) 2025-03-22 23:15:53.231580 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdb) 2025-03-22 23:15:53.232416 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdc) 2025-03-22 23:15:53.233505 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdd) 2025-03-22 23:15:53.233897 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sr0) 2025-03-22 23:15:53.234736 | orchestrator | 2025-03-22 23:15:53.235637 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:53.236066 | orchestrator | Saturday 22 March 2025 23:15:53 +0000 (0:00:00.572) 0:00:34.919 ******** 2025-03-22 23:15:53.898615 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:15:53.898764 | orchestrator | 2025-03-22 23:15:53.898922 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:53.900469 | orchestrator | Saturday 22 March 2025 23:15:53 +0000 (0:00:00.674) 0:00:35.593 ******** 2025-03-22 23:15:54.142511 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:15:54.142726 | orchestrator | 2025-03-22 23:15:54.143245 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:54.368090 | orchestrator | Saturday 22 March 2025 23:15:54 +0000 (0:00:00.246) 0:00:35.840 ******** 2025-03-22 23:15:54.368175 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:15:54.368276 | orchestrator | 2025-03-22 23:15:54.368749 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:54.368896 | orchestrator | Saturday 22 March 2025 23:15:54 +0000 (0:00:00.225) 0:00:36.065 ******** 2025-03-22 23:15:54.570767 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:15:54.571700 | orchestrator | 2025-03-22 23:15:54.572817 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:54.573683 | orchestrator | Saturday 22 March 2025 23:15:54 +0000 (0:00:00.199) 0:00:36.265 ******** 2025-03-22 23:15:54.806159 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:15:54.806318 | orchestrator | 2025-03-22 23:15:54.807444 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:54.807561 | orchestrator | Saturday 22 March 2025 23:15:54 +0000 (0:00:00.236) 0:00:36.502 ******** 2025-03-22 23:15:55.008038 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:15:55.008803 | orchestrator | 2025-03-22 23:15:55.010108 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:55.010824 | orchestrator | Saturday 22 March 2025 23:15:55 +0000 (0:00:00.201) 0:00:36.704 ******** 2025-03-22 23:15:55.256398 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:15:55.257763 | orchestrator | 2025-03-22 23:15:55.258073 | orchestrator | TASK [Add known partitions to 
the list of available block devices] ************* 2025-03-22 23:15:55.259279 | orchestrator | Saturday 22 March 2025 23:15:55 +0000 (0:00:00.249) 0:00:36.953 ******** 2025-03-22 23:15:55.502663 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:15:55.504605 | orchestrator | 2025-03-22 23:15:55.505593 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:55.506236 | orchestrator | Saturday 22 March 2025 23:15:55 +0000 (0:00:00.245) 0:00:37.199 ******** 2025-03-22 23:15:56.493661 | orchestrator | ok: [testbed-node-4] => (item=sda1) 2025-03-22 23:15:56.494221 | orchestrator | ok: [testbed-node-4] => (item=sda14) 2025-03-22 23:15:56.495364 | orchestrator | ok: [testbed-node-4] => (item=sda15) 2025-03-22 23:15:56.496534 | orchestrator | ok: [testbed-node-4] => (item=sda16) 2025-03-22 23:15:56.497445 | orchestrator | 2025-03-22 23:15:56.498373 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:56.499045 | orchestrator | Saturday 22 March 2025 23:15:56 +0000 (0:00:00.990) 0:00:38.189 ******** 2025-03-22 23:15:56.718811 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:15:56.720138 | orchestrator | 2025-03-22 23:15:56.721052 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:56.722348 | orchestrator | Saturday 22 March 2025 23:15:56 +0000 (0:00:00.221) 0:00:38.411 ******** 2025-03-22 23:15:57.435622 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:15:57.436518 | orchestrator | 2025-03-22 23:15:57.437647 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:57.438359 | orchestrator | Saturday 22 March 2025 23:15:57 +0000 (0:00:00.720) 0:00:39.132 ******** 2025-03-22 23:15:57.655567 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:15:57.656833 | orchestrator | 2025-03-22 23:15:57.657638 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:15:57.658738 | orchestrator | Saturday 22 March 2025 23:15:57 +0000 (0:00:00.218) 0:00:39.351 ******** 2025-03-22 23:15:57.870983 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:15:57.871260 | orchestrator | 2025-03-22 23:15:57.871294 | orchestrator | TASK [Check whether ceph_db_wal_devices is used exclusively] ******************* 2025-03-22 23:15:57.871611 | orchestrator | Saturday 22 March 2025 23:15:57 +0000 (0:00:00.216) 0:00:39.568 ******** 2025-03-22 23:15:58.030185 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:15:58.031026 | orchestrator | 2025-03-22 23:15:58.031056 | orchestrator | TASK [Create dict of block VGs -> PVs from ceph_osd_devices] ******************* 2025-03-22 23:15:58.031079 | orchestrator | Saturday 22 March 2025 23:15:58 +0000 (0:00:00.157) 0:00:39.725 ******** 2025-03-22 23:15:58.274189 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'c38313fe-ae28-50de-b682-b60e7793463e'}}) 2025-03-22 23:15:58.274751 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'b5bb1ca0-c565-576e-8272-e005b053e8a9'}}) 2025-03-22 23:15:58.275582 | orchestrator | 2025-03-22 23:15:58.279910 | orchestrator | TASK [Create block VGs] ******************************************************** 2025-03-22 23:15:58.281131 | orchestrator | Saturday 22 March 2025 23:15:58 +0000 (0:00:00.245) 0:00:39.970 ******** 2025-03-22 23:16:00.420159 | 
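"Check whether ceph_db_wal_devices is used exclusively" is skipped on every node here because ceph_db_wal_devices is not configured. Judging by its name, it guards against listing a combined DB+WAL device a second time as a plain DB or WAL device; a speculative sketch, assuming these variables are dicts keyed by device name like ceph_osd_devices:

# Sketch only: a combined DB+WAL device must not appear in the other lists.
- name: Check whether ceph_db_wal_devices is used exclusively
  ansible.builtin.assert:
    that:
      - item not in (ceph_db_devices | default({}))
      - item not in (ceph_wal_devices | default({}))
    fail_msg: "{{ item }} is listed in ceph_db_wal_devices and must not be listed elsewhere"
  loop: "{{ ceph_db_wal_devices | default({}) | list }}"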
orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-c38313fe-ae28-50de-b682-b60e7793463e', 'data_vg': 'ceph-c38313fe-ae28-50de-b682-b60e7793463e'}) 2025-03-22 23:16:00.420381 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-b5bb1ca0-c565-576e-8272-e005b053e8a9', 'data_vg': 'ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9'}) 2025-03-22 23:16:00.420959 | orchestrator | 2025-03-22 23:16:00.421248 | orchestrator | TASK [Print 'Create block VGs'] ************************************************ 2025-03-22 23:16:00.421878 | orchestrator | Saturday 22 March 2025 23:16:00 +0000 (0:00:02.145) 0:00:42.116 ******** 2025-03-22 23:16:00.614152 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-c38313fe-ae28-50de-b682-b60e7793463e', 'data_vg': 'ceph-c38313fe-ae28-50de-b682-b60e7793463e'})  2025-03-22 23:16:00.614353 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-b5bb1ca0-c565-576e-8272-e005b053e8a9', 'data_vg': 'ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9'})  2025-03-22 23:16:00.614701 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:00.615443 | orchestrator | 2025-03-22 23:16:00.616188 | orchestrator | TASK [Create block LVs] ******************************************************** 2025-03-22 23:16:00.616702 | orchestrator | Saturday 22 March 2025 23:16:00 +0000 (0:00:00.193) 0:00:42.310 ******** 2025-03-22 23:16:01.977111 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-c38313fe-ae28-50de-b682-b60e7793463e', 'data_vg': 'ceph-c38313fe-ae28-50de-b682-b60e7793463e'}) 2025-03-22 23:16:01.977287 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-b5bb1ca0-c565-576e-8272-e005b053e8a9', 'data_vg': 'ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9'}) 2025-03-22 23:16:01.978197 | orchestrator | 2025-03-22 23:16:01.979883 | orchestrator | TASK [Print 'Create block LVs'] ************************************************ 2025-03-22 23:16:01.979947 | orchestrator | Saturday 22 March 2025 23:16:01 +0000 (0:00:01.361) 0:00:43.671 ******** 2025-03-22 23:16:02.158707 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-c38313fe-ae28-50de-b682-b60e7793463e', 'data_vg': 'ceph-c38313fe-ae28-50de-b682-b60e7793463e'})  2025-03-22 23:16:02.158836 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-b5bb1ca0-c565-576e-8272-e005b053e8a9', 'data_vg': 'ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9'})  2025-03-22 23:16:02.159952 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:02.160530 | orchestrator | 2025-03-22 23:16:02.161155 | orchestrator | TASK [Create DB VGs] *********************************************************** 2025-03-22 23:16:02.161842 | orchestrator | Saturday 22 March 2025 23:16:02 +0000 (0:00:00.183) 0:00:43.854 ******** 2025-03-22 23:16:02.534877 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:02.538676 | orchestrator | 2025-03-22 23:16:02.539468 | orchestrator | TASK [Print 'Create DB VGs'] *************************************************** 2025-03-22 23:16:02.539786 | orchestrator | Saturday 22 March 2025 23:16:02 +0000 (0:00:00.375) 0:00:44.230 ******** 2025-03-22 23:16:02.721831 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-c38313fe-ae28-50de-b682-b60e7793463e', 'data_vg': 'ceph-c38313fe-ae28-50de-b682-b60e7793463e'})  2025-03-22 23:16:02.723160 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-b5bb1ca0-c565-576e-8272-e005b053e8a9', 'data_vg': 
'ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9'})  2025-03-22 23:16:02.723932 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:02.725018 | orchestrator | 2025-03-22 23:16:02.725132 | orchestrator | TASK [Create WAL VGs] ********************************************************** 2025-03-22 23:16:02.725859 | orchestrator | Saturday 22 March 2025 23:16:02 +0000 (0:00:00.185) 0:00:44.416 ******** 2025-03-22 23:16:02.872237 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:02.872704 | orchestrator | 2025-03-22 23:16:02.872814 | orchestrator | TASK [Print 'Create WAL VGs'] ************************************************** 2025-03-22 23:16:02.873740 | orchestrator | Saturday 22 March 2025 23:16:02 +0000 (0:00:00.152) 0:00:44.569 ******** 2025-03-22 23:16:03.085004 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-c38313fe-ae28-50de-b682-b60e7793463e', 'data_vg': 'ceph-c38313fe-ae28-50de-b682-b60e7793463e'})  2025-03-22 23:16:03.086967 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-b5bb1ca0-c565-576e-8272-e005b053e8a9', 'data_vg': 'ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9'})  2025-03-22 23:16:03.090511 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:03.259939 | orchestrator | 2025-03-22 23:16:03.260001 | orchestrator | TASK [Create DB+WAL VGs] ******************************************************* 2025-03-22 23:16:03.260017 | orchestrator | Saturday 22 March 2025 23:16:03 +0000 (0:00:00.212) 0:00:44.781 ******** 2025-03-22 23:16:03.260042 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:03.260268 | orchestrator | 2025-03-22 23:16:03.261381 | orchestrator | TASK [Print 'Create DB+WAL VGs'] *********************************************** 2025-03-22 23:16:03.262128 | orchestrator | Saturday 22 March 2025 23:16:03 +0000 (0:00:00.174) 0:00:44.956 ******** 2025-03-22 23:16:03.445694 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-c38313fe-ae28-50de-b682-b60e7793463e', 'data_vg': 'ceph-c38313fe-ae28-50de-b682-b60e7793463e'})  2025-03-22 23:16:03.446312 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-b5bb1ca0-c565-576e-8272-e005b053e8a9', 'data_vg': 'ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9'})  2025-03-22 23:16:03.446397 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:03.447103 | orchestrator | 2025-03-22 23:16:03.447748 | orchestrator | TASK [Prepare variables for OSD count check] *********************************** 2025-03-22 23:16:03.448736 | orchestrator | Saturday 22 March 2025 23:16:03 +0000 (0:00:00.187) 0:00:45.143 ******** 2025-03-22 23:16:03.609539 | orchestrator | ok: [testbed-node-4] 2025-03-22 23:16:03.610346 | orchestrator | 2025-03-22 23:16:03.611021 | orchestrator | TASK [Count OSDs put on ceph_db_devices defined in lvm_volumes] **************** 2025-03-22 23:16:03.612174 | orchestrator | Saturday 22 March 2025 23:16:03 +0000 (0:00:00.162) 0:00:45.305 ******** 2025-03-22 23:16:03.781742 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-c38313fe-ae28-50de-b682-b60e7793463e', 'data_vg': 'ceph-c38313fe-ae28-50de-b682-b60e7793463e'})  2025-03-22 23:16:03.782273 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-b5bb1ca0-c565-576e-8272-e005b053e8a9', 'data_vg': 'ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9'})  2025-03-22 23:16:03.784246 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:03.785826 | orchestrator | 2025-03-22 23:16:03.786344 | orchestrator | TASK [Count OSDs put on 
ceph_wal_devices defined in lvm_volumes] *************** 2025-03-22 23:16:03.788610 | orchestrator | Saturday 22 March 2025 23:16:03 +0000 (0:00:00.172) 0:00:45.478 ******** 2025-03-22 23:16:03.973583 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-c38313fe-ae28-50de-b682-b60e7793463e', 'data_vg': 'ceph-c38313fe-ae28-50de-b682-b60e7793463e'})  2025-03-22 23:16:03.974522 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-b5bb1ca0-c565-576e-8272-e005b053e8a9', 'data_vg': 'ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9'})  2025-03-22 23:16:03.975257 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:03.978296 | orchestrator | 2025-03-22 23:16:03.978748 | orchestrator | TASK [Count OSDs put on ceph_db_wal_devices defined in lvm_volumes] ************ 2025-03-22 23:16:03.978777 | orchestrator | Saturday 22 March 2025 23:16:03 +0000 (0:00:00.192) 0:00:45.670 ******** 2025-03-22 23:16:04.159687 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-c38313fe-ae28-50de-b682-b60e7793463e', 'data_vg': 'ceph-c38313fe-ae28-50de-b682-b60e7793463e'})  2025-03-22 23:16:04.160239 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-b5bb1ca0-c565-576e-8272-e005b053e8a9', 'data_vg': 'ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9'})  2025-03-22 23:16:04.160856 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:04.161190 | orchestrator | 2025-03-22 23:16:04.161782 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB VG] ********************* 2025-03-22 23:16:04.162204 | orchestrator | Saturday 22 March 2025 23:16:04 +0000 (0:00:00.186) 0:00:45.856 ******** 2025-03-22 23:16:04.307799 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:04.308034 | orchestrator | 2025-03-22 23:16:04.308384 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a WAL VG] ******************** 2025-03-22 23:16:04.309334 | orchestrator | Saturday 22 March 2025 23:16:04 +0000 (0:00:00.148) 0:00:46.005 ******** 2025-03-22 23:16:04.681211 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:04.684695 | orchestrator | 2025-03-22 23:16:04.685122 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB+WAL VG] ***************** 2025-03-22 23:16:04.685144 | orchestrator | Saturday 22 March 2025 23:16:04 +0000 (0:00:00.371) 0:00:46.376 ******** 2025-03-22 23:16:04.833181 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:04.836073 | orchestrator | 2025-03-22 23:16:04.837534 | orchestrator | TASK [Print number of OSDs wanted per DB VG] *********************************** 2025-03-22 23:16:04.838667 | orchestrator | Saturday 22 March 2025 23:16:04 +0000 (0:00:00.152) 0:00:46.528 ******** 2025-03-22 23:16:04.992392 | orchestrator | ok: [testbed-node-4] => { 2025-03-22 23:16:04.993661 | orchestrator |  "_num_osds_wanted_per_db_vg": {} 2025-03-22 23:16:04.996370 | orchestrator | } 2025-03-22 23:16:04.997117 | orchestrator | 2025-03-22 23:16:04.997139 | orchestrator | TASK [Print number of OSDs wanted per WAL VG] ********************************** 2025-03-22 23:16:04.997785 | orchestrator | Saturday 22 March 2025 23:16:04 +0000 (0:00:00.159) 0:00:46.688 ******** 2025-03-22 23:16:05.134737 | orchestrator | ok: [testbed-node-4] => { 2025-03-22 23:16:05.135291 | orchestrator |  "_num_osds_wanted_per_wal_vg": {} 2025-03-22 23:16:05.136078 | orchestrator | } 2025-03-22 23:16:05.137154 | orchestrator | 2025-03-22 23:16:05.138143 | orchestrator | TASK [Print number of OSDs wanted per DB+WAL 
VG] ******************************* 2025-03-22 23:16:05.139007 | orchestrator | Saturday 22 March 2025 23:16:05 +0000 (0:00:00.141) 0:00:46.830 ******** 2025-03-22 23:16:05.312149 | orchestrator | ok: [testbed-node-4] => { 2025-03-22 23:16:05.314094 | orchestrator |  "_num_osds_wanted_per_db_wal_vg": {} 2025-03-22 23:16:05.314540 | orchestrator | } 2025-03-22 23:16:05.315471 | orchestrator | 2025-03-22 23:16:05.317035 | orchestrator | TASK [Gather DB VGs with total and available size in bytes] ******************** 2025-03-22 23:16:05.833436 | orchestrator | Saturday 22 March 2025 23:16:05 +0000 (0:00:00.178) 0:00:47.008 ******** 2025-03-22 23:16:05.833556 | orchestrator | ok: [testbed-node-4] 2025-03-22 23:16:05.833744 | orchestrator | 2025-03-22 23:16:05.834925 | orchestrator | TASK [Gather WAL VGs with total and available size in bytes] ******************* 2025-03-22 23:16:05.835272 | orchestrator | Saturday 22 March 2025 23:16:05 +0000 (0:00:00.520) 0:00:47.529 ******** 2025-03-22 23:16:06.392583 | orchestrator | ok: [testbed-node-4] 2025-03-22 23:16:06.393146 | orchestrator | 2025-03-22 23:16:06.393654 | orchestrator | TASK [Gather DB+WAL VGs with total and available size in bytes] **************** 2025-03-22 23:16:06.394422 | orchestrator | Saturday 22 March 2025 23:16:06 +0000 (0:00:00.559) 0:00:48.088 ******** 2025-03-22 23:16:06.923649 | orchestrator | ok: [testbed-node-4] 2025-03-22 23:16:06.924169 | orchestrator | 2025-03-22 23:16:06.924610 | orchestrator | TASK [Combine JSON from _db/wal/db_wal_vgs_cmd_output] ************************* 2025-03-22 23:16:06.925919 | orchestrator | Saturday 22 March 2025 23:16:06 +0000 (0:00:00.531) 0:00:48.620 ******** 2025-03-22 23:16:07.083859 | orchestrator | ok: [testbed-node-4] 2025-03-22 23:16:07.084240 | orchestrator | 2025-03-22 23:16:07.084706 | orchestrator | TASK [Calculate VG sizes (without buffer)] ************************************* 2025-03-22 23:16:07.085778 | orchestrator | Saturday 22 March 2025 23:16:07 +0000 (0:00:00.159) 0:00:48.780 ******** 2025-03-22 23:16:07.198619 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:07.319012 | orchestrator | 2025-03-22 23:16:07.319070 | orchestrator | TASK [Calculate VG sizes (with buffer)] **************************************** 2025-03-22 23:16:07.319087 | orchestrator | Saturday 22 March 2025 23:16:07 +0000 (0:00:00.113) 0:00:48.894 ******** 2025-03-22 23:16:07.319112 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:07.320118 | orchestrator | 2025-03-22 23:16:07.321126 | orchestrator | TASK [Print LVM VGs report data] *********************************************** 2025-03-22 23:16:07.322115 | orchestrator | Saturday 22 March 2025 23:16:07 +0000 (0:00:00.122) 0:00:49.016 ******** 2025-03-22 23:16:07.709002 | orchestrator | ok: [testbed-node-4] => { 2025-03-22 23:16:07.710074 | orchestrator |  "vgs_report": { 2025-03-22 23:16:07.711444 | orchestrator |  "vg": [] 2025-03-22 23:16:07.712576 | orchestrator |  } 2025-03-22 23:16:07.713668 | orchestrator | } 2025-03-22 23:16:07.715038 | orchestrator | 2025-03-22 23:16:07.715406 | orchestrator | TASK [Print LVM VG sizes] ****************************************************** 2025-03-22 23:16:07.716064 | orchestrator | Saturday 22 March 2025 23:16:07 +0000 (0:00:00.386) 0:00:49.402 ******** 2025-03-22 23:16:07.872001 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:07.872546 | orchestrator | 2025-03-22 23:16:07.873044 | orchestrator | TASK [Calculate size needed for LVs on ceph_db_devices] 
************************ 2025-03-22 23:16:07.876188 | orchestrator | Saturday 22 March 2025 23:16:07 +0000 (0:00:00.164) 0:00:49.566 ******** 2025-03-22 23:16:08.028177 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:08.029322 | orchestrator | 2025-03-22 23:16:08.029641 | orchestrator | TASK [Print size needed for LVs on ceph_db_devices] **************************** 2025-03-22 23:16:08.030589 | orchestrator | Saturday 22 March 2025 23:16:08 +0000 (0:00:00.158) 0:00:49.725 ******** 2025-03-22 23:16:08.173086 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:08.174229 | orchestrator | 2025-03-22 23:16:08.174270 | orchestrator | TASK [Fail if size of DB LVs on ceph_db_devices > available] ******************* 2025-03-22 23:16:08.311124 | orchestrator | Saturday 22 March 2025 23:16:08 +0000 (0:00:00.144) 0:00:49.870 ******** 2025-03-22 23:16:08.311242 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:08.311362 | orchestrator | 2025-03-22 23:16:08.311586 | orchestrator | TASK [Calculate size needed for LVs on ceph_wal_devices] *********************** 2025-03-22 23:16:08.311854 | orchestrator | Saturday 22 March 2025 23:16:08 +0000 (0:00:00.138) 0:00:50.008 ******** 2025-03-22 23:16:08.452834 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:08.453312 | orchestrator | 2025-03-22 23:16:08.453346 | orchestrator | TASK [Print size needed for LVs on ceph_wal_devices] *************************** 2025-03-22 23:16:08.453817 | orchestrator | Saturday 22 March 2025 23:16:08 +0000 (0:00:00.141) 0:00:50.150 ******** 2025-03-22 23:16:08.600533 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:08.601680 | orchestrator | 2025-03-22 23:16:08.603023 | orchestrator | TASK [Fail if size of WAL LVs on ceph_wal_devices > available] ***************** 2025-03-22 23:16:08.603177 | orchestrator | Saturday 22 March 2025 23:16:08 +0000 (0:00:00.146) 0:00:50.296 ******** 2025-03-22 23:16:08.754139 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:08.756172 | orchestrator | 2025-03-22 23:16:08.756324 | orchestrator | TASK [Calculate size needed for WAL LVs on ceph_db_wal_devices] **************** 2025-03-22 23:16:08.756664 | orchestrator | Saturday 22 March 2025 23:16:08 +0000 (0:00:00.153) 0:00:50.449 ******** 2025-03-22 23:16:08.928642 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:08.929122 | orchestrator | 2025-03-22 23:16:08.930459 | orchestrator | TASK [Print size needed for WAL LVs on ceph_db_wal_devices] ******************** 2025-03-22 23:16:08.930872 | orchestrator | Saturday 22 March 2025 23:16:08 +0000 (0:00:00.174) 0:00:50.623 ******** 2025-03-22 23:16:09.090314 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:09.091451 | orchestrator | 2025-03-22 23:16:09.091827 | orchestrator | TASK [Calculate size needed for DB LVs on ceph_db_wal_devices] ***************** 2025-03-22 23:16:09.093091 | orchestrator | Saturday 22 March 2025 23:16:09 +0000 (0:00:00.162) 0:00:50.786 ******** 2025-03-22 23:16:09.250087 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:09.250790 | orchestrator | 2025-03-22 23:16:09.251017 | orchestrator | TASK [Print size needed for DB LVs on ceph_db_wal_devices] ********************* 2025-03-22 23:16:09.251370 | orchestrator | Saturday 22 March 2025 23:16:09 +0000 (0:00:00.160) 0:00:50.947 ******** 2025-03-22 23:16:09.399065 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:09.399609 | orchestrator | 2025-03-22 23:16:09.400051 | orchestrator | TASK [Fail if size of DB+WAL LVs on 
ceph_db_wal_devices > available] *********** 2025-03-22 23:16:09.400687 | orchestrator | Saturday 22 March 2025 23:16:09 +0000 (0:00:00.148) 0:00:51.095 ******** 2025-03-22 23:16:09.777404 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:09.778092 | orchestrator | 2025-03-22 23:16:09.779852 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_devices] ************************* 2025-03-22 23:16:09.780569 | orchestrator | Saturday 22 March 2025 23:16:09 +0000 (0:00:00.376) 0:00:51.471 ******** 2025-03-22 23:16:09.938842 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:09.939673 | orchestrator | 2025-03-22 23:16:09.940555 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_wal_devices] ********************* 2025-03-22 23:16:09.941056 | orchestrator | Saturday 22 March 2025 23:16:09 +0000 (0:00:00.163) 0:00:51.635 ******** 2025-03-22 23:16:10.103326 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:10.104423 | orchestrator | 2025-03-22 23:16:10.105970 | orchestrator | TASK [Create DB LVs for ceph_db_devices] *************************************** 2025-03-22 23:16:10.107571 | orchestrator | Saturday 22 March 2025 23:16:10 +0000 (0:00:00.164) 0:00:51.799 ******** 2025-03-22 23:16:10.312620 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-c38313fe-ae28-50de-b682-b60e7793463e', 'data_vg': 'ceph-c38313fe-ae28-50de-b682-b60e7793463e'})  2025-03-22 23:16:10.312818 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-b5bb1ca0-c565-576e-8272-e005b053e8a9', 'data_vg': 'ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9'})  2025-03-22 23:16:10.314563 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:10.315782 | orchestrator | 2025-03-22 23:16:10.319093 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_devices'] ******************************* 2025-03-22 23:16:10.319330 | orchestrator | Saturday 22 March 2025 23:16:10 +0000 (0:00:00.208) 0:00:52.008 ******** 2025-03-22 23:16:10.481785 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-c38313fe-ae28-50de-b682-b60e7793463e', 'data_vg': 'ceph-c38313fe-ae28-50de-b682-b60e7793463e'})  2025-03-22 23:16:10.482563 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-b5bb1ca0-c565-576e-8272-e005b053e8a9', 'data_vg': 'ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9'})  2025-03-22 23:16:10.483250 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:10.483756 | orchestrator | 2025-03-22 23:16:10.484450 | orchestrator | TASK [Create WAL LVs for ceph_wal_devices] ************************************* 2025-03-22 23:16:10.484568 | orchestrator | Saturday 22 March 2025 23:16:10 +0000 (0:00:00.170) 0:00:52.179 ******** 2025-03-22 23:16:10.661369 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-c38313fe-ae28-50de-b682-b60e7793463e', 'data_vg': 'ceph-c38313fe-ae28-50de-b682-b60e7793463e'})  2025-03-22 23:16:10.661861 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-b5bb1ca0-c565-576e-8272-e005b053e8a9', 'data_vg': 'ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9'})  2025-03-22 23:16:10.662590 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:10.665520 | orchestrator | 2025-03-22 23:16:10.665552 | orchestrator | TASK [Print 'Create WAL LVs for ceph_wal_devices'] ***************************** 2025-03-22 23:16:10.666377 | orchestrator | Saturday 22 March 2025 23:16:10 +0000 (0:00:00.178) 0:00:52.358 ******** 2025-03-22 23:16:10.838464 | orchestrator | skipping: 
[testbed-node-4] => (item={'data': 'osd-block-c38313fe-ae28-50de-b682-b60e7793463e', 'data_vg': 'ceph-c38313fe-ae28-50de-b682-b60e7793463e'})  2025-03-22 23:16:10.838669 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-b5bb1ca0-c565-576e-8272-e005b053e8a9', 'data_vg': 'ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9'})  2025-03-22 23:16:10.840250 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:10.841104 | orchestrator | 2025-03-22 23:16:10.841780 | orchestrator | TASK [Create WAL LVs for ceph_db_wal_devices] ********************************** 2025-03-22 23:16:10.842531 | orchestrator | Saturday 22 March 2025 23:16:10 +0000 (0:00:00.176) 0:00:52.534 ******** 2025-03-22 23:16:11.029695 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-c38313fe-ae28-50de-b682-b60e7793463e', 'data_vg': 'ceph-c38313fe-ae28-50de-b682-b60e7793463e'})  2025-03-22 23:16:11.030793 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-b5bb1ca0-c565-576e-8272-e005b053e8a9', 'data_vg': 'ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9'})  2025-03-22 23:16:11.031512 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:11.032410 | orchestrator | 2025-03-22 23:16:11.032734 | orchestrator | TASK [Print 'Create WAL LVs for ceph_db_wal_devices'] ************************** 2025-03-22 23:16:11.033103 | orchestrator | Saturday 22 March 2025 23:16:11 +0000 (0:00:00.191) 0:00:52.726 ******** 2025-03-22 23:16:11.205373 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-c38313fe-ae28-50de-b682-b60e7793463e', 'data_vg': 'ceph-c38313fe-ae28-50de-b682-b60e7793463e'})  2025-03-22 23:16:11.206590 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-b5bb1ca0-c565-576e-8272-e005b053e8a9', 'data_vg': 'ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9'})  2025-03-22 23:16:11.208002 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:11.208571 | orchestrator | 2025-03-22 23:16:11.210278 | orchestrator | TASK [Create DB LVs for ceph_db_wal_devices] *********************************** 2025-03-22 23:16:11.210560 | orchestrator | Saturday 22 March 2025 23:16:11 +0000 (0:00:00.174) 0:00:52.901 ******** 2025-03-22 23:16:11.381574 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-c38313fe-ae28-50de-b682-b60e7793463e', 'data_vg': 'ceph-c38313fe-ae28-50de-b682-b60e7793463e'})  2025-03-22 23:16:11.382096 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-b5bb1ca0-c565-576e-8272-e005b053e8a9', 'data_vg': 'ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9'})  2025-03-22 23:16:11.382608 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:11.383454 | orchestrator | 2025-03-22 23:16:11.384098 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_wal_devices'] *************************** 2025-03-22 23:16:11.384839 | orchestrator | Saturday 22 March 2025 23:16:11 +0000 (0:00:00.176) 0:00:53.078 ******** 2025-03-22 23:16:11.547058 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-c38313fe-ae28-50de-b682-b60e7793463e', 'data_vg': 'ceph-c38313fe-ae28-50de-b682-b60e7793463e'})  2025-03-22 23:16:11.548500 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-b5bb1ca0-c565-576e-8272-e005b053e8a9', 'data_vg': 'ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9'})  2025-03-22 23:16:11.549284 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:11.549608 | orchestrator | 2025-03-22 23:16:11.550779 | orchestrator | TASK [Get list of Ceph LVs with associated VGs] 
******************************** 2025-03-22 23:16:11.553721 | orchestrator | Saturday 22 March 2025 23:16:11 +0000 (0:00:00.165) 0:00:53.243 ******** 2025-03-22 23:16:12.326194 | orchestrator | ok: [testbed-node-4] 2025-03-22 23:16:12.327284 | orchestrator | 2025-03-22 23:16:12.328274 | orchestrator | TASK [Get list of Ceph PVs with associated VGs] ******************************** 2025-03-22 23:16:12.328296 | orchestrator | Saturday 22 March 2025 23:16:12 +0000 (0:00:00.778) 0:00:54.022 ******** 2025-03-22 23:16:12.891767 | orchestrator | ok: [testbed-node-4] 2025-03-22 23:16:12.892424 | orchestrator | 2025-03-22 23:16:12.893148 | orchestrator | TASK [Combine JSON from _lvs_cmd_output/_pvs_cmd_output] *********************** 2025-03-22 23:16:12.894007 | orchestrator | Saturday 22 March 2025 23:16:12 +0000 (0:00:00.564) 0:00:54.587 ******** 2025-03-22 23:16:13.078633 | orchestrator | ok: [testbed-node-4] 2025-03-22 23:16:13.079211 | orchestrator | 2025-03-22 23:16:13.079635 | orchestrator | TASK [Create list of VG/LV names] ********************************************** 2025-03-22 23:16:13.080362 | orchestrator | Saturday 22 March 2025 23:16:13 +0000 (0:00:00.188) 0:00:54.775 ******** 2025-03-22 23:16:13.296662 | orchestrator | ok: [testbed-node-4] => (item={'lv_name': 'osd-block-b5bb1ca0-c565-576e-8272-e005b053e8a9', 'vg_name': 'ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9'}) 2025-03-22 23:16:13.296955 | orchestrator | ok: [testbed-node-4] => (item={'lv_name': 'osd-block-c38313fe-ae28-50de-b682-b60e7793463e', 'vg_name': 'ceph-c38313fe-ae28-50de-b682-b60e7793463e'}) 2025-03-22 23:16:13.297513 | orchestrator | 2025-03-22 23:16:13.297945 | orchestrator | TASK [Fail if block LV defined in lvm_volumes is missing] ********************** 2025-03-22 23:16:13.298531 | orchestrator | Saturday 22 March 2025 23:16:13 +0000 (0:00:00.218) 0:00:54.994 ******** 2025-03-22 23:16:13.482100 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-c38313fe-ae28-50de-b682-b60e7793463e', 'data_vg': 'ceph-c38313fe-ae28-50de-b682-b60e7793463e'})  2025-03-22 23:16:13.482922 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-b5bb1ca0-c565-576e-8272-e005b053e8a9', 'data_vg': 'ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9'})  2025-03-22 23:16:13.482945 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:13.484192 | orchestrator | 2025-03-22 23:16:13.485594 | orchestrator | TASK [Fail if DB LV defined in lvm_volumes is missing] ************************* 2025-03-22 23:16:13.688382 | orchestrator | Saturday 22 March 2025 23:16:13 +0000 (0:00:00.183) 0:00:55.178 ******** 2025-03-22 23:16:13.688424 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-c38313fe-ae28-50de-b682-b60e7793463e', 'data_vg': 'ceph-c38313fe-ae28-50de-b682-b60e7793463e'})  2025-03-22 23:16:13.690600 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-b5bb1ca0-c565-576e-8272-e005b053e8a9', 'data_vg': 'ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9'})  2025-03-22 23:16:13.691036 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:13.692122 | orchestrator | 2025-03-22 23:16:13.692822 | orchestrator | TASK [Fail if WAL LV defined in lvm_volumes is missing] ************************ 2025-03-22 23:16:13.693583 | orchestrator | Saturday 22 March 2025 23:16:13 +0000 (0:00:00.207) 0:00:55.385 ******** 2025-03-22 23:16:13.870414 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-c38313fe-ae28-50de-b682-b60e7793463e', 'data_vg': 
'ceph-c38313fe-ae28-50de-b682-b60e7793463e'})
2025-03-22 23:16:13.871252 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-b5bb1ca0-c565-576e-8272-e005b053e8a9', 'data_vg': 'ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9'})
2025-03-22 23:16:13.871709 | orchestrator | skipping: [testbed-node-4]
2025-03-22 23:16:13.871889 | orchestrator |
2025-03-22 23:16:13.872742 | orchestrator | TASK [Print LVM report data] ***************************************************
2025-03-22 23:16:13.872998 | orchestrator | Saturday 22 March 2025 23:16:13 +0000 (0:00:00.180) 0:00:55.566 ********
2025-03-22 23:16:14.898617 | orchestrator | ok: [testbed-node-4] => {
2025-03-22 23:16:14.899641 | orchestrator |  "lvm_report": {
2025-03-22 23:16:14.901752 | orchestrator |  "lv": [
2025-03-22 23:16:14.902453 | orchestrator |  {
2025-03-22 23:16:14.904309 | orchestrator |  "lv_name": "osd-block-b5bb1ca0-c565-576e-8272-e005b053e8a9",
2025-03-22 23:16:14.904734 | orchestrator |  "vg_name": "ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9"
2025-03-22 23:16:14.905458 | orchestrator |  },
2025-03-22 23:16:14.905687 | orchestrator |  {
2025-03-22 23:16:14.906767 | orchestrator |  "lv_name": "osd-block-c38313fe-ae28-50de-b682-b60e7793463e",
2025-03-22 23:16:14.907506 | orchestrator |  "vg_name": "ceph-c38313fe-ae28-50de-b682-b60e7793463e"
2025-03-22 23:16:14.907903 | orchestrator |  }
2025-03-22 23:16:14.908691 | orchestrator |  ],
2025-03-22 23:16:14.909081 | orchestrator |  "pv": [
2025-03-22 23:16:14.912427 | orchestrator |  {
2025-03-22 23:16:14.913364 | orchestrator |  "pv_name": "/dev/sdb",
2025-03-22 23:16:14.913404 | orchestrator |  "vg_name": "ceph-c38313fe-ae28-50de-b682-b60e7793463e"
2025-03-22 23:16:14.915413 | orchestrator |  },
2025-03-22 23:16:14.915537 | orchestrator |  {
2025-03-22 23:16:14.916313 | orchestrator |  "pv_name": "/dev/sdc",
2025-03-22 23:16:14.918397 | orchestrator |  "vg_name": "ceph-b5bb1ca0-c565-576e-8272-e005b053e8a9"
2025-03-22 23:16:14.919582 | orchestrator |  }
2025-03-22 23:16:14.920355 | orchestrator |  ]
2025-03-22 23:16:14.922184 | orchestrator |  }
2025-03-22 23:16:14.923155 | orchestrator | }
2025-03-22 23:16:14.925633 | orchestrator |
2025-03-22 23:16:14.925958 | orchestrator | PLAY [Ceph create LVM devices] *************************************************
2025-03-22 23:16:14.926822 | orchestrator |
2025-03-22 23:16:14.929180 | orchestrator | TASK [Get extra vars for Ceph configuration] ***********************************
2025-03-22 23:16:14.929699 | orchestrator | Saturday 22 March 2025 23:16:14 +0000 (0:00:01.025) 0:00:56.592 ********
2025-03-22 23:16:15.190196 | orchestrator | ok: [testbed-node-5 -> testbed-manager(192.168.16.5)]
2025-03-22 23:16:15.191443 | orchestrator |
2025-03-22 23:16:15.192949 | orchestrator | TASK [Get initial list of available block devices] *****************************
2025-03-22 23:16:15.193760 | orchestrator | Saturday 22 March 2025 23:16:15 +0000 (0:00:00.298) 0:00:56.886 ********
2025-03-22 23:16:15.488757 | orchestrator | ok: [testbed-node-5]
2025-03-22 23:16:15.489756 | orchestrator |
2025-03-22 23:16:15.490696 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-03-22 23:16:15.490772 | orchestrator | Saturday 22 March 2025 23:16:15 +0000 (0:00:00.298) 0:00:57.185 ********
2025-03-22 23:16:16.033935 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop0)
2025-03-22 23:16:16.037682 | orchestrator | included:
/ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop1) 2025-03-22 23:16:16.037735 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop2) 2025-03-22 23:16:16.038898 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop3) 2025-03-22 23:16:16.040737 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop4) 2025-03-22 23:16:16.041414 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop5) 2025-03-22 23:16:16.041853 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop6) 2025-03-22 23:16:16.042602 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop7) 2025-03-22 23:16:16.043076 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sda) 2025-03-22 23:16:16.044456 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdb) 2025-03-22 23:16:16.046568 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdc) 2025-03-22 23:16:16.046601 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdd) 2025-03-22 23:16:16.046829 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sr0) 2025-03-22 23:16:16.047594 | orchestrator | 2025-03-22 23:16:16.048172 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:16:16.048851 | orchestrator | Saturday 22 March 2025 23:16:16 +0000 (0:00:00.542) 0:00:57.727 ******** 2025-03-22 23:16:16.263642 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:16.263787 | orchestrator | 2025-03-22 23:16:16.264342 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:16:16.264830 | orchestrator | Saturday 22 March 2025 23:16:16 +0000 (0:00:00.231) 0:00:57.959 ******** 2025-03-22 23:16:16.477975 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:16.478462 | orchestrator | 2025-03-22 23:16:16.478976 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:16:16.479408 | orchestrator | Saturday 22 March 2025 23:16:16 +0000 (0:00:00.215) 0:00:58.175 ******** 2025-03-22 23:16:16.680003 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:16.680365 | orchestrator | 2025-03-22 23:16:16.681312 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:16:16.681771 | orchestrator | Saturday 22 March 2025 23:16:16 +0000 (0:00:00.201) 0:00:58.377 ******** 2025-03-22 23:16:17.258570 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:17.258751 | orchestrator | 2025-03-22 23:16:17.258779 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:16:17.481336 | orchestrator | Saturday 22 March 2025 23:16:17 +0000 (0:00:00.577) 0:00:58.954 ******** 2025-03-22 23:16:17.481429 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:17.481572 | orchestrator | 2025-03-22 23:16:17.482151 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:16:17.482556 | orchestrator | Saturday 22 March 2025 23:16:17 +0000 (0:00:00.223) 0:00:59.178 ******** 2025-03-22 23:16:17.734635 | 
orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:17.734860 | orchestrator | 2025-03-22 23:16:17.736666 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:16:17.736855 | orchestrator | Saturday 22 March 2025 23:16:17 +0000 (0:00:00.252) 0:00:59.430 ******** 2025-03-22 23:16:17.964800 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:17.965819 | orchestrator | 2025-03-22 23:16:17.967407 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:16:17.968349 | orchestrator | Saturday 22 March 2025 23:16:17 +0000 (0:00:00.229) 0:00:59.660 ******** 2025-03-22 23:16:18.206158 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:18.206409 | orchestrator | 2025-03-22 23:16:18.207689 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:16:18.208672 | orchestrator | Saturday 22 March 2025 23:16:18 +0000 (0:00:00.242) 0:00:59.903 ******** 2025-03-22 23:16:18.687723 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_b23a3080-52c6-488b-8c5e-d7619a688699) 2025-03-22 23:16:18.688340 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_b23a3080-52c6-488b-8c5e-d7619a688699) 2025-03-22 23:16:18.688695 | orchestrator | 2025-03-22 23:16:18.689564 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:16:18.690090 | orchestrator | Saturday 22 March 2025 23:16:18 +0000 (0:00:00.480) 0:01:00.384 ******** 2025-03-22 23:16:19.211677 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_681c10dc-f1f8-4703-92fb-54cdfa604000) 2025-03-22 23:16:19.211830 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_681c10dc-f1f8-4703-92fb-54cdfa604000) 2025-03-22 23:16:19.212306 | orchestrator | 2025-03-22 23:16:19.212339 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:16:19.212475 | orchestrator | Saturday 22 March 2025 23:16:19 +0000 (0:00:00.522) 0:01:00.906 ******** 2025-03-22 23:16:19.741981 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_0f58cf45-cc6c-41c9-84ae-96e36ead1340) 2025-03-22 23:16:19.742531 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_0f58cf45-cc6c-41c9-84ae-96e36ead1340) 2025-03-22 23:16:19.742571 | orchestrator | 2025-03-22 23:16:19.743995 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:16:20.437012 | orchestrator | Saturday 22 March 2025 23:16:19 +0000 (0:00:00.530) 0:01:01.437 ******** 2025-03-22 23:16:20.437158 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_9647d6dd-7a49-4e0c-bf6a-16b92e91fe66) 2025-03-22 23:16:20.437225 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_9647d6dd-7a49-4e0c-bf6a-16b92e91fe66) 2025-03-22 23:16:20.437245 | orchestrator | 2025-03-22 23:16:20.437265 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-03-22 23:16:20.437662 | orchestrator | Saturday 22 March 2025 23:16:20 +0000 (0:00:00.692) 0:01:02.130 ******** 2025-03-22 23:16:21.267747 | orchestrator | ok: [testbed-node-5] => (item=ata-QEMU_DVD-ROM_QM00001) 2025-03-22 23:16:21.268513 | orchestrator | 2025-03-22 23:16:21.268999 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 
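(The included _add-device-links.yml and _add-device-partitions.yml task files are not part of this log. Purely as an illustration, the sketch below collects the same kind of information, by-id link names such as scsi-0QEMU_QEMU_HARDDISK_... and partition names such as sda1, from standard Ansible hardware facts; the variable name _available_block_devices and the host are assumptions, not the OSISM implementation.)

- name: Collect block device names, by-id links and partitions (illustrative sketch)
  hosts: testbed-node-5
  gather_facts: true
  tasks:
    - name: Start with the kernel device names (sda, sdb, sdc, ...)
      ansible.builtin.set_fact:
        _available_block_devices: "{{ ansible_facts['devices'].keys() | list }}"

    - name: Add the /dev/disk/by-id link names reported for each device
      ansible.builtin.set_fact:
        _available_block_devices: >-
          {{ _available_block_devices
             + (ansible_facts['devices'][item]['links']['ids'] | default([])) }}
      loop: "{{ ansible_facts['devices'].keys() | list }}"

    - name: Add the partition names reported for each device
      ansible.builtin.set_fact:
        _available_block_devices: >-
          {{ _available_block_devices
             + (ansible_facts['devices'][item]['partitions'].keys() | list) }}
      loop: "{{ ansible_facts['devices'].keys() | list }}"
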
2025-03-22 23:16:21.269891 | orchestrator | Saturday 22 March 2025 23:16:21 +0000 (0:00:00.831) 0:01:02.962 ******** 2025-03-22 23:16:21.834252 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop0) 2025-03-22 23:16:21.836518 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop1) 2025-03-22 23:16:21.836624 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop2) 2025-03-22 23:16:21.838204 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop3) 2025-03-22 23:16:21.839638 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop4) 2025-03-22 23:16:21.840609 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop5) 2025-03-22 23:16:21.841740 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop6) 2025-03-22 23:16:21.842395 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop7) 2025-03-22 23:16:21.842797 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sda) 2025-03-22 23:16:21.843858 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdb) 2025-03-22 23:16:21.844206 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdc) 2025-03-22 23:16:21.844773 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdd) 2025-03-22 23:16:21.845241 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sr0) 2025-03-22 23:16:21.846189 | orchestrator | 2025-03-22 23:16:21.847301 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:16:21.847909 | orchestrator | Saturday 22 March 2025 23:16:21 +0000 (0:00:00.568) 0:01:03.530 ******** 2025-03-22 23:16:22.075588 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:22.075771 | orchestrator | 2025-03-22 23:16:22.075800 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:16:22.306948 | orchestrator | Saturday 22 March 2025 23:16:22 +0000 (0:00:00.242) 0:01:03.772 ******** 2025-03-22 23:16:22.307029 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:22.308736 | orchestrator | 2025-03-22 23:16:22.308766 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:16:22.309219 | orchestrator | Saturday 22 March 2025 23:16:22 +0000 (0:00:00.225) 0:01:03.998 ******** 2025-03-22 23:16:22.521550 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:22.522074 | orchestrator | 2025-03-22 23:16:22.522695 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:16:22.522743 | orchestrator | Saturday 22 March 2025 23:16:22 +0000 (0:00:00.219) 0:01:04.217 ******** 2025-03-22 23:16:22.752941 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:22.753126 | orchestrator | 2025-03-22 23:16:22.754305 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:16:22.754565 | orchestrator | Saturday 22 March 2025 23:16:22 +0000 (0:00:00.231) 0:01:04.449 ******** 2025-03-22 23:16:22.987711 | 
orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:22.988349 | orchestrator | 2025-03-22 23:16:22.989473 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:16:22.989715 | orchestrator | Saturday 22 March 2025 23:16:22 +0000 (0:00:00.233) 0:01:04.682 ******** 2025-03-22 23:16:23.271825 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:23.273078 | orchestrator | 2025-03-22 23:16:23.274203 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:16:23.274959 | orchestrator | Saturday 22 March 2025 23:16:23 +0000 (0:00:00.284) 0:01:04.967 ******** 2025-03-22 23:16:23.509035 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:23.509151 | orchestrator | 2025-03-22 23:16:23.509942 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:16:23.510383 | orchestrator | Saturday 22 March 2025 23:16:23 +0000 (0:00:00.239) 0:01:05.206 ******** 2025-03-22 23:16:23.760162 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:23.761664 | orchestrator | 2025-03-22 23:16:23.762661 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:16:23.763458 | orchestrator | Saturday 22 March 2025 23:16:23 +0000 (0:00:00.249) 0:01:05.455 ******** 2025-03-22 23:16:25.035047 | orchestrator | ok: [testbed-node-5] => (item=sda1) 2025-03-22 23:16:25.035226 | orchestrator | ok: [testbed-node-5] => (item=sda14) 2025-03-22 23:16:25.035252 | orchestrator | ok: [testbed-node-5] => (item=sda15) 2025-03-22 23:16:25.036181 | orchestrator | ok: [testbed-node-5] => (item=sda16) 2025-03-22 23:16:25.036614 | orchestrator | 2025-03-22 23:16:25.037335 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:16:25.037697 | orchestrator | Saturday 22 March 2025 23:16:25 +0000 (0:00:01.271) 0:01:06.726 ******** 2025-03-22 23:16:25.281917 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:25.283084 | orchestrator | 2025-03-22 23:16:25.284287 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:16:25.284988 | orchestrator | Saturday 22 March 2025 23:16:25 +0000 (0:00:00.251) 0:01:06.978 ******** 2025-03-22 23:16:25.549706 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:25.549991 | orchestrator | 2025-03-22 23:16:25.550063 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:16:25.550239 | orchestrator | Saturday 22 March 2025 23:16:25 +0000 (0:00:00.267) 0:01:07.246 ******** 2025-03-22 23:16:25.773183 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:25.773428 | orchestrator | 2025-03-22 23:16:25.773785 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-03-22 23:16:25.773981 | orchestrator | Saturday 22 March 2025 23:16:25 +0000 (0:00:00.224) 0:01:07.470 ******** 2025-03-22 23:16:25.986784 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:25.987335 | orchestrator | 2025-03-22 23:16:25.988541 | orchestrator | TASK [Check whether ceph_db_wal_devices is used exclusively] ******************* 2025-03-22 23:16:25.988969 | orchestrator | Saturday 22 March 2025 23:16:25 +0000 (0:00:00.211) 0:01:07.682 ******** 2025-03-22 23:16:26.123411 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:26.123561 | orchestrator | 
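(The "Create dict of block VGs -> PVs from ceph_osd_devices", "Create block VGs" and "Create block LVs" tasks reported next create one volume group named ceph-<uuid> per entry of ceph_osd_devices and one logical volume named osd-block-<uuid> that fills it. The sketch below shows equivalent standalone tasks using the community.general LVM modules; it only assumes the ceph_osd_devices shape visible in this log and is an illustration, not the actual OSISM task code.)

- name: Create one block VG and one block LV per OSD device (illustrative sketch)
  hosts: testbed-node-5
  become: true
  vars:
    ceph_osd_devices:
      sdb:
        osd_lvm_uuid: b06d94ac-41ed-5dd0-ab65-3af10e523950
      sdc:
        osd_lvm_uuid: f9e87843-61b1-54ad-82c0-8d76e40ccf36
  tasks:
    - name: Create the volume group ceph-<uuid> on the raw device
      community.general.lvg:
        vg: "ceph-{{ item.value.osd_lvm_uuid }}"
        pvs: "/dev/{{ item.key }}"
      loop: "{{ ceph_osd_devices | dict2items }}"

    - name: Create the logical volume osd-block-<uuid> spanning the whole VG
      community.general.lvol:
        vg: "ceph-{{ item.value.osd_lvm_uuid }}"
        lv: "osd-block-{{ item.value.osd_lvm_uuid }}"
        size: 100%VG
      loop: "{{ ceph_osd_devices | dict2items }}"
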
2025-03-22 23:16:26.123807 | orchestrator | TASK [Create dict of block VGs -> PVs from ceph_osd_devices] ******************* 2025-03-22 23:16:26.124584 | orchestrator | Saturday 22 March 2025 23:16:26 +0000 (0:00:00.138) 0:01:07.820 ******** 2025-03-22 23:16:26.382775 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'b06d94ac-41ed-5dd0-ab65-3af10e523950'}}) 2025-03-22 23:16:26.383248 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'f9e87843-61b1-54ad-82c0-8d76e40ccf36'}}) 2025-03-22 23:16:26.384113 | orchestrator | 2025-03-22 23:16:26.385525 | orchestrator | TASK [Create block VGs] ******************************************************** 2025-03-22 23:16:26.386317 | orchestrator | Saturday 22 March 2025 23:16:26 +0000 (0:00:00.259) 0:01:08.079 ******** 2025-03-22 23:16:28.570446 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950', 'data_vg': 'ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950'}) 2025-03-22 23:16:28.571134 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36', 'data_vg': 'ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36'}) 2025-03-22 23:16:28.572174 | orchestrator | 2025-03-22 23:16:28.575831 | orchestrator | TASK [Print 'Create block VGs'] ************************************************ 2025-03-22 23:16:28.754806 | orchestrator | Saturday 22 March 2025 23:16:28 +0000 (0:00:02.186) 0:01:10.265 ******** 2025-03-22 23:16:28.754893 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950', 'data_vg': 'ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950'})  2025-03-22 23:16:28.755266 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36', 'data_vg': 'ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36'})  2025-03-22 23:16:28.755445 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:28.756164 | orchestrator | 2025-03-22 23:16:28.756558 | orchestrator | TASK [Create block LVs] ******************************************************** 2025-03-22 23:16:28.757288 | orchestrator | Saturday 22 March 2025 23:16:28 +0000 (0:00:00.185) 0:01:10.451 ******** 2025-03-22 23:16:30.121374 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950', 'data_vg': 'ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950'}) 2025-03-22 23:16:30.122179 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36', 'data_vg': 'ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36'}) 2025-03-22 23:16:30.122223 | orchestrator | 2025-03-22 23:16:30.122284 | orchestrator | TASK [Print 'Create block LVs'] ************************************************ 2025-03-22 23:16:30.122684 | orchestrator | Saturday 22 March 2025 23:16:30 +0000 (0:00:01.364) 0:01:11.816 ******** 2025-03-22 23:16:30.292870 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950', 'data_vg': 'ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950'})  2025-03-22 23:16:30.294777 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36', 'data_vg': 'ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36'})  2025-03-22 23:16:30.295626 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:30.299151 | orchestrator | 2025-03-22 23:16:30.299255 | orchestrator | TASK [Create DB VGs] 
*********************************************************** 2025-03-22 23:16:30.299310 | orchestrator | Saturday 22 March 2025 23:16:30 +0000 (0:00:00.172) 0:01:11.988 ******** 2025-03-22 23:16:30.456368 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:30.456710 | orchestrator | 2025-03-22 23:16:30.457230 | orchestrator | TASK [Print 'Create DB VGs'] *************************************************** 2025-03-22 23:16:30.457584 | orchestrator | Saturday 22 March 2025 23:16:30 +0000 (0:00:00.164) 0:01:12.153 ******** 2025-03-22 23:16:30.635711 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950', 'data_vg': 'ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950'})  2025-03-22 23:16:30.636646 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36', 'data_vg': 'ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36'})  2025-03-22 23:16:30.637407 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:30.638170 | orchestrator | 2025-03-22 23:16:30.638698 | orchestrator | TASK [Create WAL VGs] ********************************************************** 2025-03-22 23:16:30.639316 | orchestrator | Saturday 22 March 2025 23:16:30 +0000 (0:00:00.178) 0:01:12.331 ******** 2025-03-22 23:16:30.808098 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:30.808814 | orchestrator | 2025-03-22 23:16:30.809701 | orchestrator | TASK [Print 'Create WAL VGs'] ************************************************** 2025-03-22 23:16:30.810340 | orchestrator | Saturday 22 March 2025 23:16:30 +0000 (0:00:00.172) 0:01:12.504 ******** 2025-03-22 23:16:30.989614 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950', 'data_vg': 'ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950'})  2025-03-22 23:16:30.989820 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36', 'data_vg': 'ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36'})  2025-03-22 23:16:30.991963 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:30.992614 | orchestrator | 2025-03-22 23:16:30.994116 | orchestrator | TASK [Create DB+WAL VGs] ******************************************************* 2025-03-22 23:16:30.994990 | orchestrator | Saturday 22 March 2025 23:16:30 +0000 (0:00:00.177) 0:01:12.681 ******** 2025-03-22 23:16:31.135224 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:31.135390 | orchestrator | 2025-03-22 23:16:31.135848 | orchestrator | TASK [Print 'Create DB+WAL VGs'] *********************************************** 2025-03-22 23:16:31.136602 | orchestrator | Saturday 22 March 2025 23:16:31 +0000 (0:00:00.150) 0:01:12.831 ******** 2025-03-22 23:16:31.336794 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950', 'data_vg': 'ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950'})  2025-03-22 23:16:31.337459 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36', 'data_vg': 'ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36'})  2025-03-22 23:16:31.337659 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:31.338352 | orchestrator | 2025-03-22 23:16:31.339623 | orchestrator | TASK [Prepare variables for OSD count check] *********************************** 2025-03-22 23:16:31.339902 | orchestrator | Saturday 22 March 2025 23:16:31 +0000 (0:00:00.202) 0:01:13.033 ******** 2025-03-22 23:16:31.486730 | orchestrator | ok: 
[testbed-node-5] 2025-03-22 23:16:31.487430 | orchestrator | 2025-03-22 23:16:31.488122 | orchestrator | TASK [Count OSDs put on ceph_db_devices defined in lvm_volumes] **************** 2025-03-22 23:16:31.489216 | orchestrator | Saturday 22 March 2025 23:16:31 +0000 (0:00:00.148) 0:01:13.182 ******** 2025-03-22 23:16:31.679451 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950', 'data_vg': 'ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950'})  2025-03-22 23:16:31.680009 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36', 'data_vg': 'ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36'})  2025-03-22 23:16:31.680126 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:31.680163 | orchestrator | 2025-03-22 23:16:31.680259 | orchestrator | TASK [Count OSDs put on ceph_wal_devices defined in lvm_volumes] *************** 2025-03-22 23:16:31.681473 | orchestrator | Saturday 22 March 2025 23:16:31 +0000 (0:00:00.193) 0:01:13.375 ******** 2025-03-22 23:16:31.843929 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950', 'data_vg': 'ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950'})  2025-03-22 23:16:31.844107 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36', 'data_vg': 'ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36'})  2025-03-22 23:16:31.845837 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:31.847742 | orchestrator | 2025-03-22 23:16:31.849624 | orchestrator | TASK [Count OSDs put on ceph_db_wal_devices defined in lvm_volumes] ************ 2025-03-22 23:16:32.243612 | orchestrator | Saturday 22 March 2025 23:16:31 +0000 (0:00:00.164) 0:01:13.540 ******** 2025-03-22 23:16:32.243724 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950', 'data_vg': 'ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950'})  2025-03-22 23:16:32.248117 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36', 'data_vg': 'ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36'})  2025-03-22 23:16:32.249293 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:32.249321 | orchestrator | 2025-03-22 23:16:32.249341 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB VG] ********************* 2025-03-22 23:16:32.250293 | orchestrator | Saturday 22 March 2025 23:16:32 +0000 (0:00:00.399) 0:01:13.939 ******** 2025-03-22 23:16:32.423945 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:32.424589 | orchestrator | 2025-03-22 23:16:32.426098 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a WAL VG] ******************** 2025-03-22 23:16:32.426929 | orchestrator | Saturday 22 March 2025 23:16:32 +0000 (0:00:00.180) 0:01:14.120 ******** 2025-03-22 23:16:32.577841 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:32.578602 | orchestrator | 2025-03-22 23:16:32.579413 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB+WAL VG] ***************** 2025-03-22 23:16:32.580729 | orchestrator | Saturday 22 March 2025 23:16:32 +0000 (0:00:00.153) 0:01:14.274 ******** 2025-03-22 23:16:32.765012 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:32.765960 | orchestrator | 2025-03-22 23:16:32.766720 | orchestrator | TASK [Print number of OSDs wanted per DB VG] *********************************** 2025-03-22 23:16:32.767504 | orchestrator | 
Saturday 22 March 2025 23:16:32 +0000 (0:00:00.185) 0:01:14.459 ******** 2025-03-22 23:16:32.918753 | orchestrator | ok: [testbed-node-5] => { 2025-03-22 23:16:32.919601 | orchestrator |  "_num_osds_wanted_per_db_vg": {} 2025-03-22 23:16:32.921763 | orchestrator | } 2025-03-22 23:16:32.922786 | orchestrator | 2025-03-22 23:16:32.924111 | orchestrator | TASK [Print number of OSDs wanted per WAL VG] ********************************** 2025-03-22 23:16:32.925445 | orchestrator | Saturday 22 March 2025 23:16:32 +0000 (0:00:00.152) 0:01:14.611 ******** 2025-03-22 23:16:33.073883 | orchestrator | ok: [testbed-node-5] => { 2025-03-22 23:16:33.074779 | orchestrator |  "_num_osds_wanted_per_wal_vg": {} 2025-03-22 23:16:33.076315 | orchestrator | } 2025-03-22 23:16:33.077254 | orchestrator | 2025-03-22 23:16:33.078310 | orchestrator | TASK [Print number of OSDs wanted per DB+WAL VG] ******************************* 2025-03-22 23:16:33.078914 | orchestrator | Saturday 22 March 2025 23:16:33 +0000 (0:00:00.158) 0:01:14.770 ******** 2025-03-22 23:16:33.223936 | orchestrator | ok: [testbed-node-5] => { 2025-03-22 23:16:33.225413 | orchestrator |  "_num_osds_wanted_per_db_wal_vg": {} 2025-03-22 23:16:33.227037 | orchestrator | } 2025-03-22 23:16:33.228401 | orchestrator | 2025-03-22 23:16:33.229038 | orchestrator | TASK [Gather DB VGs with total and available size in bytes] ******************** 2025-03-22 23:16:33.229592 | orchestrator | Saturday 22 March 2025 23:16:33 +0000 (0:00:00.149) 0:01:14.919 ******** 2025-03-22 23:16:33.830730 | orchestrator | ok: [testbed-node-5] 2025-03-22 23:16:33.832995 | orchestrator | 2025-03-22 23:16:33.834409 | orchestrator | TASK [Gather WAL VGs with total and available size in bytes] ******************* 2025-03-22 23:16:33.834450 | orchestrator | Saturday 22 March 2025 23:16:33 +0000 (0:00:00.605) 0:01:15.525 ******** 2025-03-22 23:16:34.386242 | orchestrator | ok: [testbed-node-5] 2025-03-22 23:16:34.386375 | orchestrator | 2025-03-22 23:16:34.386396 | orchestrator | TASK [Gather DB+WAL VGs with total and available size in bytes] **************** 2025-03-22 23:16:34.386415 | orchestrator | Saturday 22 March 2025 23:16:34 +0000 (0:00:00.557) 0:01:16.082 ******** 2025-03-22 23:16:34.907790 | orchestrator | ok: [testbed-node-5] 2025-03-22 23:16:34.908577 | orchestrator | 2025-03-22 23:16:34.909324 | orchestrator | TASK [Combine JSON from _db/wal/db_wal_vgs_cmd_output] ************************* 2025-03-22 23:16:34.910912 | orchestrator | Saturday 22 March 2025 23:16:34 +0000 (0:00:00.516) 0:01:16.599 ******** 2025-03-22 23:16:35.293184 | orchestrator | ok: [testbed-node-5] 2025-03-22 23:16:35.293649 | orchestrator | 2025-03-22 23:16:35.294660 | orchestrator | TASK [Calculate VG sizes (without buffer)] ************************************* 2025-03-22 23:16:35.295431 | orchestrator | Saturday 22 March 2025 23:16:35 +0000 (0:00:00.390) 0:01:16.989 ******** 2025-03-22 23:16:35.457906 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:35.458590 | orchestrator | 2025-03-22 23:16:35.459105 | orchestrator | TASK [Calculate VG sizes (with buffer)] **************************************** 2025-03-22 23:16:35.459589 | orchestrator | Saturday 22 March 2025 23:16:35 +0000 (0:00:00.164) 0:01:17.154 ******** 2025-03-22 23:16:35.579522 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:35.579898 | orchestrator | 2025-03-22 23:16:35.580599 | orchestrator | TASK [Print LVM VGs report data] *********************************************** 2025-03-22 23:16:35.581596 | 
orchestrator | Saturday 22 March 2025 23:16:35 +0000 (0:00:00.122) 0:01:17.276 ******** 2025-03-22 23:16:35.731898 | orchestrator | ok: [testbed-node-5] => { 2025-03-22 23:16:35.732060 | orchestrator |  "vgs_report": { 2025-03-22 23:16:35.733274 | orchestrator |  "vg": [] 2025-03-22 23:16:35.734013 | orchestrator |  } 2025-03-22 23:16:35.734777 | orchestrator | } 2025-03-22 23:16:35.735632 | orchestrator | 2025-03-22 23:16:35.736667 | orchestrator | TASK [Print LVM VG sizes] ****************************************************** 2025-03-22 23:16:35.737008 | orchestrator | Saturday 22 March 2025 23:16:35 +0000 (0:00:00.151) 0:01:17.428 ******** 2025-03-22 23:16:35.891716 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:35.891843 | orchestrator | 2025-03-22 23:16:35.893528 | orchestrator | TASK [Calculate size needed for LVs on ceph_db_devices] ************************ 2025-03-22 23:16:35.894567 | orchestrator | Saturday 22 March 2025 23:16:35 +0000 (0:00:00.159) 0:01:17.587 ******** 2025-03-22 23:16:36.056300 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:36.056645 | orchestrator | 2025-03-22 23:16:36.057894 | orchestrator | TASK [Print size needed for LVs on ceph_db_devices] **************************** 2025-03-22 23:16:36.058619 | orchestrator | Saturday 22 March 2025 23:16:36 +0000 (0:00:00.163) 0:01:17.751 ******** 2025-03-22 23:16:36.213174 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:36.214418 | orchestrator | 2025-03-22 23:16:36.215696 | orchestrator | TASK [Fail if size of DB LVs on ceph_db_devices > available] ******************* 2025-03-22 23:16:36.217322 | orchestrator | Saturday 22 March 2025 23:16:36 +0000 (0:00:00.157) 0:01:17.908 ******** 2025-03-22 23:16:36.372791 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:36.372931 | orchestrator | 2025-03-22 23:16:36.373592 | orchestrator | TASK [Calculate size needed for LVs on ceph_wal_devices] *********************** 2025-03-22 23:16:36.374239 | orchestrator | Saturday 22 March 2025 23:16:36 +0000 (0:00:00.161) 0:01:18.070 ******** 2025-03-22 23:16:36.525981 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:36.527233 | orchestrator | 2025-03-22 23:16:36.530861 | orchestrator | TASK [Print size needed for LVs on ceph_wal_devices] *************************** 2025-03-22 23:16:36.531443 | orchestrator | Saturday 22 March 2025 23:16:36 +0000 (0:00:00.152) 0:01:18.222 ******** 2025-03-22 23:16:36.666690 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:36.668068 | orchestrator | 2025-03-22 23:16:36.669218 | orchestrator | TASK [Fail if size of WAL LVs on ceph_wal_devices > available] ***************** 2025-03-22 23:16:36.670230 | orchestrator | Saturday 22 March 2025 23:16:36 +0000 (0:00:00.141) 0:01:18.364 ******** 2025-03-22 23:16:36.806432 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:36.808651 | orchestrator | 2025-03-22 23:16:36.809465 | orchestrator | TASK [Calculate size needed for WAL LVs on ceph_db_wal_devices] **************** 2025-03-22 23:16:36.810384 | orchestrator | Saturday 22 March 2025 23:16:36 +0000 (0:00:00.138) 0:01:18.502 ******** 2025-03-22 23:16:36.958759 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:36.958885 | orchestrator | 2025-03-22 23:16:36.959857 | orchestrator | TASK [Print size needed for WAL LVs on ceph_db_wal_devices] ******************** 2025-03-22 23:16:36.960537 | orchestrator | Saturday 22 March 2025 23:16:36 +0000 (0:00:00.151) 0:01:18.653 ******** 2025-03-22 23:16:37.437267 | orchestrator | 
skipping: [testbed-node-5] 2025-03-22 23:16:37.438176 | orchestrator | 2025-03-22 23:16:37.438392 | orchestrator | TASK [Calculate size needed for DB LVs on ceph_db_wal_devices] ***************** 2025-03-22 23:16:37.439542 | orchestrator | Saturday 22 March 2025 23:16:37 +0000 (0:00:00.480) 0:01:19.133 ******** 2025-03-22 23:16:37.605619 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:37.606543 | orchestrator | 2025-03-22 23:16:37.607418 | orchestrator | TASK [Print size needed for DB LVs on ceph_db_wal_devices] ********************* 2025-03-22 23:16:37.608267 | orchestrator | Saturday 22 March 2025 23:16:37 +0000 (0:00:00.168) 0:01:19.302 ******** 2025-03-22 23:16:37.757983 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:37.758929 | orchestrator | 2025-03-22 23:16:37.759205 | orchestrator | TASK [Fail if size of DB+WAL LVs on ceph_db_wal_devices > available] *********** 2025-03-22 23:16:37.759754 | orchestrator | Saturday 22 March 2025 23:16:37 +0000 (0:00:00.151) 0:01:19.453 ******** 2025-03-22 23:16:37.935270 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:37.936711 | orchestrator | 2025-03-22 23:16:37.938305 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_devices] ************************* 2025-03-22 23:16:37.939294 | orchestrator | Saturday 22 March 2025 23:16:37 +0000 (0:00:00.178) 0:01:19.632 ******** 2025-03-22 23:16:38.093627 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:38.094852 | orchestrator | 2025-03-22 23:16:38.095616 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_wal_devices] ********************* 2025-03-22 23:16:38.096750 | orchestrator | Saturday 22 March 2025 23:16:38 +0000 (0:00:00.157) 0:01:19.790 ******** 2025-03-22 23:16:38.239601 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:38.240590 | orchestrator | 2025-03-22 23:16:38.241670 | orchestrator | TASK [Create DB LVs for ceph_db_devices] *************************************** 2025-03-22 23:16:38.244035 | orchestrator | Saturday 22 March 2025 23:16:38 +0000 (0:00:00.145) 0:01:19.935 ******** 2025-03-22 23:16:38.441319 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950', 'data_vg': 'ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950'})  2025-03-22 23:16:38.442392 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36', 'data_vg': 'ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36'})  2025-03-22 23:16:38.443886 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:38.444913 | orchestrator | 2025-03-22 23:16:38.445872 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_devices'] ******************************* 2025-03-22 23:16:38.447778 | orchestrator | Saturday 22 March 2025 23:16:38 +0000 (0:00:00.201) 0:01:20.137 ******** 2025-03-22 23:16:38.615747 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950', 'data_vg': 'ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950'})  2025-03-22 23:16:38.617338 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36', 'data_vg': 'ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36'})  2025-03-22 23:16:38.618404 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:38.620686 | orchestrator | 2025-03-22 23:16:38.860464 | orchestrator | TASK [Create WAL LVs for ceph_wal_devices] ************************************* 2025-03-22 23:16:38.860589 | orchestrator | Saturday 22 March 2025 
23:16:38 +0000 (0:00:00.174) 0:01:20.312 ******** 2025-03-22 23:16:38.860618 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950', 'data_vg': 'ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950'})  2025-03-22 23:16:38.861617 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36', 'data_vg': 'ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36'})  2025-03-22 23:16:38.863884 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:38.864645 | orchestrator | 2025-03-22 23:16:38.865555 | orchestrator | TASK [Print 'Create WAL LVs for ceph_wal_devices'] ***************************** 2025-03-22 23:16:38.867366 | orchestrator | Saturday 22 March 2025 23:16:38 +0000 (0:00:00.242) 0:01:20.554 ******** 2025-03-22 23:16:39.071163 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950', 'data_vg': 'ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950'})  2025-03-22 23:16:39.071328 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36', 'data_vg': 'ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36'})  2025-03-22 23:16:39.074659 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:39.075558 | orchestrator | 2025-03-22 23:16:39.076085 | orchestrator | TASK [Create WAL LVs for ceph_db_wal_devices] ********************************** 2025-03-22 23:16:39.076638 | orchestrator | Saturday 22 March 2025 23:16:39 +0000 (0:00:00.204) 0:01:20.759 ******** 2025-03-22 23:16:39.255428 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950', 'data_vg': 'ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950'})  2025-03-22 23:16:39.256303 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36', 'data_vg': 'ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36'})  2025-03-22 23:16:39.257199 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:39.257228 | orchestrator | 2025-03-22 23:16:39.258308 | orchestrator | TASK [Print 'Create WAL LVs for ceph_db_wal_devices'] ************************** 2025-03-22 23:16:39.258824 | orchestrator | Saturday 22 March 2025 23:16:39 +0000 (0:00:00.192) 0:01:20.952 ******** 2025-03-22 23:16:39.472644 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950', 'data_vg': 'ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950'})  2025-03-22 23:16:39.473783 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36', 'data_vg': 'ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36'})  2025-03-22 23:16:39.473809 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:39.473828 | orchestrator | 2025-03-22 23:16:39.474385 | orchestrator | TASK [Create DB LVs for ceph_db_wal_devices] *********************************** 2025-03-22 23:16:39.474835 | orchestrator | Saturday 22 March 2025 23:16:39 +0000 (0:00:00.215) 0:01:21.167 ******** 2025-03-22 23:16:39.889594 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950', 'data_vg': 'ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950'})  2025-03-22 23:16:39.889736 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36', 'data_vg': 'ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36'})  2025-03-22 23:16:39.891949 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:39.891987 | 
orchestrator | 2025-03-22 23:16:39.892438 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_wal_devices'] *************************** 2025-03-22 23:16:39.893163 | orchestrator | Saturday 22 March 2025 23:16:39 +0000 (0:00:00.418) 0:01:21.585 ******** 2025-03-22 23:16:40.079934 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950', 'data_vg': 'ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950'})  2025-03-22 23:16:40.080905 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36', 'data_vg': 'ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36'})  2025-03-22 23:16:40.082134 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:40.085053 | orchestrator | 2025-03-22 23:16:40.649167 | orchestrator | TASK [Get list of Ceph LVs with associated VGs] ******************************** 2025-03-22 23:16:40.649274 | orchestrator | Saturday 22 March 2025 23:16:40 +0000 (0:00:00.190) 0:01:21.776 ******** 2025-03-22 23:16:40.649306 | orchestrator | ok: [testbed-node-5] 2025-03-22 23:16:40.649432 | orchestrator | 2025-03-22 23:16:40.650899 | orchestrator | TASK [Get list of Ceph PVs with associated VGs] ******************************** 2025-03-22 23:16:40.652044 | orchestrator | Saturday 22 March 2025 23:16:40 +0000 (0:00:00.568) 0:01:22.344 ******** 2025-03-22 23:16:41.191801 | orchestrator | ok: [testbed-node-5] 2025-03-22 23:16:41.192441 | orchestrator | 2025-03-22 23:16:41.192600 | orchestrator | TASK [Combine JSON from _lvs_cmd_output/_pvs_cmd_output] *********************** 2025-03-22 23:16:41.193274 | orchestrator | Saturday 22 March 2025 23:16:41 +0000 (0:00:00.542) 0:01:22.887 ******** 2025-03-22 23:16:41.365886 | orchestrator | ok: [testbed-node-5] 2025-03-22 23:16:41.366375 | orchestrator | 2025-03-22 23:16:41.366538 | orchestrator | TASK [Create list of VG/LV names] ********************************************** 2025-03-22 23:16:41.366576 | orchestrator | Saturday 22 March 2025 23:16:41 +0000 (0:00:00.173) 0:01:23.061 ******** 2025-03-22 23:16:41.566406 | orchestrator | ok: [testbed-node-5] => (item={'lv_name': 'osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950', 'vg_name': 'ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950'}) 2025-03-22 23:16:41.568132 | orchestrator | ok: [testbed-node-5] => (item={'lv_name': 'osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36', 'vg_name': 'ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36'}) 2025-03-22 23:16:41.570107 | orchestrator | 2025-03-22 23:16:41.570143 | orchestrator | TASK [Fail if block LV defined in lvm_volumes is missing] ********************** 2025-03-22 23:16:41.570600 | orchestrator | Saturday 22 March 2025 23:16:41 +0000 (0:00:00.200) 0:01:23.262 ******** 2025-03-22 23:16:41.762858 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950', 'data_vg': 'ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950'})  2025-03-22 23:16:41.763042 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36', 'data_vg': 'ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36'})  2025-03-22 23:16:41.763133 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:41.763644 | orchestrator | 2025-03-22 23:16:41.764001 | orchestrator | TASK [Fail if DB LV defined in lvm_volumes is missing] ************************* 2025-03-22 23:16:41.764672 | orchestrator | Saturday 22 March 2025 23:16:41 +0000 (0:00:00.198) 0:01:23.460 ******** 2025-03-22 23:16:41.968840 | orchestrator | skipping: [testbed-node-5] 
=> (item={'data': 'osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950', 'data_vg': 'ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950'})
2025-03-22 23:16:41.969760 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36', 'data_vg': 'ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36'})
2025-03-22 23:16:41.970123 | orchestrator | skipping: [testbed-node-5]
2025-03-22 23:16:41.971559 | orchestrator |
2025-03-22 23:16:41.972177 | orchestrator | TASK [Fail if WAL LV defined in lvm_volumes is missing] ************************
2025-03-22 23:16:41.972856 | orchestrator | Saturday 22 March 2025 23:16:41 +0000 (0:00:00.203) 0:01:23.664 ********
2025-03-22 23:16:42.170254 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950', 'data_vg': 'ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950'})
2025-03-22 23:16:42.171475 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36', 'data_vg': 'ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36'})
2025-03-22 23:16:42.172222 | orchestrator | skipping: [testbed-node-5]
2025-03-22 23:16:42.173177 | orchestrator |
2025-03-22 23:16:42.175998 | orchestrator | TASK [Print LVM report data] ***************************************************
2025-03-22 23:16:42.839611 | orchestrator | Saturday 22 March 2025 23:16:42 +0000 (0:00:00.201) 0:01:23.865 ********
2025-03-22 23:16:42.839733 | orchestrator | ok: [testbed-node-5] => {
2025-03-22 23:16:42.839814 | orchestrator |  "lvm_report": {
2025-03-22 23:16:42.840551 | orchestrator |  "lv": [
2025-03-22 23:16:42.840583 | orchestrator |  {
2025-03-22 23:16:42.841232 | orchestrator |  "lv_name": "osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950",
2025-03-22 23:16:42.841688 | orchestrator |  "vg_name": "ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950"
2025-03-22 23:16:42.843212 | orchestrator |  },
2025-03-22 23:16:42.843647 | orchestrator |  {
2025-03-22 23:16:42.843704 | orchestrator |  "lv_name": "osd-block-f9e87843-61b1-54ad-82c0-8d76e40ccf36",
2025-03-22 23:16:42.844132 | orchestrator |  "vg_name": "ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36"
2025-03-22 23:16:42.844880 | orchestrator |  }
2025-03-22 23:16:42.845284 | orchestrator |  ],
2025-03-22 23:16:42.845916 | orchestrator |  "pv": [
2025-03-22 23:16:42.846209 | orchestrator |  {
2025-03-22 23:16:42.846900 | orchestrator |  "pv_name": "/dev/sdb",
2025-03-22 23:16:42.847199 | orchestrator |  "vg_name": "ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950"
2025-03-22 23:16:42.847634 | orchestrator |  },
2025-03-22 23:16:42.848023 | orchestrator |  {
2025-03-22 23:16:42.848523 | orchestrator |  "pv_name": "/dev/sdc",
2025-03-22 23:16:42.849629 | orchestrator |  "vg_name": "ceph-f9e87843-61b1-54ad-82c0-8d76e40ccf36"
2025-03-22 23:16:42.850245 | orchestrator |  }
2025-03-22 23:16:42.850699 | orchestrator |  ]
2025-03-22 23:16:42.851675 | orchestrator |  }
2025-03-22 23:16:42.852875 | orchestrator | }
2025-03-22 23:16:42.853271 | orchestrator |
2025-03-22 23:16:42.853735 | orchestrator | PLAY RECAP *********************************************************************
2025-03-22 23:16:42.854143 | orchestrator | 2025-03-22 23:16:42 | INFO  | Play has been completed. There may now be a delay until all logs have been written.
2025-03-22 23:16:42.854478 | orchestrator | 2025-03-22 23:16:42 | INFO  | Please wait and do not abort execution.
2025-03-22 23:16:42.855037 | orchestrator | testbed-node-3 : ok=51  changed=2  unreachable=0 failed=0 skipped=62  rescued=0 ignored=0 2025-03-22 23:16:42.855731 | orchestrator | testbed-node-4 : ok=51  changed=2  unreachable=0 failed=0 skipped=62  rescued=0 ignored=0 2025-03-22 23:16:42.856080 | orchestrator | testbed-node-5 : ok=51  changed=2  unreachable=0 failed=0 skipped=62  rescued=0 ignored=0 2025-03-22 23:16:42.856615 | orchestrator | 2025-03-22 23:16:42.856832 | orchestrator | 2025-03-22 23:16:42.857288 | orchestrator | 2025-03-22 23:16:42.857973 | orchestrator | TASKS RECAP ******************************************************************** 2025-03-22 23:16:42.858335 | orchestrator | Saturday 22 March 2025 23:16:42 +0000 (0:00:00.668) 0:01:24.534 ******** 2025-03-22 23:16:42.858736 | orchestrator | =============================================================================== 2025-03-22 23:16:42.859183 | orchestrator | Create block VGs -------------------------------------------------------- 6.76s 2025-03-22 23:16:42.859657 | orchestrator | Create block LVs -------------------------------------------------------- 4.29s 2025-03-22 23:16:42.860652 | orchestrator | Print LVM report data --------------------------------------------------- 2.71s 2025-03-22 23:16:42.861329 | orchestrator | Get list of Ceph LVs with associated VGs -------------------------------- 1.91s 2025-03-22 23:16:42.861360 | orchestrator | Add known links to the list of available block devices ------------------ 1.90s 2025-03-22 23:16:42.861570 | orchestrator | Gather DB VGs with total and available size in bytes -------------------- 1.89s 2025-03-22 23:16:42.861978 | orchestrator | Gather WAL VGs with total and available size in bytes ------------------- 1.68s 2025-03-22 23:16:42.863257 | orchestrator | Add known partitions to the list of available block devices ------------- 1.65s 2025-03-22 23:16:42.864569 | orchestrator | Get list of Ceph PVs with associated VGs -------------------------------- 1.65s 2025-03-22 23:16:42.865746 | orchestrator | Gather DB+WAL VGs with total and available size in bytes ---------------- 1.61s 2025-03-22 23:16:42.866244 | orchestrator | Add known partitions to the list of available block devices ------------- 1.27s 2025-03-22 23:16:42.867175 | orchestrator | Add known partitions to the list of available block devices ------------- 0.99s 2025-03-22 23:16:42.867598 | orchestrator | Create dict of block VGs -> PVs from ceph_osd_devices ------------------- 0.94s 2025-03-22 23:16:42.868310 | orchestrator | Create list of VG/LV names ---------------------------------------------- 0.86s 2025-03-22 23:16:42.868575 | orchestrator | Create DB LVs for ceph_db_devices --------------------------------------- 0.85s 2025-03-22 23:16:42.869246 | orchestrator | Add known links to the list of available block devices ------------------ 0.83s 2025-03-22 23:16:42.870220 | orchestrator | Get extra vars for Ceph configuration ----------------------------------- 0.82s 2025-03-22 23:16:42.870311 | orchestrator | Get initial list of available block devices ----------------------------- 0.81s 2025-03-22 23:16:42.870785 | orchestrator | Print size needed for WAL LVs on ceph_db_wal_devices -------------------- 0.78s 2025-03-22 23:16:42.871005 | orchestrator | Add known partitions to the list of available block devices ------------- 0.78s 2025-03-22 23:16:45.166010 | orchestrator | 2025-03-22 23:16:45 | INFO  | Task a0a5a571-696f-4e9d-8a13-c33d4ea4fe0f (facts) was prepared for execution. 
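The Ceph LVM play above boils down to three steps: query LVs and PVs together with their VGs, merge the two JSON reports into one lvm_report, and fail early if an LV referenced in lvm_volumes does not exist. The following is a minimal stand-alone sketch of the same checks, assuming lvm2 with JSON report output and a hypothetical lvm_volumes list; it is an illustration, not the playbook's actual implementation.

import json
import subprocess

def lvm_report(cmd, key, fields):
    # e.g. lvs --reportformat json -o lv_name,vg_name (needs root on the OSD node)
    out = subprocess.run(
        [cmd, "--reportformat", "json", "-o", fields],
        check=True, capture_output=True, text=True,
    ).stdout
    return json.loads(out)["report"][0][key]

lvs = lvm_report("lvs", "lv", "lv_name,vg_name")   # [{'lv_name': ..., 'vg_name': ...}, ...]
pvs = lvm_report("pvs", "pv", "pv_name,vg_name")   # [{'pv_name': ..., 'vg_name': ...}, ...]

# "Create list of VG/LV names" from the discovered LVs.
vg_lv_pairs = {(lv["vg_name"], lv["lv_name"]) for lv in lvs}

# Hypothetical lvm_volumes entry, mirroring the items shown in the skipped tasks.
lvm_volumes = [
    {"data": "osd-block-b06d94ac-41ed-5dd0-ab65-3af10e523950",
     "data_vg": "ceph-b06d94ac-41ed-5dd0-ab65-3af10e523950"},
]

# "Fail if block LV defined in lvm_volumes is missing"
for volume in lvm_volumes:
    if (volume["data_vg"], volume["data"]) not in vg_lv_pairs:
        raise SystemExit(f"block LV {volume['data']} missing in VG {volume['data_vg']}")

print(json.dumps({"lv": lvs, "pv": pvs}, indent=2))  # roughly the printed lvm_report

The JSON layout ({"report": [{"lv": [...]}]}) matches current lvm2 releases; older versions may differ.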
2025-03-22 23:16:48.785625 | orchestrator | 2025-03-22 23:16:45 | INFO  | It takes a moment until task a0a5a571-696f-4e9d-8a13-c33d4ea4fe0f (facts) has been started and output is visible here. 2025-03-22 23:16:48.785781 | orchestrator | 2025-03-22 23:16:48.786355 | orchestrator | PLAY [Apply role facts] ******************************************************** 2025-03-22 23:16:48.787114 | orchestrator | 2025-03-22 23:16:48.788115 | orchestrator | TASK [osism.commons.facts : Create custom facts directory] ********************* 2025-03-22 23:16:48.791136 | orchestrator | Saturday 22 March 2025 23:16:48 +0000 (0:00:00.229) 0:00:00.229 ******** 2025-03-22 23:16:50.457515 | orchestrator | ok: [testbed-manager] 2025-03-22 23:16:50.458531 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:16:50.461432 | orchestrator | ok: [testbed-node-3] 2025-03-22 23:16:50.462411 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:16:50.463660 | orchestrator | ok: [testbed-node-4] 2025-03-22 23:16:50.464635 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:16:50.467705 | orchestrator | ok: [testbed-node-5] 2025-03-22 23:16:50.468382 | orchestrator | 2025-03-22 23:16:50.468755 | orchestrator | TASK [osism.commons.facts : Copy fact files] *********************************** 2025-03-22 23:16:50.469464 | orchestrator | Saturday 22 March 2025 23:16:50 +0000 (0:00:01.666) 0:00:01.895 ******** 2025-03-22 23:16:50.651173 | orchestrator | skipping: [testbed-manager] 2025-03-22 23:16:50.770998 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:16:50.871395 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:16:50.956047 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:16:51.047223 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:16:51.847724 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:51.847950 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:51.848349 | orchestrator | 2025-03-22 23:16:51.849982 | orchestrator | PLAY [Gather facts for all hosts] ********************************************** 2025-03-22 23:16:51.851379 | orchestrator | 2025-03-22 23:16:56.559175 | orchestrator | TASK [Gathers facts about hosts] *********************************************** 2025-03-22 23:16:56.559308 | orchestrator | Saturday 22 March 2025 23:16:51 +0000 (0:00:01.396) 0:00:03.291 ******** 2025-03-22 23:16:56.559344 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:16:56.564262 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:16:56.564380 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:16:56.564417 | orchestrator | ok: [testbed-manager] 2025-03-22 23:16:56.564437 | orchestrator | ok: [testbed-node-5] 2025-03-22 23:16:56.566116 | orchestrator | ok: [testbed-node-4] 2025-03-22 23:16:56.566596 | orchestrator | ok: [testbed-node-3] 2025-03-22 23:16:56.568262 | orchestrator | 2025-03-22 23:16:56.568959 | orchestrator | PLAY [Gather facts for all hosts if using --limit] ***************************** 2025-03-22 23:16:56.569792 | orchestrator | 2025-03-22 23:16:56.570409 | orchestrator | TASK [Gather facts for all hosts] ********************************************** 2025-03-22 23:16:56.571287 | orchestrator | Saturday 22 March 2025 23:16:56 +0000 (0:00:04.711) 0:00:08.003 ******** 2025-03-22 23:16:56.923659 | orchestrator | skipping: [testbed-manager] 2025-03-22 23:16:57.027626 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:16:57.111736 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:16:57.199123 | orchestrator | skipping: [testbed-node-2] 2025-03-22 
23:16:57.296104 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:16:57.339578 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:16:57.340116 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:16:57.340867 | orchestrator | 2025-03-22 23:16:57.341673 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 23:16:57.342105 | orchestrator | 2025-03-22 23:16:57 | INFO  | Play has been completed. There may now be a delay until all logs have been written. 2025-03-22 23:16:57.342652 | orchestrator | 2025-03-22 23:16:57 | INFO  | Please wait and do not abort execution. 2025-03-22 23:16:57.343067 | orchestrator | testbed-manager : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 23:16:57.343604 | orchestrator | testbed-node-0 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 23:16:57.344541 | orchestrator | testbed-node-1 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 23:16:57.345011 | orchestrator | testbed-node-2 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 23:16:57.345253 | orchestrator | testbed-node-3 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 23:16:57.345538 | orchestrator | testbed-node-4 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 23:16:57.345852 | orchestrator | testbed-node-5 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 23:16:57.346173 | orchestrator | 2025-03-22 23:16:57.346464 | orchestrator | Saturday 22 March 2025 23:16:57 +0000 (0:00:00.782) 0:00:08.785 ******** 2025-03-22 23:16:57.346798 | orchestrator | =============================================================================== 2025-03-22 23:16:57.346998 | orchestrator | Gathers facts about hosts ----------------------------------------------- 4.71s 2025-03-22 23:16:57.347376 | orchestrator | osism.commons.facts : Create custom facts directory --------------------- 1.67s 2025-03-22 23:16:57.347764 | orchestrator | osism.commons.facts : Copy fact files ----------------------------------- 1.40s 2025-03-22 23:16:57.348198 | orchestrator | Gather facts for all hosts ---------------------------------------------- 0.78s 2025-03-22 23:16:58.085062 | orchestrator | 2025-03-22 23:16:58.087445 | orchestrator | --> DEPLOY IN A NUTSHELL -- START -- Sat Mar 22 23:16:58 UTC 2025 2025-03-22 23:16:59.711105 | orchestrator | 2025-03-22 23:16:59.711230 | orchestrator | 2025-03-22 23:16:59 | INFO  | Collection nutshell is prepared for execution 2025-03-22 23:16:59.716659 | orchestrator | 2025-03-22 23:16:59 | INFO  | D [0] - dotfiles 2025-03-22 23:16:59.716698 | orchestrator | 2025-03-22 23:16:59 | INFO  | D [0] - homer 2025-03-22 23:16:59.718344 | orchestrator | 2025-03-22 23:16:59 | INFO  | D [0] - netdata 2025-03-22 23:16:59.718374 | orchestrator | 2025-03-22 23:16:59 | INFO  | D [0] - openstackclient 2025-03-22 23:16:59.718391 | orchestrator | 2025-03-22 23:16:59 | INFO  | D [0] - phpmyadmin 2025-03-22 23:16:59.718406 | orchestrator | 2025-03-22 23:16:59 | INFO  | A [0] - common 2025-03-22 23:16:59.718427 | orchestrator | 2025-03-22 23:16:59 | INFO  | A [1] -- loadbalancer 2025-03-22 23:16:59.718554 | orchestrator | 2025-03-22 23:16:59 | INFO  | D [2] --- opensearch 2025-03-22 23:16:59.719410 | orchestrator | 2025-03-22 23:16:59 | INFO  | A [2] --- mariadb-ng 2025-03-22 23:16:59.719438 | orchestrator | 2025-03-22 
23:16:59 | INFO  | D [3] ---- horizon 2025-03-22 23:16:59.719455 | orchestrator | 2025-03-22 23:16:59 | INFO  | A [3] ---- keystone 2025-03-22 23:16:59.719471 | orchestrator | 2025-03-22 23:16:59 | INFO  | A [4] ----- neutron 2025-03-22 23:16:59.719510 | orchestrator | 2025-03-22 23:16:59 | INFO  | D [5] ------ wait-for-nova 2025-03-22 23:16:59.719599 | orchestrator | 2025-03-22 23:16:59 | INFO  | A [5] ------ octavia 2025-03-22 23:16:59.719622 | orchestrator | 2025-03-22 23:16:59 | INFO  | D [4] ----- barbican 2025-03-22 23:16:59.719712 | orchestrator | 2025-03-22 23:16:59 | INFO  | D [4] ----- designate 2025-03-22 23:16:59.719730 | orchestrator | 2025-03-22 23:16:59 | INFO  | D [4] ----- ironic 2025-03-22 23:16:59.719773 | orchestrator | 2025-03-22 23:16:59 | INFO  | D [4] ----- placement 2025-03-22 23:16:59.719789 | orchestrator | 2025-03-22 23:16:59 | INFO  | D [4] ----- magnum 2025-03-22 23:16:59.719808 | orchestrator | 2025-03-22 23:16:59 | INFO  | A [1] -- openvswitch 2025-03-22 23:16:59.719894 | orchestrator | 2025-03-22 23:16:59 | INFO  | D [2] --- ovn 2025-03-22 23:16:59.719916 | orchestrator | 2025-03-22 23:16:59 | INFO  | D [1] -- memcached 2025-03-22 23:16:59.719931 | orchestrator | 2025-03-22 23:16:59 | INFO  | D [1] -- redis 2025-03-22 23:16:59.719951 | orchestrator | 2025-03-22 23:16:59 | INFO  | D [1] -- rabbitmq-ng 2025-03-22 23:16:59.720002 | orchestrator | 2025-03-22 23:16:59 | INFO  | A [0] - kubernetes 2025-03-22 23:16:59.720026 | orchestrator | 2025-03-22 23:16:59 | INFO  | D [1] -- kubeconfig 2025-03-22 23:16:59.720115 | orchestrator | 2025-03-22 23:16:59 | INFO  | A [1] -- copy-kubeconfig 2025-03-22 23:16:59.720376 | orchestrator | 2025-03-22 23:16:59 | INFO  | A [0] - ceph 2025-03-22 23:16:59.722743 | orchestrator | 2025-03-22 23:16:59 | INFO  | A [1] -- ceph-pools 2025-03-22 23:16:59.722818 | orchestrator | 2025-03-22 23:16:59 | INFO  | A [2] --- copy-ceph-keys 2025-03-22 23:16:59.722836 | orchestrator | 2025-03-22 23:16:59 | INFO  | A [3] ---- cephclient 2025-03-22 23:16:59.722851 | orchestrator | 2025-03-22 23:16:59 | INFO  | D [4] ----- ceph-bootstrap-dashboard 2025-03-22 23:16:59.722866 | orchestrator | 2025-03-22 23:16:59 | INFO  | A [4] ----- wait-for-keystone 2025-03-22 23:16:59.722880 | orchestrator | 2025-03-22 23:16:59 | INFO  | D [5] ------ kolla-ceph-rgw 2025-03-22 23:16:59.722899 | orchestrator | 2025-03-22 23:16:59 | INFO  | D [5] ------ glance 2025-03-22 23:16:59.723172 | orchestrator | 2025-03-22 23:16:59 | INFO  | D [5] ------ cinder 2025-03-22 23:16:59.723198 | orchestrator | 2025-03-22 23:16:59 | INFO  | D [5] ------ nova 2025-03-22 23:16:59.723218 | orchestrator | 2025-03-22 23:16:59 | INFO  | A [4] ----- prometheus 2025-03-22 23:16:59.893727 | orchestrator | 2025-03-22 23:16:59 | INFO  | D [5] ------ grafana 2025-03-22 23:16:59.893782 | orchestrator | 2025-03-22 23:16:59 | INFO  | All tasks of the collection nutshell are prepared for execution 2025-03-22 23:17:02.107431 | orchestrator | 2025-03-22 23:16:59 | INFO  | Tasks are running in the background 2025-03-22 23:17:02.107600 | orchestrator | 2025-03-22 23:17:02 | INFO  | No task IDs specified, wait for all currently running tasks 2025-03-22 23:17:04.233306 | orchestrator | 2025-03-22 23:17:04 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state STARTED 2025-03-22 23:17:04.233554 | orchestrator | 2025-03-22 23:17:04 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:17:04.237682 | orchestrator | 2025-03-22 23:17:04 | INFO  | Task 
d9715232-87ab-4ce9-8d53-726a51f445dc is in state STARTED 2025-03-22 23:17:04.238536 | orchestrator | 2025-03-22 23:17:04 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:17:04.242410 | orchestrator | 2025-03-22 23:17:04 | INFO  | Task 929ff10f-3373-4420-97d5-4fd4606c3dc5 is in state STARTED 2025-03-22 23:17:04.243114 | orchestrator | 2025-03-22 23:17:04 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:17:04.243248 | orchestrator | 2025-03-22 23:17:04 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:17:07.295767 | orchestrator | 2025-03-22 23:17:07 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state STARTED 2025-03-22 23:17:07.297198 | orchestrator | 2025-03-22 23:17:07 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:17:07.297231 | orchestrator | 2025-03-22 23:17:07 | INFO  | Task d9715232-87ab-4ce9-8d53-726a51f445dc is in state STARTED 2025-03-22 23:17:07.297245 | orchestrator | 2025-03-22 23:17:07 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:17:07.297266 | orchestrator | 2025-03-22 23:17:07 | INFO  | Task 929ff10f-3373-4420-97d5-4fd4606c3dc5 is in state STARTED 2025-03-22 23:17:07.297787 | orchestrator | 2025-03-22 23:17:07 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:17:07.300402 | orchestrator | 2025-03-22 23:17:07 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:17:10.381339 | orchestrator | 2025-03-22 23:17:10 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state STARTED 2025-03-22 23:17:10.385362 | orchestrator | 2025-03-22 23:17:10 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:17:10.386148 | orchestrator | 2025-03-22 23:17:10 | INFO  | Task d9715232-87ab-4ce9-8d53-726a51f445dc is in state STARTED 2025-03-22 23:17:10.388980 | orchestrator | 2025-03-22 23:17:10 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:17:10.389887 | orchestrator | 2025-03-22 23:17:10 | INFO  | Task 929ff10f-3373-4420-97d5-4fd4606c3dc5 is in state STARTED 2025-03-22 23:17:10.391158 | orchestrator | 2025-03-22 23:17:10 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:17:10.391362 | orchestrator | 2025-03-22 23:17:10 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:17:13.482650 | orchestrator | 2025-03-22 23:17:13 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state STARTED 2025-03-22 23:17:13.482840 | orchestrator | 2025-03-22 23:17:13 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:17:13.485738 | orchestrator | 2025-03-22 23:17:13 | INFO  | Task d9715232-87ab-4ce9-8d53-726a51f445dc is in state STARTED 2025-03-22 23:17:13.488129 | orchestrator | 2025-03-22 23:17:13 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:17:13.489096 | orchestrator | 2025-03-22 23:17:13 | INFO  | Task 929ff10f-3373-4420-97d5-4fd4606c3dc5 is in state STARTED 2025-03-22 23:17:13.489627 | orchestrator | 2025-03-22 23:17:13 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:17:13.489694 | orchestrator | 2025-03-22 23:17:13 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:17:16.548480 | orchestrator | 2025-03-22 23:17:16 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state STARTED 2025-03-22 23:17:16.549001 | orchestrator | 2025-03-22 
23:17:16 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:17:16.549920 | orchestrator | 2025-03-22 23:17:16 | INFO  | Task d9715232-87ab-4ce9-8d53-726a51f445dc is in state STARTED 2025-03-22 23:17:16.553293 | orchestrator | 2025-03-22 23:17:16 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:17:16.554083 | orchestrator | 2025-03-22 23:17:16 | INFO  | Task 929ff10f-3373-4420-97d5-4fd4606c3dc5 is in state STARTED 2025-03-22 23:17:16.554617 | orchestrator | 2025-03-22 23:17:16 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:17:16.555013 | orchestrator | 2025-03-22 23:17:16 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:17:19.721680 | orchestrator | 2025-03-22 23:17:19 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state STARTED 2025-03-22 23:17:19.729362 | orchestrator | 2025-03-22 23:17:19 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:17:19.743818 | orchestrator | 2025-03-22 23:17:19 | INFO  | Task d9715232-87ab-4ce9-8d53-726a51f445dc is in state STARTED 2025-03-22 23:17:19.745707 | orchestrator | 2025-03-22 23:17:19 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:17:19.749584 | orchestrator | 2025-03-22 23:17:19 | INFO  | Task 929ff10f-3373-4420-97d5-4fd4606c3dc5 is in state STARTED 2025-03-22 23:17:19.752259 | orchestrator | 2025-03-22 23:17:19 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:17:22.839895 | orchestrator | 2025-03-22 23:17:19 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:17:22.840029 | orchestrator | 2025-03-22 23:17:22 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state STARTED 2025-03-22 23:17:22.844289 | orchestrator | 2025-03-22 23:17:22 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:17:22.844324 | orchestrator | 2025-03-22 23:17:22 | INFO  | Task d9715232-87ab-4ce9-8d53-726a51f445dc is in state STARTED 2025-03-22 23:17:22.853558 | orchestrator | 2025-03-22 23:17:22 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:17:22.858792 | orchestrator | 2025-03-22 23:17:22 | INFO  | Task 929ff10f-3373-4420-97d5-4fd4606c3dc5 is in state STARTED 2025-03-22 23:17:22.860020 | orchestrator | 2025-03-22 23:17:22 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:17:25.928466 | orchestrator | 2025-03-22 23:17:22 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:17:25.928640 | orchestrator | 2025-03-22 23:17:25 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state STARTED 2025-03-22 23:17:25.929617 | orchestrator | 2025-03-22 23:17:25 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:17:25.933456 | orchestrator | 2025-03-22 23:17:25 | INFO  | Task d9715232-87ab-4ce9-8d53-726a51f445dc is in state STARTED 2025-03-22 23:17:25.936041 | orchestrator | 2025-03-22 23:17:25 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:17:25.937108 | orchestrator | 2025-03-22 23:17:25 | INFO  | Task 929ff10f-3373-4420-97d5-4fd4606c3dc5 is in state STARTED 2025-03-22 23:17:25.940754 | orchestrator | 2025-03-22 23:17:25 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:17:28.991877 | orchestrator | 2025-03-22 23:17:25 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:17:28.992004 | 
orchestrator | 2025-03-22 23:17:28 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state STARTED 2025-03-22 23:17:28.992153 | orchestrator | 2025-03-22 23:17:28 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:17:28.992174 | orchestrator | 2025-03-22 23:17:28 | INFO  | Task d9715232-87ab-4ce9-8d53-726a51f445dc is in state SUCCESS 2025-03-22 23:17:28.992195 | orchestrator | 2025-03-22 23:17:28.992256 | orchestrator | PLAY [Apply role geerlingguy.dotfiles] ***************************************** 2025-03-22 23:17:28.992273 | orchestrator | 2025-03-22 23:17:28.992288 | orchestrator | TASK [geerlingguy.dotfiles : Ensure dotfiles repository is cloned locally.] **** 2025-03-22 23:17:28.992302 | orchestrator | Saturday 22 March 2025 23:17:10 +0000 (0:00:00.412) 0:00:00.412 ******** 2025-03-22 23:17:28.992340 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:17:28.992356 | orchestrator | changed: [testbed-manager] 2025-03-22 23:17:28.992370 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:17:28.992384 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:17:28.992398 | orchestrator | changed: [testbed-node-3] 2025-03-22 23:17:28.992412 | orchestrator | changed: [testbed-node-4] 2025-03-22 23:17:28.992426 | orchestrator | changed: [testbed-node-5] 2025-03-22 23:17:28.992439 | orchestrator | 2025-03-22 23:17:28.992454 | orchestrator | TASK [geerlingguy.dotfiles : Ensure all configured dotfiles are links.] ******** 2025-03-22 23:17:28.992468 | orchestrator | Saturday 22 March 2025 23:17:14 +0000 (0:00:04.220) 0:00:04.632 ******** 2025-03-22 23:17:28.992482 | orchestrator | ok: [testbed-manager] => (item=.tmux.conf) 2025-03-22 23:17:28.992520 | orchestrator | ok: [testbed-node-0] => (item=.tmux.conf) 2025-03-22 23:17:28.992560 | orchestrator | ok: [testbed-node-1] => (item=.tmux.conf) 2025-03-22 23:17:28.992576 | orchestrator | ok: [testbed-node-2] => (item=.tmux.conf) 2025-03-22 23:17:28.992590 | orchestrator | ok: [testbed-node-3] => (item=.tmux.conf) 2025-03-22 23:17:28.992604 | orchestrator | ok: [testbed-node-4] => (item=.tmux.conf) 2025-03-22 23:17:28.992618 | orchestrator | ok: [testbed-node-5] => (item=.tmux.conf) 2025-03-22 23:17:28.992632 | orchestrator | 2025-03-22 23:17:28.992647 | orchestrator | TASK [geerlingguy.dotfiles : Remove existing dotfiles file if a replacement is being linked.] 
*** 2025-03-22 23:17:28.992661 | orchestrator | Saturday 22 March 2025 23:17:18 +0000 (0:00:03.851) 0:00:08.484 ******** 2025-03-22 23:17:28.992678 | orchestrator | ok: [testbed-node-0] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-03-22 23:17:15.933992', 'end': '2025-03-22 23:17:15.940272', 'delta': '0:00:00.006280', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2025-03-22 23:17:28.992702 | orchestrator | ok: [testbed-manager] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-03-22 23:17:15.847571', 'end': '2025-03-22 23:17:15.854561', 'delta': '0:00:00.006990', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2025-03-22 23:17:28.992718 | orchestrator | ok: [testbed-node-1] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-03-22 23:17:16.000659', 'end': '2025-03-22 23:17:16.007783', 'delta': '0:00:00.007124', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2025-03-22 23:17:28.992764 | orchestrator | ok: [testbed-node-2] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-03-22 23:17:16.490168', 'end': '2025-03-22 23:17:16.499434', 'delta': '0:00:00.009266', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 
2025-03-22 23:17:28.992782 | orchestrator | ok: [testbed-node-3] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-03-22 23:17:16.655499', 'end': '2025-03-22 23:17:16.665601', 'delta': '0:00:00.010102', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2025-03-22 23:17:28.992798 | orchestrator | ok: [testbed-node-4] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-03-22 23:17:17.184685', 'end': '2025-03-22 23:17:17.191664', 'delta': '0:00:00.006979', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2025-03-22 23:17:28.992825 | orchestrator | ok: [testbed-node-5] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-03-22 23:17:18.005664', 'end': '2025-03-22 23:17:18.015548', 'delta': '0:00:00.009884', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2025-03-22 23:17:28.992841 | orchestrator | 2025-03-22 23:17:28.992856 | orchestrator | TASK [geerlingguy.dotfiles : Link dotfiles into home folder.] 
****************** 2025-03-22 23:17:28.992872 | orchestrator | Saturday 22 March 2025 23:17:23 +0000 (0:00:05.123) 0:00:13.608 ******** 2025-03-22 23:17:28.992887 | orchestrator | changed: [testbed-manager] => (item=.tmux.conf) 2025-03-22 23:17:28.992901 | orchestrator | changed: [testbed-node-0] => (item=.tmux.conf) 2025-03-22 23:17:28.992915 | orchestrator | changed: [testbed-node-1] => (item=.tmux.conf) 2025-03-22 23:17:28.992928 | orchestrator | changed: [testbed-node-2] => (item=.tmux.conf) 2025-03-22 23:17:28.992942 | orchestrator | changed: [testbed-node-3] => (item=.tmux.conf) 2025-03-22 23:17:28.992962 | orchestrator | changed: [testbed-node-4] => (item=.tmux.conf) 2025-03-22 23:17:28.992976 | orchestrator | changed: [testbed-node-5] => (item=.tmux.conf) 2025-03-22 23:17:28.992990 | orchestrator | 2025-03-22 23:17:28.993004 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 23:17:28.993018 | orchestrator | testbed-manager : ok=4  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:17:28.993033 | orchestrator | testbed-node-0 : ok=4  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:17:28.993047 | orchestrator | testbed-node-1 : ok=4  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:17:28.993067 | orchestrator | testbed-node-2 : ok=4  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:17:28.994650 | orchestrator | testbed-node-3 : ok=4  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:17:28.994678 | orchestrator | testbed-node-4 : ok=4  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:17:28.994693 | orchestrator | testbed-node-5 : ok=4  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:17:28.994707 | orchestrator | 2025-03-22 23:17:28.994722 | orchestrator | Saturday 22 March 2025 23:17:27 +0000 (0:00:03.853) 0:00:17.461 ******** 2025-03-22 23:17:28.994736 | orchestrator | =============================================================================== 2025-03-22 23:17:28.994750 | orchestrator | geerlingguy.dotfiles : Remove existing dotfiles file if a replacement is being linked. --- 5.12s 2025-03-22 23:17:28.994765 | orchestrator | geerlingguy.dotfiles : Ensure dotfiles repository is cloned locally. ---- 4.22s 2025-03-22 23:17:28.994779 | orchestrator | geerlingguy.dotfiles : Link dotfiles into home folder. ------------------ 3.85s 2025-03-22 23:17:28.994793 | orchestrator | geerlingguy.dotfiles : Ensure all configured dotfiles are links. 
-------- 3.85s 2025-03-22 23:17:28.994812 | orchestrator | 2025-03-22 23:17:28 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:17:32.089952 | orchestrator | 2025-03-22 23:17:28 | INFO  | Task 929ff10f-3373-4420-97d5-4fd4606c3dc5 is in state STARTED 2025-03-22 23:17:32.090122 | orchestrator | 2025-03-22 23:17:28 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:17:32.090146 | orchestrator | 2025-03-22 23:17:28 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:17:32.090181 | orchestrator | 2025-03-22 23:17:32 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state STARTED 2025-03-22 23:17:32.090597 | orchestrator | 2025-03-22 23:17:32 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:17:32.090636 | orchestrator | 2025-03-22 23:17:32 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:17:32.094685 | orchestrator | 2025-03-22 23:17:32 | INFO  | Task 929ff10f-3373-4420-97d5-4fd4606c3dc5 is in state STARTED 2025-03-22 23:17:32.095574 | orchestrator | 2025-03-22 23:17:32 | INFO  | Task 6dfc7afe-1f4f-4147-9db7-b08f973ffa2f is in state STARTED 2025-03-22 23:17:32.102402 | orchestrator | 2025-03-22 23:17:32 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:17:35.253833 | orchestrator | 2025-03-22 23:17:32 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:17:35.253969 | orchestrator | 2025-03-22 23:17:35 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state STARTED 2025-03-22 23:17:35.260084 | orchestrator | 2025-03-22 23:17:35 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:17:35.260160 | orchestrator | 2025-03-22 23:17:35 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:17:35.264247 | orchestrator | 2025-03-22 23:17:35 | INFO  | Task 929ff10f-3373-4420-97d5-4fd4606c3dc5 is in state STARTED 2025-03-22 23:17:35.265740 | orchestrator | 2025-03-22 23:17:35 | INFO  | Task 6dfc7afe-1f4f-4147-9db7-b08f973ffa2f is in state STARTED 2025-03-22 23:17:35.267189 | orchestrator | 2025-03-22 23:17:35 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:17:38.384170 | orchestrator | 2025-03-22 23:17:35 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:17:38.384291 | orchestrator | 2025-03-22 23:17:38 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state STARTED 2025-03-22 23:17:38.386195 | orchestrator | 2025-03-22 23:17:38 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:17:38.393783 | orchestrator | 2025-03-22 23:17:38 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:17:38.405628 | orchestrator | 2025-03-22 23:17:38 | INFO  | Task 929ff10f-3373-4420-97d5-4fd4606c3dc5 is in state STARTED 2025-03-22 23:17:38.408871 | orchestrator | 2025-03-22 23:17:38 | INFO  | Task 6dfc7afe-1f4f-4147-9db7-b08f973ffa2f is in state STARTED 2025-03-22 23:17:38.408919 | orchestrator | 2025-03-22 23:17:38 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:17:41.506185 | orchestrator | 2025-03-22 23:17:38 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:17:41.506298 | orchestrator | 2025-03-22 23:17:41 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state STARTED 2025-03-22 23:17:41.516886 | orchestrator | 2025-03-22 23:17:41 | INFO  | Task 
dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:17:41.516916 | orchestrator | 2025-03-22 23:17:41 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:17:41.519134 | orchestrator | 2025-03-22 23:17:41 | INFO  | Task 929ff10f-3373-4420-97d5-4fd4606c3dc5 is in state STARTED 2025-03-22 23:17:41.524229 | orchestrator | 2025-03-22 23:17:41 | INFO  | Task 6dfc7afe-1f4f-4147-9db7-b08f973ffa2f is in state STARTED 2025-03-22 23:17:41.532936 | orchestrator | 2025-03-22 23:17:41 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:17:41.537287 | orchestrator | 2025-03-22 23:17:41 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:17:44.612881 | orchestrator | 2025-03-22 23:17:44 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state STARTED 2025-03-22 23:17:44.613341 | orchestrator | 2025-03-22 23:17:44 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:17:44.614551 | orchestrator | 2025-03-22 23:17:44 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:17:44.615688 | orchestrator | 2025-03-22 23:17:44 | INFO  | Task 929ff10f-3373-4420-97d5-4fd4606c3dc5 is in state STARTED 2025-03-22 23:17:44.619577 | orchestrator | 2025-03-22 23:17:44 | INFO  | Task 6dfc7afe-1f4f-4147-9db7-b08f973ffa2f is in state STARTED 2025-03-22 23:17:47.702145 | orchestrator | 2025-03-22 23:17:44 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:17:47.702249 | orchestrator | 2025-03-22 23:17:44 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:17:47.702283 | orchestrator | 2025-03-22 23:17:47 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state STARTED 2025-03-22 23:17:47.715488 | orchestrator | 2025-03-22 23:17:47 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:17:47.726189 | orchestrator | 2025-03-22 23:17:47 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:17:47.742564 | orchestrator | 2025-03-22 23:17:47 | INFO  | Task 929ff10f-3373-4420-97d5-4fd4606c3dc5 is in state STARTED 2025-03-22 23:17:47.745969 | orchestrator | 2025-03-22 23:17:47 | INFO  | Task 6dfc7afe-1f4f-4147-9db7-b08f973ffa2f is in state STARTED 2025-03-22 23:17:47.749819 | orchestrator | 2025-03-22 23:17:47 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:17:50.813766 | orchestrator | 2025-03-22 23:17:47 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:17:50.813893 | orchestrator | 2025-03-22 23:17:50 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state STARTED 2025-03-22 23:17:50.815078 | orchestrator | 2025-03-22 23:17:50 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:17:50.816544 | orchestrator | 2025-03-22 23:17:50 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:17:50.819607 | orchestrator | 2025-03-22 23:17:50 | INFO  | Task 929ff10f-3373-4420-97d5-4fd4606c3dc5 is in state STARTED 2025-03-22 23:17:50.820392 | orchestrator | 2025-03-22 23:17:50 | INFO  | Task 6dfc7afe-1f4f-4147-9db7-b08f973ffa2f is in state STARTED 2025-03-22 23:17:50.820427 | orchestrator | 2025-03-22 23:17:50 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:17:50.826149 | orchestrator | 2025-03-22 23:17:50 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:17:53.893968 | orchestrator | 2025-03-22 
23:17:53 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state STARTED 2025-03-22 23:17:53.895028 | orchestrator | 2025-03-22 23:17:53 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:17:53.895076 | orchestrator | 2025-03-22 23:17:53 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:17:53.896105 | orchestrator | 2025-03-22 23:17:53 | INFO  | Task 929ff10f-3373-4420-97d5-4fd4606c3dc5 is in state SUCCESS 2025-03-22 23:17:53.901728 | orchestrator | 2025-03-22 23:17:53 | INFO  | Task 6dfc7afe-1f4f-4147-9db7-b08f973ffa2f is in state STARTED 2025-03-22 23:17:56.977058 | orchestrator | 2025-03-22 23:17:53 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:17:56.977183 | orchestrator | 2025-03-22 23:17:53 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:17:56.977218 | orchestrator | 2025-03-22 23:17:56 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state STARTED 2025-03-22 23:17:56.980831 | orchestrator | 2025-03-22 23:17:56 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:17:56.986525 | orchestrator | 2025-03-22 23:17:56 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:17:56.989079 | orchestrator | 2025-03-22 23:17:56 | INFO  | Task 6dfc7afe-1f4f-4147-9db7-b08f973ffa2f is in state STARTED 2025-03-22 23:17:56.989115 | orchestrator | 2025-03-22 23:17:56 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:17:56.990653 | orchestrator | 2025-03-22 23:17:56 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:18:00.093820 | orchestrator | 2025-03-22 23:17:56 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:18:00.093941 | orchestrator | 2025-03-22 23:18:00 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state STARTED 2025-03-22 23:18:00.111273 | orchestrator | 2025-03-22 23:18:00 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:18:00.113166 | orchestrator | 2025-03-22 23:18:00 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:18:00.119744 | orchestrator | 2025-03-22 23:18:00 | INFO  | Task 6dfc7afe-1f4f-4147-9db7-b08f973ffa2f is in state STARTED 2025-03-22 23:18:00.125646 | orchestrator | 2025-03-22 23:18:00 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:18:00.127759 | orchestrator | 2025-03-22 23:18:00 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:18:03.234925 | orchestrator | 2025-03-22 23:18:00 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:18:03.235049 | orchestrator | 2025-03-22 23:18:03 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state STARTED 2025-03-22 23:18:03.235124 | orchestrator | 2025-03-22 23:18:03 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:18:03.236199 | orchestrator | 2025-03-22 23:18:03 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:18:03.237481 | orchestrator | 2025-03-22 23:18:03 | INFO  | Task 6dfc7afe-1f4f-4147-9db7-b08f973ffa2f is in state STARTED 2025-03-22 23:18:03.241725 | orchestrator | 2025-03-22 23:18:03 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:18:06.330906 | orchestrator | 2025-03-22 23:18:03 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 
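The repeated "Task … is in state STARTED" and "Wait 1 second(s) until the next check" lines are the deploy wrapper polling the OSISM task queue until every background task reaches a final state. Conceptually the loop looks like the sketch below; it is hypothetical, get_task_state stands in for whatever API the manager actually exposes, and the real client may batch or order the checks differently.

import time

def wait_for_tasks(task_ids, get_task_state, interval=1):
    """Poll until every task ID reports a final state."""
    pending = set(task_ids)
    while pending:
        for task_id in sorted(pending):
            state = get_task_state(task_id)   # e.g. "STARTED", "SUCCESS", "FAILURE"
            print(f"Task {task_id} is in state {state}")
            if state in ("SUCCESS", "FAILURE"):
                pending.discard(task_id)
        if pending:
            print(f"Wait {interval} second(s) until the next check")
            time.sleep(interval)

Tasks that finish (for example 929ff10f-… switching to SUCCESS above) simply drop out of the pending set, while the remaining ones keep being reported until the whole collection is done.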
2025-03-22 23:18:06.331029 | orchestrator | 2025-03-22 23:18:03 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:18:06.331067 | orchestrator | 2025-03-22 23:18:06 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state STARTED 2025-03-22 23:18:06.331154 | orchestrator | 2025-03-22 23:18:06 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:18:06.332648 | orchestrator | 2025-03-22 23:18:06 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:18:06.342922 | orchestrator | 2025-03-22 23:18:06 | INFO  | Task 6dfc7afe-1f4f-4147-9db7-b08f973ffa2f is in state STARTED 2025-03-22 23:18:06.354720 | orchestrator | 2025-03-22 23:18:06 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:18:06.354835 | orchestrator | 2025-03-22 23:18:06 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:18:09.493785 | orchestrator | 2025-03-22 23:18:06 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:18:09.493938 | orchestrator | 2025-03-22 23:18:09 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state STARTED 2025-03-22 23:18:09.498300 | orchestrator | 2025-03-22 23:18:09 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:18:09.498339 | orchestrator | 2025-03-22 23:18:09 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:18:09.498364 | orchestrator | 2025-03-22 23:18:09 | INFO  | Task 6dfc7afe-1f4f-4147-9db7-b08f973ffa2f is in state STARTED 2025-03-22 23:18:09.509003 | orchestrator | 2025-03-22 23:18:09 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:18:12.612056 | orchestrator | 2025-03-22 23:18:09 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:18:12.612157 | orchestrator | 2025-03-22 23:18:09 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:18:12.612189 | orchestrator | 2025-03-22 23:18:12 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state STARTED 2025-03-22 23:18:12.614550 | orchestrator | 2025-03-22 23:18:12 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:18:12.614608 | orchestrator | 2025-03-22 23:18:12 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:18:12.621106 | orchestrator | 2025-03-22 23:18:12 | INFO  | Task 6dfc7afe-1f4f-4147-9db7-b08f973ffa2f is in state STARTED 2025-03-22 23:18:12.633936 | orchestrator | 2025-03-22 23:18:12 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:18:12.636439 | orchestrator | 2025-03-22 23:18:12 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:18:15.742739 | orchestrator | 2025-03-22 23:18:12 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:18:15.742872 | orchestrator | 2025-03-22 23:18:15 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state STARTED 2025-03-22 23:18:15.743299 | orchestrator | 2025-03-22 23:18:15 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:18:15.746765 | orchestrator | 2025-03-22 23:18:15 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:18:15.747767 | orchestrator | 2025-03-22 23:18:15 | INFO  | Task 6dfc7afe-1f4f-4147-9db7-b08f973ffa2f is in state STARTED 2025-03-22 23:18:15.748683 | orchestrator | 2025-03-22 23:18:15 | INFO  | Task 
1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:18:15.751923 | orchestrator | 2025-03-22 23:18:15 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:18:18.874823 | orchestrator | 2025-03-22 23:18:15 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:18:18.874986 | orchestrator | 2025-03-22 23:18:18 | INFO  | Task f2b84df2-b72e-4224-a5a1-f5b1a81e3fab is in state SUCCESS 2025-03-22 23:18:18.877613 | orchestrator | 2025-03-22 23:18:18 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:18:18.878178 | orchestrator | 2025-03-22 23:18:18 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:18:18.888633 | orchestrator | 2025-03-22 23:18:18 | INFO  | Task 6dfc7afe-1f4f-4147-9db7-b08f973ffa2f is in state STARTED 2025-03-22 23:18:18.892925 | orchestrator | 2025-03-22 23:18:18 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:18:18.899972 | orchestrator | 2025-03-22 23:18:18 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:18:22.009813 | orchestrator | 2025-03-22 23:18:18 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:18:22.009958 | orchestrator | 2025-03-22 23:18:22 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:18:22.011436 | orchestrator | 2025-03-22 23:18:22 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:18:22.013036 | orchestrator | 2025-03-22 23:18:22 | INFO  | Task 6dfc7afe-1f4f-4147-9db7-b08f973ffa2f is in state STARTED 2025-03-22 23:18:22.015489 | orchestrator | 2025-03-22 23:18:22 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:18:22.016240 | orchestrator | 2025-03-22 23:18:22 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:18:25.104245 | orchestrator | 2025-03-22 23:18:22 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:18:25.104378 | orchestrator | 2025-03-22 23:18:25 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:18:25.106372 | orchestrator | 2025-03-22 23:18:25 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:18:25.107233 | orchestrator | 2025-03-22 23:18:25 | INFO  | Task 6dfc7afe-1f4f-4147-9db7-b08f973ffa2f is in state STARTED 2025-03-22 23:18:25.107868 | orchestrator | 2025-03-22 23:18:25 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:18:25.111425 | orchestrator | 2025-03-22 23:18:25 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:18:28.188774 | orchestrator | 2025-03-22 23:18:25 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:18:28.188905 | orchestrator | 2025-03-22 23:18:28 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:18:28.192063 | orchestrator | 2025-03-22 23:18:28 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:18:28.192115 | orchestrator | 2025-03-22 23:18:28 | INFO  | Task 6dfc7afe-1f4f-4147-9db7-b08f973ffa2f is in state STARTED 2025-03-22 23:18:28.204595 | orchestrator | 2025-03-22 23:18:28 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:18:28.207113 | orchestrator | 2025-03-22 23:18:28 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:18:31.272828 | orchestrator | 2025-03-22 
23:18:28 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:18:31.272920 | orchestrator | 2025-03-22 23:18:31 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:18:31.273903 | orchestrator | 2025-03-22 23:18:31 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:18:31.274847 | orchestrator | 2025-03-22 23:18:31 | INFO  | Task 6dfc7afe-1f4f-4147-9db7-b08f973ffa2f is in state STARTED 2025-03-22 23:18:31.275731 | orchestrator | 2025-03-22 23:18:31 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:18:31.280071 | orchestrator | 2025-03-22 23:18:31 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:18:34.364885 | orchestrator | 2025-03-22 23:18:31 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:18:34.365029 | orchestrator | 2025-03-22 23:18:34 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:18:34.366278 | orchestrator | 2025-03-22 23:18:34 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:18:34.366312 | orchestrator | 2025-03-22 23:18:34 | INFO  | Task 6dfc7afe-1f4f-4147-9db7-b08f973ffa2f is in state STARTED 2025-03-22 23:18:34.366350 | orchestrator | 2025-03-22 23:18:34 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:18:34.369420 | orchestrator | 2025-03-22 23:18:34 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:18:34.370346 | orchestrator | 2025-03-22 23:18:34 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:18:37.454694 | orchestrator | 2025-03-22 23:18:37 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:18:37.458989 | orchestrator | 2025-03-22 23:18:37 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:18:37.461123 | orchestrator | 2025-03-22 23:18:37 | INFO  | Task 6dfc7afe-1f4f-4147-9db7-b08f973ffa2f is in state STARTED 2025-03-22 23:18:37.473468 | orchestrator | 2025-03-22 23:18:37 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:18:40.544243 | orchestrator | 2025-03-22 23:18:37 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:18:40.544346 | orchestrator | 2025-03-22 23:18:37 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:18:40.544365 | orchestrator | 2025-03-22 23:18:40 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:18:40.548829 | orchestrator | 2025-03-22 23:18:40.548842 | orchestrator | 2025-03-22 23:18:40.548848 | orchestrator | PLAY [Apply role homer] ******************************************************** 2025-03-22 23:18:40.548853 | orchestrator | 2025-03-22 23:18:40.548858 | orchestrator | TASK [osism.services.homer : Inform about new parameter homer_url_opensearch_dashboards] *** 2025-03-22 23:18:40.548864 | orchestrator | Saturday 22 March 2025 23:17:09 +0000 (0:00:00.873) 0:00:00.873 ******** 2025-03-22 23:18:40.548869 | orchestrator | ok: [testbed-manager] => { 2025-03-22 23:18:40.548876 | orchestrator |  "msg": "The support for the homer_url_kibana has been removed. Please use the homer_url_opensearch_dashboards parameter." 
2025-03-22 23:18:40.548882 | orchestrator | } 2025-03-22 23:18:40.548887 | orchestrator | 2025-03-22 23:18:40.548892 | orchestrator | TASK [osism.services.homer : Create traefik external network] ****************** 2025-03-22 23:18:40.548897 | orchestrator | Saturday 22 March 2025 23:17:10 +0000 (0:00:00.346) 0:00:01.219 ******** 2025-03-22 23:18:40.548902 | orchestrator | ok: [testbed-manager] 2025-03-22 23:18:40.548908 | orchestrator | 2025-03-22 23:18:40.548913 | orchestrator | TASK [osism.services.homer : Create required directories] ********************** 2025-03-22 23:18:40.548919 | orchestrator | Saturday 22 March 2025 23:17:12 +0000 (0:00:01.997) 0:00:03.217 ******** 2025-03-22 23:18:40.548924 | orchestrator | changed: [testbed-manager] => (item=/opt/homer/configuration) 2025-03-22 23:18:40.548929 | orchestrator | ok: [testbed-manager] => (item=/opt/homer) 2025-03-22 23:18:40.548934 | orchestrator | 2025-03-22 23:18:40.548939 | orchestrator | TASK [osism.services.homer : Copy config.yml configuration file] *************** 2025-03-22 23:18:40.548944 | orchestrator | Saturday 22 March 2025 23:17:13 +0000 (0:00:01.601) 0:00:04.818 ******** 2025-03-22 23:18:40.548949 | orchestrator | changed: [testbed-manager] 2025-03-22 23:18:40.548954 | orchestrator | 2025-03-22 23:18:40.548959 | orchestrator | TASK [osism.services.homer : Copy docker-compose.yml file] ********************* 2025-03-22 23:18:40.548964 | orchestrator | Saturday 22 March 2025 23:17:18 +0000 (0:00:04.358) 0:00:09.177 ******** 2025-03-22 23:18:40.548969 | orchestrator | changed: [testbed-manager] 2025-03-22 23:18:40.548974 | orchestrator | 2025-03-22 23:18:40.548979 | orchestrator | TASK [osism.services.homer : Manage homer service] ***************************** 2025-03-22 23:18:40.548984 | orchestrator | Saturday 22 March 2025 23:17:20 +0000 (0:00:02.645) 0:00:11.822 ******** 2025-03-22 23:18:40.548988 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage homer service (10 retries left). 
2025-03-22 23:18:40.548993 | orchestrator | ok: [testbed-manager] 2025-03-22 23:18:40.548998 | orchestrator | 2025-03-22 23:18:40.549003 | orchestrator | RUNNING HANDLER [osism.services.homer : Restart homer service] ***************** 2025-03-22 23:18:40.549008 | orchestrator | Saturday 22 March 2025 23:17:49 +0000 (0:00:28.936) 0:00:40.759 ******** 2025-03-22 23:18:40.549013 | orchestrator | changed: [testbed-manager] 2025-03-22 23:18:40.549019 | orchestrator | 2025-03-22 23:18:40.549024 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 23:18:40.549029 | orchestrator | testbed-manager : ok=7  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:18:40.549036 | orchestrator | 2025-03-22 23:18:40.549041 | orchestrator | Saturday 22 March 2025 23:17:52 +0000 (0:00:02.889) 0:00:43.649 ******** 2025-03-22 23:18:40.549046 | orchestrator | =============================================================================== 2025-03-22 23:18:40.549050 | orchestrator | osism.services.homer : Manage homer service ---------------------------- 28.94s 2025-03-22 23:18:40.549055 | orchestrator | osism.services.homer : Copy config.yml configuration file --------------- 4.36s 2025-03-22 23:18:40.549060 | orchestrator | osism.services.homer : Restart homer service ---------------------------- 2.89s 2025-03-22 23:18:40.549065 | orchestrator | osism.services.homer : Copy docker-compose.yml file --------------------- 2.65s 2025-03-22 23:18:40.549070 | orchestrator | osism.services.homer : Create traefik external network ------------------ 2.00s 2025-03-22 23:18:40.549079 | orchestrator | osism.services.homer : Create required directories ---------------------- 1.60s 2025-03-22 23:18:40.549084 | orchestrator | osism.services.homer : Inform about new parameter homer_url_opensearch_dashboards --- 0.35s 2025-03-22 23:18:40.549096 | orchestrator | 2025-03-22 23:18:40.549101 | orchestrator | 2025-03-22 23:18:40.549106 | orchestrator | PLAY [Apply role openstackclient] ********************************************** 2025-03-22 23:18:40.549110 | orchestrator | 2025-03-22 23:18:40.549115 | orchestrator | TASK [osism.services.openstackclient : Include tasks] ************************** 2025-03-22 23:18:40.549120 | orchestrator | Saturday 22 March 2025 23:17:10 +0000 (0:00:00.462) 0:00:00.462 ******** 2025-03-22 23:18:40.549125 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/openstackclient/tasks/container-Debian-family.yml for testbed-manager 2025-03-22 23:18:40.549131 | orchestrator | 2025-03-22 23:18:40.549135 | orchestrator | TASK [osism.services.openstackclient : Create required directories] ************ 2025-03-22 23:18:40.549140 | orchestrator | Saturday 22 March 2025 23:17:11 +0000 (0:00:00.534) 0:00:00.997 ******** 2025-03-22 23:18:40.549145 | orchestrator | changed: [testbed-manager] => (item=/opt/configuration/environments/openstack) 2025-03-22 23:18:40.549150 | orchestrator | changed: [testbed-manager] => (item=/opt/openstackclient/data) 2025-03-22 23:18:40.549155 | orchestrator | ok: [testbed-manager] => (item=/opt/openstackclient) 2025-03-22 23:18:40.549160 | orchestrator | 2025-03-22 23:18:40.549164 | orchestrator | TASK [osism.services.openstackclient : Copy docker-compose.yml file] *********** 2025-03-22 23:18:40.549169 | orchestrator | Saturday 22 March 2025 23:17:13 +0000 (0:00:02.456) 0:00:03.453 ******** 2025-03-22 23:18:40.549174 | orchestrator | changed: [testbed-manager] 
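The homer play above and the openstackclient play that continues below both manage their services as Docker Compose projects under /opt/<service> and show the FAILED - RETRYING pattern while waiting for the containers to come up. A rough sketch of that retry-until-running behaviour, assuming Docker Compose v2 and using /opt/homer as the example project (the real roles' checks and retry settings may differ):

import subprocess
import time

def container_states(project_dir):
    # docker compose ps -q lists the container IDs belonging to the project.
    ids = subprocess.run(
        ["docker", "compose", "ps", "-q"],
        cwd=project_dir, capture_output=True, text=True, check=True,
    ).stdout.split()
    # docker inspect reports the runtime state of each container.
    return [
        subprocess.run(
            ["docker", "inspect", "--format", "{{.State.Status}}", cid],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        for cid in ids
    ]

def manage_service(project_dir, retries=10, delay=5):
    subprocess.run(["docker", "compose", "up", "-d"], cwd=project_dir, check=True)
    for attempt in range(retries):
        states = container_states(project_dir)
        if states and all(state == "running" for state in states):
            return
        print(f"FAILED - RETRYING: service not up yet ({retries - attempt - 1} retries left).")
        time.sleep(delay)
    raise RuntimeError(f"compose project in {project_dir} did not come up")

manage_service("/opt/homer")

In the log, the roles additionally run restart handlers and health checks once the initial start succeeds, which is what the handler output around these plays shows.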
2025-03-22 23:18:40.549179 | orchestrator | 2025-03-22 23:18:40.549184 | orchestrator | TASK [osism.services.openstackclient : Manage openstackclient service] ********* 2025-03-22 23:18:40.549189 | orchestrator | Saturday 22 March 2025 23:17:16 +0000 (0:00:02.659) 0:00:06.113 ******** 2025-03-22 23:18:40.549194 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage openstackclient service (10 retries left). 2025-03-22 23:18:40.549199 | orchestrator | ok: [testbed-manager] 2025-03-22 23:18:40.549204 | orchestrator | 2025-03-22 23:18:40.549213 | orchestrator | TASK [osism.services.openstackclient : Copy openstack wrapper script] ********** 2025-03-22 23:18:40.549225 | orchestrator | Saturday 22 March 2025 23:17:58 +0000 (0:00:42.455) 0:00:48.568 ******** 2025-03-22 23:18:40.549231 | orchestrator | changed: [testbed-manager] 2025-03-22 23:18:40.549235 | orchestrator | 2025-03-22 23:18:40.549240 | orchestrator | TASK [osism.services.openstackclient : Remove ospurge wrapper script] ********** 2025-03-22 23:18:40.549245 | orchestrator | Saturday 22 March 2025 23:18:00 +0000 (0:00:02.218) 0:00:50.787 ******** 2025-03-22 23:18:40.549250 | orchestrator | ok: [testbed-manager] 2025-03-22 23:18:40.549255 | orchestrator | 2025-03-22 23:18:40.549260 | orchestrator | RUNNING HANDLER [osism.services.openstackclient : Restart openstackclient service] *** 2025-03-22 23:18:40.549265 | orchestrator | Saturday 22 March 2025 23:18:02 +0000 (0:00:01.381) 0:00:52.169 ******** 2025-03-22 23:18:40.549270 | orchestrator | changed: [testbed-manager] 2025-03-22 23:18:40.549274 | orchestrator | 2025-03-22 23:18:40.549279 | orchestrator | RUNNING HANDLER [osism.services.openstackclient : Ensure that all containers are up] *** 2025-03-22 23:18:40.549284 | orchestrator | Saturday 22 March 2025 23:18:05 +0000 (0:00:03.650) 0:00:55.819 ******** 2025-03-22 23:18:40.549289 | orchestrator | changed: [testbed-manager] 2025-03-22 23:18:40.549294 | orchestrator | 2025-03-22 23:18:40.549299 | orchestrator | RUNNING HANDLER [osism.services.openstackclient : Wait for an healthy service] *** 2025-03-22 23:18:40.549304 | orchestrator | Saturday 22 March 2025 23:18:08 +0000 (0:00:02.680) 0:00:58.500 ******** 2025-03-22 23:18:40.549308 | orchestrator | changed: [testbed-manager] 2025-03-22 23:18:40.549313 | orchestrator | 2025-03-22 23:18:40.549318 | orchestrator | RUNNING HANDLER [osism.services.openstackclient : Copy bash completion script] *** 2025-03-22 23:18:40.549323 | orchestrator | Saturday 22 March 2025 23:18:10 +0000 (0:00:02.111) 0:01:00.612 ******** 2025-03-22 23:18:40.549328 | orchestrator | ok: [testbed-manager] 2025-03-22 23:18:40.549332 | orchestrator | 2025-03-22 23:18:40.549337 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 23:18:40.549342 | orchestrator | testbed-manager : ok=10  changed=6  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:18:40.549350 | orchestrator | 2025-03-22 23:18:40.549355 | orchestrator | Saturday 22 March 2025 23:18:12 +0000 (0:00:01.274) 0:01:01.886 ******** 2025-03-22 23:18:40.549360 | orchestrator | =============================================================================== 2025-03-22 23:18:40.549365 | orchestrator | osism.services.openstackclient : Manage openstackclient service -------- 42.46s 2025-03-22 23:18:40.549369 | orchestrator | osism.services.openstackclient : Restart openstackclient service -------- 3.65s 2025-03-22 23:18:40.549374 | orchestrator | osism.services.openstackclient : Ensure 
that all containers are up ------ 2.68s 2025-03-22 23:18:40.549379 | orchestrator | osism.services.openstackclient : Copy docker-compose.yml file ----------- 2.66s 2025-03-22 23:18:40.549384 | orchestrator | osism.services.openstackclient : Create required directories ------------ 2.45s 2025-03-22 23:18:40.549391 | orchestrator | osism.services.openstackclient : Copy openstack wrapper script ---------- 2.22s 2025-03-22 23:18:40.549396 | orchestrator | osism.services.openstackclient : Wait for an healthy service ------------ 2.11s 2025-03-22 23:18:40.549401 | orchestrator | osism.services.openstackclient : Remove ospurge wrapper script ---------- 1.38s 2025-03-22 23:18:40.549405 | orchestrator | osism.services.openstackclient : Copy bash completion script ------------ 1.27s 2025-03-22 23:18:40.549410 | orchestrator | osism.services.openstackclient : Include tasks -------------------------- 0.54s 2025-03-22 23:18:40.549415 | orchestrator | 2025-03-22 23:18:40.549421 | orchestrator | 2025-03-22 23:18:40 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:18:40.550113 | orchestrator | 2025-03-22 23:18:40 | INFO  | Task 6dfc7afe-1f4f-4147-9db7-b08f973ffa2f is in state SUCCESS 2025-03-22 23:18:40.550124 | orchestrator | 2025-03-22 23:18:40 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:18:40.555736 | orchestrator | 2025-03-22 23:18:40 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:18:43.626791 | orchestrator | 2025-03-22 23:18:40 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:18:43.626918 | orchestrator | 2025-03-22 23:18:43 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:18:43.631284 | orchestrator | 2025-03-22 23:18:43 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:18:43.639580 | orchestrator | 2025-03-22 23:18:43 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:18:43.641109 | orchestrator | 2025-03-22 23:18:43 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:18:43.641260 | orchestrator | 2025-03-22 23:18:43 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:18:46.692712 | orchestrator | 2025-03-22 23:18:46 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:18:46.696404 | orchestrator | 2025-03-22 23:18:46 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:18:46.702185 | orchestrator | 2025-03-22 23:18:46 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:18:46.706810 | orchestrator | 2025-03-22 23:18:46 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state STARTED 2025-03-22 23:18:49.788242 | orchestrator | 2025-03-22 23:18:46 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:18:49.788384 | orchestrator | 2025-03-22 23:18:49 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:18:49.789659 | orchestrator | 2025-03-22 23:18:49 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:18:49.792530 | orchestrator | 2025-03-22 23:18:49 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:18:49.793729 | orchestrator | 2025-03-22 23:18:49.793765 | orchestrator | PLAY [Apply role phpmyadmin] *************************************************** 2025-03-22 23:18:49.793781 | orchestrator | 2025-03-22 
23:18:49.793795 | orchestrator | TASK [osism.services.phpmyadmin : Create traefik external network] ************* 2025-03-22 23:18:49.793810 | orchestrator | Saturday 22 March 2025 23:17:36 +0000 (0:00:00.779) 0:00:00.779 ******** 2025-03-22 23:18:49.793823 | orchestrator | ok: [testbed-manager] 2025-03-22 23:18:49.793839 | orchestrator | 2025-03-22 23:18:49.793854 | orchestrator | TASK [osism.services.phpmyadmin : Create required directories] ***************** 2025-03-22 23:18:49.793868 | orchestrator | Saturday 22 March 2025 23:17:38 +0000 (0:00:01.259) 0:00:02.039 ******** 2025-03-22 23:18:49.793883 | orchestrator | changed: [testbed-manager] => (item=/opt/phpmyadmin) 2025-03-22 23:18:49.793897 | orchestrator | 2025-03-22 23:18:49.793911 | orchestrator | TASK [osism.services.phpmyadmin : Copy docker-compose.yml file] **************** 2025-03-22 23:18:49.793925 | orchestrator | Saturday 22 March 2025 23:17:39 +0000 (0:00:01.042) 0:00:03.081 ******** 2025-03-22 23:18:49.793938 | orchestrator | changed: [testbed-manager] 2025-03-22 23:18:49.793953 | orchestrator | 2025-03-22 23:18:49.793967 | orchestrator | TASK [osism.services.phpmyadmin : Manage phpmyadmin service] ******************* 2025-03-22 23:18:49.793980 | orchestrator | Saturday 22 March 2025 23:17:42 +0000 (0:00:03.309) 0:00:06.391 ******** 2025-03-22 23:18:49.793994 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage phpmyadmin service (10 retries left). 2025-03-22 23:18:49.794009 | orchestrator | ok: [testbed-manager] 2025-03-22 23:18:49.794071 | orchestrator | 2025-03-22 23:18:49.794087 | orchestrator | RUNNING HANDLER [osism.services.phpmyadmin : Restart phpmyadmin service] ******* 2025-03-22 23:18:49.794101 | orchestrator | Saturday 22 March 2025 23:18:34 +0000 (0:00:52.474) 0:00:58.866 ******** 2025-03-22 23:18:49.794115 | orchestrator | changed: [testbed-manager] 2025-03-22 23:18:49.794130 | orchestrator | 2025-03-22 23:18:49.794144 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 23:18:49.794158 | orchestrator | testbed-manager : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:18:49.794174 | orchestrator | 2025-03-22 23:18:49.794188 | orchestrator | Saturday 22 March 2025 23:18:39 +0000 (0:00:04.256) 0:01:03.122 ******** 2025-03-22 23:18:49.794202 | orchestrator | =============================================================================== 2025-03-22 23:18:49.794216 | orchestrator | osism.services.phpmyadmin : Manage phpmyadmin service ------------------ 52.47s 2025-03-22 23:18:49.794230 | orchestrator | osism.services.phpmyadmin : Restart phpmyadmin service ------------------ 4.26s 2025-03-22 23:18:49.794244 | orchestrator | osism.services.phpmyadmin : Copy docker-compose.yml file ---------------- 3.31s 2025-03-22 23:18:49.794259 | orchestrator | osism.services.phpmyadmin : Create traefik external network ------------- 1.26s 2025-03-22 23:18:49.794273 | orchestrator | osism.services.phpmyadmin : Create required directories ----------------- 1.04s 2025-03-22 23:18:49.794287 | orchestrator | 2025-03-22 23:18:49.794301 | orchestrator | 2025-03-22 23:18:49.794315 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-03-22 23:18:49.794330 | orchestrator | 2025-03-22 23:18:49.794347 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-03-22 23:18:49.794362 | orchestrator | Saturday 22 March 2025 23:17:09 +0000 
(0:00:00.479) 0:00:00.479 ******** 2025-03-22 23:18:49.794377 | orchestrator | changed: [testbed-manager] => (item=enable_netdata_True) 2025-03-22 23:18:49.794392 | orchestrator | changed: [testbed-node-0] => (item=enable_netdata_True) 2025-03-22 23:18:49.794408 | orchestrator | changed: [testbed-node-1] => (item=enable_netdata_True) 2025-03-22 23:18:49.794424 | orchestrator | changed: [testbed-node-2] => (item=enable_netdata_True) 2025-03-22 23:18:49.794439 | orchestrator | changed: [testbed-node-3] => (item=enable_netdata_True) 2025-03-22 23:18:49.794454 | orchestrator | changed: [testbed-node-4] => (item=enable_netdata_True) 2025-03-22 23:18:49.794469 | orchestrator | changed: [testbed-node-5] => (item=enable_netdata_True) 2025-03-22 23:18:49.794531 | orchestrator | 2025-03-22 23:18:49.794549 | orchestrator | PLAY [Apply role netdata] ****************************************************** 2025-03-22 23:18:49.794564 | orchestrator | 2025-03-22 23:18:49.794579 | orchestrator | TASK [osism.services.netdata : Include distribution specific install tasks] **** 2025-03-22 23:18:49.794595 | orchestrator | Saturday 22 March 2025 23:17:12 +0000 (0:00:02.445) 0:00:02.924 ******** 2025-03-22 23:18:49.794628 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-03-22 23:18:49.794647 | orchestrator | 2025-03-22 23:18:49.794669 | orchestrator | TASK [osism.services.netdata : Remove old architecture-dependent repository] *** 2025-03-22 23:18:49.794686 | orchestrator | Saturday 22 March 2025 23:17:15 +0000 (0:00:03.329) 0:00:06.254 ******** 2025-03-22 23:18:49.794701 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:18:49.794716 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:18:49.794730 | orchestrator | ok: [testbed-manager] 2025-03-22 23:18:49.794744 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:18:49.794758 | orchestrator | ok: [testbed-node-3] 2025-03-22 23:18:49.794772 | orchestrator | ok: [testbed-node-4] 2025-03-22 23:18:49.794786 | orchestrator | ok: [testbed-node-5] 2025-03-22 23:18:49.794800 | orchestrator | 2025-03-22 23:18:49.794814 | orchestrator | TASK [osism.services.netdata : Install apt-transport-https package] ************ 2025-03-22 23:18:49.794828 | orchestrator | Saturday 22 March 2025 23:17:20 +0000 (0:00:05.103) 0:00:11.358 ******** 2025-03-22 23:18:49.794842 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:18:49.794856 | orchestrator | ok: [testbed-manager] 2025-03-22 23:18:49.794870 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:18:49.794884 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:18:49.794898 | orchestrator | ok: [testbed-node-3] 2025-03-22 23:18:49.794911 | orchestrator | ok: [testbed-node-4] 2025-03-22 23:18:49.794931 | orchestrator | ok: [testbed-node-5] 2025-03-22 23:18:49.794945 | orchestrator | 2025-03-22 23:18:49.794960 | orchestrator | TASK [osism.services.netdata : Add repository gpg key] ************************* 2025-03-22 23:18:49.794984 | orchestrator | Saturday 22 March 2025 23:17:26 +0000 (0:00:06.028) 0:00:17.387 ******** 2025-03-22 23:18:49.795000 | orchestrator | changed: [testbed-manager] 2025-03-22 23:18:49.795015 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:18:49.795029 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:18:49.795043 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:18:49.795057 | 
orchestrator | changed: [testbed-node-3] 2025-03-22 23:18:49.795071 | orchestrator | changed: [testbed-node-4] 2025-03-22 23:18:49.795085 | orchestrator | changed: [testbed-node-5] 2025-03-22 23:18:49.795099 | orchestrator | 2025-03-22 23:18:49.795113 | orchestrator | TASK [osism.services.netdata : Add repository] ********************************* 2025-03-22 23:18:49.795127 | orchestrator | Saturday 22 March 2025 23:17:29 +0000 (0:00:02.659) 0:00:20.046 ******** 2025-03-22 23:18:49.795142 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:18:49.795156 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:18:49.795170 | orchestrator | changed: [testbed-node-3] 2025-03-22 23:18:49.795184 | orchestrator | changed: [testbed-node-4] 2025-03-22 23:18:49.795198 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:18:49.795212 | orchestrator | changed: [testbed-node-5] 2025-03-22 23:18:49.795225 | orchestrator | changed: [testbed-manager] 2025-03-22 23:18:49.795239 | orchestrator | 2025-03-22 23:18:49.795253 | orchestrator | TASK [osism.services.netdata : Install package netdata] ************************ 2025-03-22 23:18:49.795267 | orchestrator | Saturday 22 March 2025 23:17:39 +0000 (0:00:10.269) 0:00:30.315 ******** 2025-03-22 23:18:49.795281 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:18:49.795295 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:18:49.795309 | orchestrator | changed: [testbed-node-3] 2025-03-22 23:18:49.795323 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:18:49.795337 | orchestrator | changed: [testbed-node-5] 2025-03-22 23:18:49.795351 | orchestrator | changed: [testbed-node-4] 2025-03-22 23:18:49.795374 | orchestrator | changed: [testbed-manager] 2025-03-22 23:18:49.795388 | orchestrator | 2025-03-22 23:18:49.795402 | orchestrator | TASK [osism.services.netdata : Include config tasks] *************************** 2025-03-22 23:18:49.795416 | orchestrator | Saturday 22 March 2025 23:18:01 +0000 (0:00:21.885) 0:00:52.201 ******** 2025-03-22 23:18:49.795431 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/config.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-03-22 23:18:49.795450 | orchestrator | 2025-03-22 23:18:49.795465 | orchestrator | TASK [osism.services.netdata : Copy configuration files] *********************** 2025-03-22 23:18:49.795479 | orchestrator | Saturday 22 March 2025 23:18:06 +0000 (0:00:04.974) 0:00:57.176 ******** 2025-03-22 23:18:49.795493 | orchestrator | changed: [testbed-node-2] => (item=netdata.conf) 2025-03-22 23:18:49.795528 | orchestrator | changed: [testbed-manager] => (item=netdata.conf) 2025-03-22 23:18:49.795543 | orchestrator | changed: [testbed-node-0] => (item=netdata.conf) 2025-03-22 23:18:49.795557 | orchestrator | changed: [testbed-node-1] => (item=netdata.conf) 2025-03-22 23:18:49.795571 | orchestrator | changed: [testbed-node-4] => (item=netdata.conf) 2025-03-22 23:18:49.795585 | orchestrator | changed: [testbed-node-3] => (item=netdata.conf) 2025-03-22 23:18:49.795599 | orchestrator | changed: [testbed-node-5] => (item=netdata.conf) 2025-03-22 23:18:49.795613 | orchestrator | changed: [testbed-node-4] => (item=stream.conf) 2025-03-22 23:18:49.795627 | orchestrator | changed: [testbed-node-0] => (item=stream.conf) 2025-03-22 23:18:49.795641 | orchestrator | changed: [testbed-node-3] => (item=stream.conf) 2025-03-22 23:18:49.795655 | orchestrator | changed: 
[testbed-manager] => (item=stream.conf) 2025-03-22 23:18:49.795669 | orchestrator | changed: [testbed-node-2] => (item=stream.conf) 2025-03-22 23:18:49.795683 | orchestrator | changed: [testbed-node-1] => (item=stream.conf) 2025-03-22 23:18:49.795697 | orchestrator | changed: [testbed-node-5] => (item=stream.conf) 2025-03-22 23:18:49.795711 | orchestrator | 2025-03-22 23:18:49.795725 | orchestrator | TASK [osism.services.netdata : Retrieve /etc/netdata/.opt-out-from-anonymous-statistics status] *** 2025-03-22 23:18:49.795740 | orchestrator | Saturday 22 March 2025 23:18:22 +0000 (0:00:15.874) 0:01:13.050 ******** 2025-03-22 23:18:49.795754 | orchestrator | ok: [testbed-manager] 2025-03-22 23:18:49.795768 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:18:49.795783 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:18:49.795797 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:18:49.795811 | orchestrator | ok: [testbed-node-3] 2025-03-22 23:18:49.795825 | orchestrator | ok: [testbed-node-4] 2025-03-22 23:18:49.795838 | orchestrator | ok: [testbed-node-5] 2025-03-22 23:18:49.795852 | orchestrator | 2025-03-22 23:18:49.795866 | orchestrator | TASK [osism.services.netdata : Opt out from anonymous statistics] ************** 2025-03-22 23:18:49.795880 | orchestrator | Saturday 22 March 2025 23:18:24 +0000 (0:00:02.599) 0:01:15.650 ******** 2025-03-22 23:18:49.795895 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:18:49.795909 | orchestrator | changed: [testbed-manager] 2025-03-22 23:18:49.795923 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:18:49.795937 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:18:49.795951 | orchestrator | changed: [testbed-node-3] 2025-03-22 23:18:49.795965 | orchestrator | changed: [testbed-node-4] 2025-03-22 23:18:49.795979 | orchestrator | changed: [testbed-node-5] 2025-03-22 23:18:49.795993 | orchestrator | 2025-03-22 23:18:49.796007 | orchestrator | TASK [osism.services.netdata : Add netdata user to docker group] *************** 2025-03-22 23:18:49.796021 | orchestrator | Saturday 22 March 2025 23:18:27 +0000 (0:00:02.632) 0:01:18.283 ******** 2025-03-22 23:18:49.796035 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:18:49.796049 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:18:49.796063 | orchestrator | ok: [testbed-manager] 2025-03-22 23:18:49.796077 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:18:49.796091 | orchestrator | ok: [testbed-node-3] 2025-03-22 23:18:49.796105 | orchestrator | ok: [testbed-node-4] 2025-03-22 23:18:49.796126 | orchestrator | ok: [testbed-node-5] 2025-03-22 23:18:49.796140 | orchestrator | 2025-03-22 23:18:49.796154 | orchestrator | TASK [osism.services.netdata : Manage service netdata] ************************* 2025-03-22 23:18:49.796168 | orchestrator | Saturday 22 March 2025 23:18:30 +0000 (0:00:02.648) 0:01:20.931 ******** 2025-03-22 23:18:49.796182 | orchestrator | ok: [testbed-manager] 2025-03-22 23:18:49.796197 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:18:49.796211 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:18:49.796225 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:18:49.796238 | orchestrator | ok: [testbed-node-3] 2025-03-22 23:18:49.796258 | orchestrator | ok: [testbed-node-4] 2025-03-22 23:18:49.796294 | orchestrator | ok: [testbed-node-5] 2025-03-22 23:18:49.796309 | orchestrator | 2025-03-22 23:18:49.796323 | orchestrator | TASK [osism.services.netdata : Include host type specific tasks] *************** 2025-03-22 23:18:49.796338 | orchestrator | 
Saturday 22 March 2025 23:18:35 +0000 (0:00:05.127) 0:01:26.059 ******** 2025-03-22 23:18:49.796352 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/server.yml for testbed-manager 2025-03-22 23:18:49.796369 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/client.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-03-22 23:18:49.796383 | orchestrator | 2025-03-22 23:18:49.796398 | orchestrator | TASK [osism.services.netdata : Set sysctl vm.max_map_count parameter] ********** 2025-03-22 23:18:49.796411 | orchestrator | Saturday 22 March 2025 23:18:38 +0000 (0:00:03.565) 0:01:29.625 ******** 2025-03-22 23:18:49.796425 | orchestrator | changed: [testbed-manager] 2025-03-22 23:18:49.796440 | orchestrator | 2025-03-22 23:18:49.796454 | orchestrator | RUNNING HANDLER [osism.services.netdata : Restart service netdata] ************* 2025-03-22 23:18:49.796468 | orchestrator | Saturday 22 March 2025 23:18:42 +0000 (0:00:03.970) 0:01:33.595 ******** 2025-03-22 23:18:49.796483 | orchestrator | changed: [testbed-manager] 2025-03-22 23:18:49.796555 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:18:49.796573 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:18:49.796588 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:18:49.796602 | orchestrator | changed: [testbed-node-3] 2025-03-22 23:18:49.796616 | orchestrator | changed: [testbed-node-4] 2025-03-22 23:18:49.796630 | orchestrator | changed: [testbed-node-5] 2025-03-22 23:18:49.796643 | orchestrator | 2025-03-22 23:18:49.796657 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 23:18:49.796672 | orchestrator | testbed-manager : ok=16  changed=8  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:18:49.796687 | orchestrator | testbed-node-0 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:18:49.796707 | orchestrator | testbed-node-1 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:18:49.796721 | orchestrator | testbed-node-2 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:18:49.796736 | orchestrator | testbed-node-3 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:18:49.796750 | orchestrator | testbed-node-4 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:18:49.796763 | orchestrator | testbed-node-5 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:18:49.796778 | orchestrator | 2025-03-22 23:18:49.796792 | orchestrator | Saturday 22 March 2025 23:18:46 +0000 (0:00:03.655) 0:01:37.250 ******** 2025-03-22 23:18:49.796806 | orchestrator | =============================================================================== 2025-03-22 23:18:49.796846 | orchestrator | osism.services.netdata : Install package netdata ----------------------- 21.89s 2025-03-22 23:18:49.796861 | orchestrator | osism.services.netdata : Copy configuration files ---------------------- 15.87s 2025-03-22 23:18:49.796879 | orchestrator | osism.services.netdata : Add repository -------------------------------- 10.27s 2025-03-22 23:18:49.796894 | orchestrator | osism.services.netdata : Install apt-transport-https package ------------ 6.03s 2025-03-22 23:18:49.796908 | orchestrator | 
osism.services.netdata : Manage service netdata ------------------------- 5.13s 2025-03-22 23:18:49.796922 | orchestrator | osism.services.netdata : Remove old architecture-dependent repository --- 5.10s 2025-03-22 23:18:49.796936 | orchestrator | osism.services.netdata : Include config tasks --------------------------- 4.98s 2025-03-22 23:18:49.796950 | orchestrator | osism.services.netdata : Set sysctl vm.max_map_count parameter ---------- 3.97s 2025-03-22 23:18:49.796964 | orchestrator | osism.services.netdata : Restart service netdata ------------------------ 3.66s 2025-03-22 23:18:49.796978 | orchestrator | osism.services.netdata : Include host type specific tasks --------------- 3.57s 2025-03-22 23:18:49.796992 | orchestrator | osism.services.netdata : Include distribution specific install tasks ---- 3.33s 2025-03-22 23:18:49.797006 | orchestrator | osism.services.netdata : Add repository gpg key ------------------------- 2.66s 2025-03-22 23:18:49.797020 | orchestrator | osism.services.netdata : Add netdata user to docker group --------------- 2.65s 2025-03-22 23:18:49.797032 | orchestrator | osism.services.netdata : Opt out from anonymous statistics -------------- 2.63s 2025-03-22 23:18:49.797053 | orchestrator | osism.services.netdata : Retrieve /etc/netdata/.opt-out-from-anonymous-statistics status --- 2.60s 2025-03-22 23:18:49.797066 | orchestrator | Group hosts based on enabled services ----------------------------------- 2.45s 2025-03-22 23:18:49.797083 | orchestrator | 2025-03-22 23:18:49 | INFO  | Task 07ace64e-b7c6-427d-a306-34efcbf75476 is in state SUCCESS 2025-03-22 23:18:52.852851 | orchestrator | 2025-03-22 23:18:49 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:18:52.852987 | orchestrator | 2025-03-22 23:18:52 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:18:52.855182 | orchestrator | 2025-03-22 23:18:52 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:18:52.855702 | orchestrator | 2025-03-22 23:18:52 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:18:52.858382 | orchestrator | 2025-03-22 23:18:52 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:18:55.933496 | orchestrator | 2025-03-22 23:18:55 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:18:55.934851 | orchestrator | 2025-03-22 23:18:55 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:18:55.935755 | orchestrator | 2025-03-22 23:18:55 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:18:58.985578 | orchestrator | 2025-03-22 23:18:55 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:18:58.985725 | orchestrator | 2025-03-22 23:18:58 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:18:58.987670 | orchestrator | 2025-03-22 23:18:58 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:18:58.990475 | orchestrator | 2025-03-22 23:18:58 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:19:02.062889 | orchestrator | 2025-03-22 23:18:58 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:19:02.063012 | orchestrator | 2025-03-22 23:19:02 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:19:02.065782 | orchestrator | 2025-03-22 23:19:02 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 
23:19:02.069647 | orchestrator | 2025-03-22 23:19:02 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:19:05.160268 | orchestrator | 2025-03-22 23:19:02 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:19:05.160403 | orchestrator | 2025-03-22 23:19:05 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:19:05.162242 | orchestrator | 2025-03-22 23:19:05 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:19:05.162278 | orchestrator | 2025-03-22 23:19:05 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:19:08.203650 | orchestrator | 2025-03-22 23:19:05 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:19:08.203785 | orchestrator | 2025-03-22 23:19:08 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:19:08.205770 | orchestrator | 2025-03-22 23:19:08 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:19:08.206463 | orchestrator | 2025-03-22 23:19:08 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:19:11.259360 | orchestrator | 2025-03-22 23:19:08 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:19:11.259487 | orchestrator | 2025-03-22 23:19:11 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:19:11.261435 | orchestrator | 2025-03-22 23:19:11 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:19:11.262131 | orchestrator | 2025-03-22 23:19:11 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:19:14.323843 | orchestrator | 2025-03-22 23:19:11 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:19:14.323958 | orchestrator | 2025-03-22 23:19:14 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:19:14.326551 | orchestrator | 2025-03-22 23:19:14 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:19:14.328856 | orchestrator | 2025-03-22 23:19:14 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:19:17.374298 | orchestrator | 2025-03-22 23:19:14 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:19:17.374438 | orchestrator | 2025-03-22 23:19:17 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:19:17.375565 | orchestrator | 2025-03-22 23:19:17 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:19:17.377247 | orchestrator | 2025-03-22 23:19:17 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:19:20.438247 | orchestrator | 2025-03-22 23:19:17 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:19:20.438405 | orchestrator | 2025-03-22 23:19:20 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:19:20.440563 | orchestrator | 2025-03-22 23:19:20 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:19:20.442115 | orchestrator | 2025-03-22 23:19:20 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:19:20.442674 | orchestrator | 2025-03-22 23:19:20 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:19:23.502881 | orchestrator | 2025-03-22 23:19:23 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:19:23.509619 | orchestrator | 2025-03-22 
23:19:23 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:19:23.518604 | orchestrator | 2025-03-22 23:19:23 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:19:23.520006 | orchestrator | 2025-03-22 23:19:23 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:19:26.581331 | orchestrator | 2025-03-22 23:19:26 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:19:29.645820 | orchestrator | 2025-03-22 23:19:26 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:19:29.645947 | orchestrator | 2025-03-22 23:19:26 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:19:29.645968 | orchestrator | 2025-03-22 23:19:26 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:19:29.646001 | orchestrator | 2025-03-22 23:19:29 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:19:29.648059 | orchestrator | 2025-03-22 23:19:29 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:19:29.649703 | orchestrator | 2025-03-22 23:19:29 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:19:32.686125 | orchestrator | 2025-03-22 23:19:29 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:19:32.686262 | orchestrator | 2025-03-22 23:19:32 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:19:32.686746 | orchestrator | 2025-03-22 23:19:32 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:19:32.687768 | orchestrator | 2025-03-22 23:19:32 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:19:32.687922 | orchestrator | 2025-03-22 23:19:32 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:19:35.729319 | orchestrator | 2025-03-22 23:19:35 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:19:35.731623 | orchestrator | 2025-03-22 23:19:35 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:19:35.732089 | orchestrator | 2025-03-22 23:19:35 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:19:38.779006 | orchestrator | 2025-03-22 23:19:35 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:19:38.779135 | orchestrator | 2025-03-22 23:19:38 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:19:38.780498 | orchestrator | 2025-03-22 23:19:38 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:19:38.781942 | orchestrator | 2025-03-22 23:19:38 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:19:41.831870 | orchestrator | 2025-03-22 23:19:38 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:19:41.832021 | orchestrator | 2025-03-22 23:19:41 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:19:41.835400 | orchestrator | 2025-03-22 23:19:41 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:19:44.883050 | orchestrator | 2025-03-22 23:19:41 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:19:44.883155 | orchestrator | 2025-03-22 23:19:41 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:19:44.883188 | orchestrator | 2025-03-22 23:19:44 | INFO  | Task 
dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:19:47.928162 | orchestrator | 2025-03-22 23:19:44 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:19:47.928282 | orchestrator | 2025-03-22 23:19:44 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:19:47.928302 | orchestrator | 2025-03-22 23:19:44 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:19:47.928358 | orchestrator | 2025-03-22 23:19:47 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:19:47.929404 | orchestrator | 2025-03-22 23:19:47 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:19:51.010449 | orchestrator | 2025-03-22 23:19:47 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:19:51.010613 | orchestrator | 2025-03-22 23:19:47 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:19:51.010651 | orchestrator | 2025-03-22 23:19:51 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:19:51.013497 | orchestrator | 2025-03-22 23:19:51 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:19:51.017770 | orchestrator | 2025-03-22 23:19:51 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:19:54.081828 | orchestrator | 2025-03-22 23:19:51 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:19:54.081971 | orchestrator | 2025-03-22 23:19:54 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:19:54.083552 | orchestrator | 2025-03-22 23:19:54 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:19:54.083591 | orchestrator | 2025-03-22 23:19:54 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:19:57.152056 | orchestrator | 2025-03-22 23:19:54 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:19:57.152187 | orchestrator | 2025-03-22 23:19:57 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:19:57.152929 | orchestrator | 2025-03-22 23:19:57 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state STARTED 2025-03-22 23:19:57.153533 | orchestrator | 2025-03-22 23:19:57 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:19:57.153900 | orchestrator | 2025-03-22 23:19:57 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:20:00.217210 | orchestrator | 2025-03-22 23:20:00 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:20:00.222106 | orchestrator | 2025-03-22 23:20:00 | INFO  | Task d61fa032-5091-449f-ae29-c37dc43d538b is in state SUCCESS 2025-03-22 23:20:00.224603 | orchestrator | 2025-03-22 23:20:00.224657 | orchestrator | 2025-03-22 23:20:00.224673 | orchestrator | PLAY [Apply role common] ******************************************************* 2025-03-22 23:20:00.224688 | orchestrator | 2025-03-22 23:20:00.224704 | orchestrator | TASK [common : include_tasks] ************************************************** 2025-03-22 23:20:00.224718 | orchestrator | Saturday 22 March 2025 23:17:03 +0000 (0:00:00.396) 0:00:00.396 ******** 2025-03-22 23:20:00.224733 | orchestrator | included: /ansible/roles/common/tasks/deploy.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-03-22 23:20:00.224749 | orchestrator | 
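The "FAILED - RETRYING: ... (10 retries left)" messages for the openstackclient and phpmyadmin service tasks, and the "Wait for an healthy service" handler, come from Ansible's until/retries/delay mechanism: the task is re-run until its condition holds or the retries are exhausted, while the osism manager separately polls its own task IDs ("is in state STARTED ... SUCCESS") every few seconds. A minimal sketch of the in-playbook retry pattern, assuming a hypothetical container name and health check; not the actual role implementation:

```yaml
# Sketch only: shows the until/retries/delay mechanism that produces the
# "FAILED - RETRYING: ... (N retries left)" messages seen above. The container
# name and health check are assumptions, not the actual role implementation.
- name: Illustrative retry-until-healthy pattern
  hosts: testbed-manager
  tasks:
    - name: Wait for a healthy service
      ansible.builtin.command:
        cmd: docker ps --filter name=exampleservice --filter health=healthy -q
      register: health
      changed_when: false
      retries: 10      # with retries: 10, the log shows "(10 retries left)" first
      delay: 5         # seconds between attempts
      until: health.stdout | length > 0
```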
2025-03-22 23:20:00.224763 | orchestrator | TASK [common : Ensuring config directories exist] ****************************** 2025-03-22 23:20:00.224777 | orchestrator | Saturday 22 March 2025 23:17:05 +0000 (0:00:02.081) 0:00:02.478 ******** 2025-03-22 23:20:00.224791 | orchestrator | changed: [testbed-node-0] => (item=[{'service_name': 'cron'}, 'cron']) 2025-03-22 23:20:00.224804 | orchestrator | changed: [testbed-manager] => (item=[{'service_name': 'cron'}, 'cron']) 2025-03-22 23:20:00.224819 | orchestrator | changed: [testbed-node-1] => (item=[{'service_name': 'cron'}, 'cron']) 2025-03-22 23:20:00.224833 | orchestrator | changed: [testbed-node-0] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-03-22 23:20:00.224855 | orchestrator | changed: [testbed-node-2] => (item=[{'service_name': 'cron'}, 'cron']) 2025-03-22 23:20:00.224878 | orchestrator | changed: [testbed-node-1] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-03-22 23:20:00.224929 | orchestrator | changed: [testbed-node-0] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-03-22 23:20:00.224955 | orchestrator | changed: [testbed-manager] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-03-22 23:20:00.224975 | orchestrator | changed: [testbed-node-3] => (item=[{'service_name': 'cron'}, 'cron']) 2025-03-22 23:20:00.224990 | orchestrator | changed: [testbed-node-2] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-03-22 23:20:00.225004 | orchestrator | changed: [testbed-node-4] => (item=[{'service_name': 'cron'}, 'cron']) 2025-03-22 23:20:00.225018 | orchestrator | changed: [testbed-node-1] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-03-22 23:20:00.225032 | orchestrator | changed: [testbed-manager] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-03-22 23:20:00.225047 | orchestrator | changed: [testbed-node-5] => (item=[{'service_name': 'cron'}, 'cron']) 2025-03-22 23:20:00.225063 | orchestrator | changed: [testbed-node-3] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-03-22 23:20:00.225079 | orchestrator | changed: [testbed-node-2] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-03-22 23:20:00.225096 | orchestrator | changed: [testbed-node-4] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-03-22 23:20:00.225119 | orchestrator | changed: [testbed-node-5] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-03-22 23:20:00.225142 | orchestrator | changed: [testbed-node-3] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-03-22 23:20:00.225165 | orchestrator | changed: [testbed-node-4] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-03-22 23:20:00.225189 | orchestrator | changed: [testbed-node-5] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-03-22 23:20:00.225212 | orchestrator | 2025-03-22 23:20:00.225245 | orchestrator | TASK [common : include_tasks] ************************************************** 2025-03-22 23:20:00.225270 | orchestrator | Saturday 22 March 2025 23:17:11 +0000 (0:00:05.608) 0:00:08.086 ******** 2025-03-22 23:20:00.225294 | orchestrator | included: /ansible/roles/common/tasks/copy-certs.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-03-22 23:20:00.225327 | orchestrator | 2025-03-22 23:20:00.225352 | orchestrator | TASK [service-cert-copy : common | Copying over extra CA certificates] ********* 2025-03-22 23:20:00.225368 | orchestrator | Saturday 22 
March 2025 23:17:14 +0000 (0:00:02.596) 0:00:10.683 ******** 2025-03-22 23:20:00.225390 | orchestrator | changed: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.225411 | orchestrator | changed: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.225440 | orchestrator | changed: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.225468 | orchestrator | changed: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.225483 | orchestrator | changed: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.225498 | orchestrator | changed: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.225535 | orchestrator | changed: [testbed-node-5] => (item={'key': 
'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.225551 | orchestrator | changed: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.225566 | orchestrator | changed: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.225588 | orchestrator | changed: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.225612 | orchestrator | changed: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.225694 | orchestrator | changed: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 
'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.225711 | orchestrator | changed: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.225726 | orchestrator | changed: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.225745 | orchestrator | changed: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.225762 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.225908 | orchestrator | changed: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.225935 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.225950 | orchestrator | changed: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 
'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.225964 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.225979 | orchestrator | changed: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.225993 | orchestrator | 2025-03-22 23:20:00.226007 | orchestrator | TASK [service-cert-copy : common | Copying over backend internal TLS certificate] *** 2025-03-22 23:20:00.226083 | orchestrator | Saturday 22 March 2025 23:17:21 +0000 (0:00:07.345) 0:00:18.028 ******** 2025-03-22 23:20:00.226101 | orchestrator | skipping: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-22 23:20:00.226116 | orchestrator | skipping: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.226138 | orchestrator | skipping: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.226153 | orchestrator | skipping: [testbed-manager] 2025-03-22 23:20:00.226185 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-22 23:20:00.226200 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.226215 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.226229 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:20:00.226244 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-22 23:20:00.226259 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.226273 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.226287 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:20:00.226302 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-22 23:20:00.226324 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.226355 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.226371 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:20:00.226386 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-22 23:20:00.226400 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.226415 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.226429 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:20:00.226443 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-22 23:20:00.226458 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.226479 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.226494 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:20:00.226535 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-22 23:20:00.226552 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.226567 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.226581 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:20:00.226595 | orchestrator | 2025-03-22 23:20:00.226609 | orchestrator | TASK [service-cert-copy : common | Copying over backend internal TLS key] ****** 2025-03-22 
23:20:00.226623 | orchestrator | Saturday 22 March 2025 23:17:25 +0000 (0:00:03.926) 0:00:21.955 ******** 2025-03-22 23:20:00.226638 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-22 23:20:00.226652 | orchestrator | skipping: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-22 23:20:00.226667 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.226688 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.226709 | orchestrator | skipping: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.226729 | orchestrator | skipping: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.226744 | orchestrator 
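Both service-cert-copy tasks above ("Copying over backend internal TLS certificate" and "Copying over backend internal TLS key") skip on every host; in kolla-ansible that is the expected outcome when backend TLS is not enabled for the deployment, although the skip condition itself is not printed in this log. Every task in this play iterates the same three common-service definitions, which the log repeats inline in each item= field. For readability they are reproduced once below as a Python literal; all values are copied verbatim from the log entries above.

common_services = {
    "fluentd": {
        "container_name": "fluentd",
        "group": "fluentd",
        "enabled": True,
        "image": "registry.osism.tech/kolla/release/fluentd:5.0.5.20241206",
        "environment": {"KOLLA_CONFIG_STRATEGY": "COPY_ALWAYS"},
        "volumes": [
            "/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro",
            "/etc/localtime:/etc/localtime:ro",
            "/etc/timezone:/etc/timezone:ro",
            "kolla_logs:/var/log/kolla/",
            "fluentd_data:/var/lib/fluentd/data/",
            "/var/log/journal:/var/log/journal:ro",
        ],
        "dimensions": {},
    },
    "kolla-toolbox": {
        "container_name": "kolla_toolbox",
        "group": "kolla-toolbox",
        "enabled": True,
        "image": "registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206",
        "environment": {
            "ANSIBLE_NOCOLOR": "1",
            "ANSIBLE_LIBRARY": "/usr/share/ansible",
            "REQUESTS_CA_BUNDLE": "/etc/ssl/certs/ca-certificates.crt",
        },
        "privileged": True,
        "volumes": [
            "/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro",
            "/etc/localtime:/etc/localtime:ro",
            "/etc/timezone:/etc/timezone:ro",
            "/dev/:/dev/",
            "/run/:/run/:shared",
            "kolla_logs:/var/log/kolla/",
        ],
        "dimensions": {},
    },
    "cron": {
        "container_name": "cron",
        "group": "cron",
        "enabled": True,
        "image": "registry.osism.tech/kolla/release/cron:3.0.20241206",
        "environment": {"KOLLA_LOGROTATE_SCHEDULE": "daily"},
        "volumes": [
            "/etc/kolla/cron/:/var/lib/kolla/config_files/:ro",
            "/etc/localtime:/etc/localtime:ro",
            "/etc/timezone:/etc/timezone:ro",
            "kolla_logs:/var/log/kolla/",
        ],
        "dimensions": {},
    },
}

The log now continues with the "Copying over backend internal TLS key" loop, which skips the same items on every host.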
| skipping: [testbed-node-0] 2025-03-22 23:20:00.226758 | orchestrator | skipping: [testbed-manager] 2025-03-22 23:20:00.226772 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-22 23:20:00.226789 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.226803 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.226818 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-22 23:20:00.226838 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.226853 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.226867 | orchestrator | 
skipping: [testbed-node-1] 2025-03-22 23:20:00.226890 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-22 23:20:00.226910 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.226925 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.226940 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:20:00.226954 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:20:00.226968 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-22 23:20:00.226983 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.227009 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 
'dimensions': {}}})  2025-03-22 23:20:00.227023 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:20:00.227038 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-03-22 23:20:00.227063 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.227088 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.227112 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:20:00.227134 | orchestrator | 2025-03-22 23:20:00.227157 | orchestrator | TASK [common : Copying over /run subdirectories conf] ************************** 2025-03-22 23:20:00.227181 | orchestrator | Saturday 22 March 2025 23:17:28 +0000 (0:00:03.504) 0:00:25.459 ******** 2025-03-22 23:20:00.227204 | orchestrator | skipping: [testbed-manager] 2025-03-22 23:20:00.227226 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:20:00.227240 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:20:00.227254 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:20:00.227268 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:20:00.227282 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:20:00.227296 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:20:00.227309 | orchestrator | 2025-03-22 23:20:00.227323 | orchestrator | TASK [common : Restart systemd-tmpfiles] *************************************** 2025-03-22 23:20:00.227337 | orchestrator | Saturday 22 March 2025 23:17:30 +0000 (0:00:01.859) 0:00:27.319 ******** 2025-03-22 23:20:00.227351 | orchestrator | skipping: [testbed-manager] 2025-03-22 23:20:00.227365 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:20:00.227379 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:20:00.227393 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:20:00.227406 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:20:00.227420 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:20:00.227434 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:20:00.227458 | orchestrator | 2025-03-22 23:20:00.227472 | orchestrator | TASK [common : Ensure fluentd 
image is present for label check] **************** 2025-03-22 23:20:00.227486 | orchestrator | Saturday 22 March 2025 23:17:32 +0000 (0:00:01.764) 0:00:29.083 ******** 2025-03-22 23:20:00.227500 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:20:00.227579 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:20:00.227594 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:20:00.227608 | orchestrator | changed: [testbed-node-3] 2025-03-22 23:20:00.227622 | orchestrator | changed: [testbed-node-4] 2025-03-22 23:20:00.227636 | orchestrator | changed: [testbed-node-5] 2025-03-22 23:20:00.227650 | orchestrator | changed: [testbed-manager] 2025-03-22 23:20:00.227664 | orchestrator | 2025-03-22 23:20:00.227678 | orchestrator | TASK [common : Fetch fluentd Docker image labels] ****************************** 2025-03-22 23:20:00.227692 | orchestrator | Saturday 22 March 2025 23:18:04 +0000 (0:00:31.854) 0:01:00.937 ******** 2025-03-22 23:20:00.227706 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:20:00.227720 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:20:00.227734 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:20:00.227748 | orchestrator | ok: [testbed-manager] 2025-03-22 23:20:00.227762 | orchestrator | ok: [testbed-node-3] 2025-03-22 23:20:00.227776 | orchestrator | ok: [testbed-node-4] 2025-03-22 23:20:00.227790 | orchestrator | ok: [testbed-node-5] 2025-03-22 23:20:00.227803 | orchestrator | 2025-03-22 23:20:00.227818 | orchestrator | TASK [common : Set fluentd facts] ********************************************** 2025-03-22 23:20:00.227832 | orchestrator | Saturday 22 March 2025 23:18:09 +0000 (0:00:05.110) 0:01:06.048 ******** 2025-03-22 23:20:00.227846 | orchestrator | ok: [testbed-manager] 2025-03-22 23:20:00.227859 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:20:00.227881 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:20:00.227895 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:20:00.227908 | orchestrator | ok: [testbed-node-3] 2025-03-22 23:20:00.227922 | orchestrator | ok: [testbed-node-4] 2025-03-22 23:20:00.227936 | orchestrator | ok: [testbed-node-5] 2025-03-22 23:20:00.227950 | orchestrator | 2025-03-22 23:20:00.227964 | orchestrator | TASK [common : Fetch fluentd Podman image labels] ****************************** 2025-03-22 23:20:00.227979 | orchestrator | Saturday 22 March 2025 23:18:11 +0000 (0:00:02.370) 0:01:08.419 ******** 2025-03-22 23:20:00.227993 | orchestrator | skipping: [testbed-manager] 2025-03-22 23:20:00.228007 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:20:00.228021 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:20:00.228035 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:20:00.228049 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:20:00.228062 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:20:00.228076 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:20:00.228089 | orchestrator | 2025-03-22 23:20:00.228102 | orchestrator | TASK [common : Set fluentd facts] ********************************************** 2025-03-22 23:20:00.228114 | orchestrator | Saturday 22 March 2025 23:18:14 +0000 (0:00:02.410) 0:01:10.829 ******** 2025-03-22 23:20:00.228127 | orchestrator | skipping: [testbed-manager] 2025-03-22 23:20:00.228139 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:20:00.228152 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:20:00.228164 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:20:00.228177 | orchestrator | skipping: 
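The "Ensure fluentd image is present for label check" and "Fetch fluentd Docker image labels" tasks pull the fluentd image on each host and read its labels so that the following "Set fluentd facts" task can derive facts from them; the Podman variants are skipped because this deployment uses Docker. A minimal stand-alone sketch of the same check with the Docker SDK for Python follows; the image reference is taken from the log, but which labels the role actually consumes is not visible here, so the sketch simply prints them all.

import docker
from docker.errors import ImageNotFound

client = docker.from_env()
image_ref = "registry.osism.tech/kolla/release/fluentd:5.0.5.20241206"  # from the log above

try:
    image = client.images.get(image_ref)   # already present on the host ("ok")
except ImageNotFound:
    image = client.images.pull(image_ref)  # had to be pulled ("changed")

# The role sets facts from the image labels; print them all for illustration.
print(image.labels)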
[testbed-node-3] 2025-03-22 23:20:00.228189 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:20:00.228201 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:20:00.228214 | orchestrator | 2025-03-22 23:20:00.228226 | orchestrator | TASK [common : Copying over config.json files for services] ******************** 2025-03-22 23:20:00.228239 | orchestrator | Saturday 22 March 2025 23:18:15 +0000 (0:00:01.694) 0:01:12.523 ******** 2025-03-22 23:20:00.228260 | orchestrator | changed: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.228312 | orchestrator | changed: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.228331 | orchestrator | changed: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.228345 | orchestrator | changed: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.228358 | orchestrator | changed: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.228371 | orchestrator | changed: [testbed-node-3] => 
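The config.json files being copied in this task drive kolla's container start-up: inside each container, the kolla start script reads /var/lib/kolla/config_files/config.json, copies the listed files into place and then launches the service command; with KOLLA_CONFIG_STRATEGY=COPY_ALWAYS (as set for fluentd above) the copy is repeated on every container start. The generated files themselves are not shown in this log, so the sketch below only illustrates the general shape of such a file; the command, paths and permissions are assumptions, not values from this deployment.

import json

# Illustrative config.json for the cron service (assumed content, not taken
# from this deployment's generated files).
cron_config = {
    "command": "/usr/sbin/cron -f",
    "config_files": [
        {
            "source": "/var/lib/kolla/config_files/logrotate.conf",
            "dest": "/etc/logrotate.conf",
            "owner": "root",
            "perm": "0644",
        }
    ],
}

print(json.dumps(cron_config, indent=4))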
(item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.228384 | orchestrator | changed: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.228402 | orchestrator | changed: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.228423 | orchestrator | changed: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.228440 | orchestrator | changed: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.228454 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.228467 | orchestrator | changed: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.228480 | orchestrator | changed: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.228493 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.228521 | orchestrator | changed: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.228551 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.228565 | orchestrator | changed: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.228578 | orchestrator | changed: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.228591 | orchestrator | changed: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.228604 | orchestrator | changed: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.228617 | orchestrator | changed: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.228629 | orchestrator | 2025-03-22 23:20:00.228642 | orchestrator | TASK [common : Find custom fluentd input config files] ************************* 2025-03-22 23:20:00.228655 | orchestrator | Saturday 22 March 2025 23:18:25 +0000 (0:00:09.799) 0:01:22.323 ******** 2025-03-22 23:20:00.228667 | orchestrator | [WARNING]: Skipped 2025-03-22 23:20:00.228680 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/input' path due 2025-03-22 23:20:00.228693 | orchestrator | to this access issue: 2025-03-22 23:20:00.228705 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/input' is not a 2025-03-22 23:20:00.228718 | orchestrator | directory 2025-03-22 23:20:00.228730 | orchestrator | ok: [testbed-manager -> localhost] 2025-03-22 23:20:00.228742 | orchestrator | 2025-03-22 23:20:00.228755 | orchestrator | TASK [common : Find custom fluentd filter config files] ************************ 2025-03-22 23:20:00.228773 | orchestrator | Saturday 22 March 2025 23:18:27 +0000 (0:00:01.372) 0:01:23.696 ******** 2025-03-22 23:20:00.228785 | orchestrator | [WARNING]: Skipped 2025-03-22 23:20:00.228798 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/filter' path due 2025-03-22 23:20:00.228810 | orchestrator | to this access issue: 2025-03-22 23:20:00.228823 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/filter' is not a 2025-03-22 23:20:00.228835 | orchestrator | directory 2025-03-22 23:20:00.228847 | orchestrator | ok: [testbed-manager -> localhost] 2025-03-22 23:20:00.228860 | orchestrator | 2025-03-22 23:20:00.228878 | orchestrator | TASK [common : Find custom fluentd format config files] ************************ 2025-03-22 23:20:00.228890 | orchestrator | Saturday 22 March 2025 23:18:28 
+0000 (0:00:01.526) 0:01:25.222 ******** 2025-03-22 23:20:00.228903 | orchestrator | [WARNING]: Skipped 2025-03-22 23:20:00.228915 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/format' path due 2025-03-22 23:20:00.228927 | orchestrator | to this access issue: 2025-03-22 23:20:00.228940 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/format' is not a 2025-03-22 23:20:00.228952 | orchestrator | directory 2025-03-22 23:20:00.228965 | orchestrator | ok: [testbed-manager -> localhost] 2025-03-22 23:20:00.228977 | orchestrator | 2025-03-22 23:20:00.228989 | orchestrator | TASK [common : Find custom fluentd output config files] ************************ 2025-03-22 23:20:00.229007 | orchestrator | Saturday 22 March 2025 23:18:29 +0000 (0:00:00.967) 0:01:26.190 ******** 2025-03-22 23:20:00.229019 | orchestrator | [WARNING]: Skipped 2025-03-22 23:20:00.229032 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/output' path due 2025-03-22 23:20:00.229044 | orchestrator | to this access issue: 2025-03-22 23:20:00.229057 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/fluentd/output' is not a 2025-03-22 23:20:00.229069 | orchestrator | directory 2025-03-22 23:20:00.229081 | orchestrator | ok: [testbed-manager -> localhost] 2025-03-22 23:20:00.229134 | orchestrator | 2025-03-22 23:20:00.229148 | orchestrator | TASK [common : Copying over td-agent.conf] ************************************* 2025-03-22 23:20:00.229160 | orchestrator | Saturday 22 March 2025 23:18:30 +0000 (0:00:00.758) 0:01:26.949 ******** 2025-03-22 23:20:00.229173 | orchestrator | changed: [testbed-manager] 2025-03-22 23:20:00.229185 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:20:00.229197 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:20:00.229210 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:20:00.229223 | orchestrator | changed: [testbed-node-3] 2025-03-22 23:20:00.229235 | orchestrator | changed: [testbed-node-4] 2025-03-22 23:20:00.229248 | orchestrator | changed: [testbed-node-5] 2025-03-22 23:20:00.229260 | orchestrator | 2025-03-22 23:20:00.229273 | orchestrator | TASK [common : Copying over cron logrotate config file] ************************ 2025-03-22 23:20:00.229285 | orchestrator | Saturday 22 March 2025 23:18:39 +0000 (0:00:09.124) 0:01:36.074 ******** 2025-03-22 23:20:00.229298 | orchestrator | changed: [testbed-manager] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2) 2025-03-22 23:20:00.229311 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2) 2025-03-22 23:20:00.229324 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2) 2025-03-22 23:20:00.229336 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2) 2025-03-22 23:20:00.229349 | orchestrator | changed: [testbed-node-3] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2) 2025-03-22 23:20:00.229361 | orchestrator | changed: [testbed-node-4] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2) 2025-03-22 23:20:00.229374 | orchestrator | changed: [testbed-node-5] => (item=/ansible/roles/common/templates/cron-logrotate-global.conf.j2) 2025-03-22 23:20:00.229387 | orchestrator | 2025-03-22 23:20:00.229408 | orchestrator | TASK [common : Ensure RabbitMQ Erlang cookie exists] 
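The four "Find custom fluentd ... config files" warnings above are harmless: the common role looks for optional operator-supplied fluentd snippets under /opt/configuration/environments/kolla/files/overlays/fluentd/input, /filter, /format and /output on the manager, and in this testbed configuration those directories simply do not exist, so the find tasks return nothing. A small sketch for checking which of the overlay directories are present follows; the paths are taken from the warnings above.

from pathlib import Path

# Optional fluentd overlay directories searched by the common role.
base = Path("/opt/configuration/environments/kolla/files/overlays/fluentd")
for section in ("input", "filter", "format", "output"):
    directory = base / section
    status = "present" if directory.is_dir() else "missing (find returns nothing)"
    print(f"{directory}: {status}")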
*************************** 2025-03-22 23:20:00.229439 | orchestrator | Saturday 22 March 2025 23:18:45 +0000 (0:00:06.015) 0:01:42.090 ******** 2025-03-22 23:20:00.229460 | orchestrator | changed: [testbed-manager] 2025-03-22 23:20:00.229480 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:20:00.229522 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:20:00.229546 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:20:00.229564 | orchestrator | changed: [testbed-node-3] 2025-03-22 23:20:00.229576 | orchestrator | changed: [testbed-node-4] 2025-03-22 23:20:00.229588 | orchestrator | changed: [testbed-node-5] 2025-03-22 23:20:00.229600 | orchestrator | 2025-03-22 23:20:00.229612 | orchestrator | TASK [common : Ensuring config directories have correct owner and permission] *** 2025-03-22 23:20:00.229625 | orchestrator | Saturday 22 March 2025 23:18:49 +0000 (0:00:03.507) 0:01:45.597 ******** 2025-03-22 23:20:00.229644 | orchestrator | ok: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.229658 | orchestrator | skipping: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.229671 | orchestrator | ok: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.229708 | orchestrator | ok: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.229722 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': 
'/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.229735 | orchestrator | ok: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.229767 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.229781 | orchestrator | ok: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.229794 | orchestrator | ok: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.229807 | orchestrator | ok: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.229834 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.229848 | orchestrator | ok: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.229861 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.229880 | orchestrator | ok: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.229894 | orchestrator | ok: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.229907 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.229920 | orchestrator | ok: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.229933 | orchestrator | ok: 
[testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.229960 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:20:00.229977 | orchestrator | ok: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.229990 | orchestrator | ok: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.230009 | orchestrator | 2025-03-22 23:20:00.230054 | orchestrator | TASK [common : Copy rabbitmq-env.conf to kolla toolbox] ************************ 2025-03-22 23:20:00.230068 | orchestrator | Saturday 22 March 2025 23:18:52 +0000 (0:00:03.527) 0:01:49.125 ******** 2025-03-22 23:20:00.230080 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2025-03-22 23:20:00.230093 | orchestrator | changed: [testbed-manager] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2025-03-22 23:20:00.230106 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2025-03-22 23:20:00.230119 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2025-03-22 23:20:00.230131 | orchestrator | changed: [testbed-node-3] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2025-03-22 23:20:00.230143 | orchestrator | changed: [testbed-node-4] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2025-03-22 23:20:00.230156 | orchestrator | changed: [testbed-node-5] => (item=/ansible/roles/common/templates/rabbitmq-env.conf.j2) 2025-03-22 23:20:00.230168 | orchestrator | 2025-03-22 23:20:00.230181 | orchestrator | TASK [common : Copy rabbitmq erl_inetrc to kolla toolbox] ********************** 2025-03-22 23:20:00.230193 | orchestrator | Saturday 22 March 
2025 23:18:56 +0000 (0:00:03.595) 0:01:52.720 ******** 2025-03-22 23:20:00.230250 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2025-03-22 23:20:00.230264 | orchestrator | changed: [testbed-manager] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2025-03-22 23:20:00.230277 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2025-03-22 23:20:00.230289 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2025-03-22 23:20:00.230302 | orchestrator | changed: [testbed-node-4] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2025-03-22 23:20:00.230314 | orchestrator | changed: [testbed-node-5] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2025-03-22 23:20:00.230326 | orchestrator | changed: [testbed-node-3] => (item=/ansible/roles/common/templates/erl_inetrc.j2) 2025-03-22 23:20:00.230339 | orchestrator | 2025-03-22 23:20:00.230351 | orchestrator | TASK [common : Check common containers] **************************************** 2025-03-22 23:20:00.230364 | orchestrator | Saturday 22 March 2025 23:18:59 +0000 (0:00:03.147) 0:01:55.867 ******** 2025-03-22 23:20:00.230377 | orchestrator | changed: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.230391 | orchestrator | changed: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.230411 | orchestrator | changed: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.230431 | orchestrator | changed: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.230444 | orchestrator | changed: 
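The RabbitMQ-related tasks above ("Ensure RabbitMQ Erlang cookie exists", the rabbitmq-env.conf copy and the erl_inetrc copy) stage files that kolla-toolbox needs for administering RabbitMQ. The Erlang cookie in particular has to carry the same value on every node that joins the cluster and must not be group- or world-readable, otherwise the Erlang VM refuses to use it. The sketch below only illustrates those two properties; the real cookie value, its path and the way kolla-ansible generates it are not shown in this log.

import os
import secrets

# Illustrative only: kolla-ansible manages the cookie itself. This shows the
# two requirements on the file: a shared value and owner-only permissions.
cookie_path = "/tmp/example.erlang.cookie"   # hypothetical path for the sketch
cookie_value = secrets.token_hex(16)         # in a real cluster this is a shared secret

with open(cookie_path, "w") as handle:
    handle.write(cookie_value)
os.chmod(cookie_path, 0o400)

print(f"wrote {cookie_path} with mode 0400")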
[testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.230457 | orchestrator | changed: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.230470 | orchestrator | changed: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.230483 | orchestrator | changed: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.230496 | orchestrator | changed: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.230537 | orchestrator | changed: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.230559 | orchestrator | changed: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.230581 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.230602 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.230623 | orchestrator | changed: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/fluentd:5.0.5.20241206', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-03-22 23:20:00.230644 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.230667 | orchestrator | changed: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.230682 | orchestrator | changed: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.230734 | orchestrator | changed: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.230748 | orchestrator | changed: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/kolla-toolbox:18.3.0.20241206', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.230762 | orchestrator | changed: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.230775 | orchestrator | changed: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cron:3.0.20241206', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:20:00.230787 | orchestrator | 2025-03-22 23:20:00.230800 | orchestrator | TASK [common : Creating log volume] ******************************************** 2025-03-22 23:20:00.230812 | orchestrator | Saturday 22 March 2025 23:19:04 +0000 (0:00:05.132) 0:02:01.000 ******** 2025-03-22 23:20:00.230824 | orchestrator | changed: [testbed-manager] 2025-03-22 23:20:00.230837 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:20:00.230849 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:20:00.230862 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:20:00.230874 | orchestrator | changed: [testbed-node-3] 2025-03-22 23:20:00.230886 | orchestrator | changed: [testbed-node-4] 2025-03-22 23:20:00.230898 | orchestrator | changed: [testbed-node-5] 2025-03-22 23:20:00.230911 | orchestrator | 2025-03-22 23:20:00.230932 | orchestrator | TASK [common : Link kolla_logs volume to /var/log/kolla] *********************** 2025-03-22 23:20:00.230945 | orchestrator | Saturday 22 March 2025 23:19:07 +0000 
(0:00:02.726) 0:02:03.726 ******** 2025-03-22 23:20:00.230957 | orchestrator | changed: [testbed-manager] 2025-03-22 23:20:00.230970 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:20:00.230982 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:20:00.230999 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:20:00.231012 | orchestrator | changed: [testbed-node-3] 2025-03-22 23:20:00.231024 | orchestrator | changed: [testbed-node-4] 2025-03-22 23:20:00.231036 | orchestrator | changed: [testbed-node-5] 2025-03-22 23:20:00.231049 | orchestrator | 2025-03-22 23:20:00.231061 | orchestrator | TASK [common : Flush handlers] ************************************************* 2025-03-22 23:20:00.231073 | orchestrator | Saturday 22 March 2025 23:19:09 +0000 (0:00:02.059) 0:02:05.785 ******** 2025-03-22 23:20:00.231086 | orchestrator | 2025-03-22 23:20:00.231098 | orchestrator | TASK [common : Flush handlers] ************************************************* 2025-03-22 23:20:00.231116 | orchestrator | Saturday 22 March 2025 23:19:09 +0000 (0:00:00.095) 0:02:05.881 ******** 2025-03-22 23:20:00.231128 | orchestrator | 2025-03-22 23:20:00.231141 | orchestrator | TASK [common : Flush handlers] ************************************************* 2025-03-22 23:20:00.231153 | orchestrator | Saturday 22 March 2025 23:19:09 +0000 (0:00:00.066) 0:02:05.948 ******** 2025-03-22 23:20:00.231166 | orchestrator | 2025-03-22 23:20:00.231178 | orchestrator | TASK [common : Flush handlers] ************************************************* 2025-03-22 23:20:00.231190 | orchestrator | Saturday 22 March 2025 23:19:09 +0000 (0:00:00.076) 0:02:06.024 ******** 2025-03-22 23:20:00.231202 | orchestrator | 2025-03-22 23:20:00.231215 | orchestrator | TASK [common : Flush handlers] ************************************************* 2025-03-22 23:20:00.231227 | orchestrator | Saturday 22 March 2025 23:19:09 +0000 (0:00:00.271) 0:02:06.296 ******** 2025-03-22 23:20:00.231239 | orchestrator | 2025-03-22 23:20:00.231251 | orchestrator | TASK [common : Flush handlers] ************************************************* 2025-03-22 23:20:00.231264 | orchestrator | Saturday 22 March 2025 23:19:09 +0000 (0:00:00.072) 0:02:06.369 ******** 2025-03-22 23:20:00.231276 | orchestrator | 2025-03-22 23:20:00.231288 | orchestrator | TASK [common : Flush handlers] ************************************************* 2025-03-22 23:20:00.231300 | orchestrator | Saturday 22 March 2025 23:19:09 +0000 (0:00:00.072) 0:02:06.442 ******** 2025-03-22 23:20:00.231313 | orchestrator | 2025-03-22 23:20:00.231325 | orchestrator | RUNNING HANDLER [common : Restart fluentd container] *************************** 2025-03-22 23:20:00.231337 | orchestrator | Saturday 22 March 2025 23:19:09 +0000 (0:00:00.078) 0:02:06.520 ******** 2025-03-22 23:20:00.231350 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:20:00.231367 | orchestrator | changed: [testbed-manager] 2025-03-22 23:20:00.231380 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:20:00.231392 | orchestrator | changed: [testbed-node-3] 2025-03-22 23:20:00.231405 | orchestrator | changed: [testbed-node-5] 2025-03-22 23:20:00.231417 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:20:00.231429 | orchestrator | changed: [testbed-node-4] 2025-03-22 23:20:00.231442 | orchestrator | 2025-03-22 23:20:00.231454 | orchestrator | RUNNING HANDLER [common : Restart kolla-toolbox container] ********************* 2025-03-22 23:20:00.231467 | orchestrator | Saturday 22 March 2025 23:19:20 
+0000 (0:00:10.663) 0:02:17.184 ******** 2025-03-22 23:20:00.231479 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:20:00.231491 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:20:00.231528 | orchestrator | changed: [testbed-node-3] 2025-03-22 23:20:00.231542 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:20:00.231554 | orchestrator | changed: [testbed-node-5] 2025-03-22 23:20:00.231567 | orchestrator | changed: [testbed-node-4] 2025-03-22 23:20:00.231579 | orchestrator | changed: [testbed-manager] 2025-03-22 23:20:00.231591 | orchestrator | 2025-03-22 23:20:00.231604 | orchestrator | RUNNING HANDLER [common : Initializing toolbox container using normal user] **** 2025-03-22 23:20:00.231617 | orchestrator | Saturday 22 March 2025 23:19:44 +0000 (0:00:24.128) 0:02:41.312 ******** 2025-03-22 23:20:00.231629 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:20:00.231642 | orchestrator | ok: [testbed-manager] 2025-03-22 23:20:00.231654 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:20:00.231667 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:20:00.231679 | orchestrator | ok: [testbed-node-3] 2025-03-22 23:20:00.231691 | orchestrator | ok: [testbed-node-4] 2025-03-22 23:20:00.231704 | orchestrator | ok: [testbed-node-5] 2025-03-22 23:20:00.231716 | orchestrator | 2025-03-22 23:20:00.231729 | orchestrator | RUNNING HANDLER [common : Restart cron container] ****************************** 2025-03-22 23:20:00.231741 | orchestrator | Saturday 22 March 2025 23:19:47 +0000 (0:00:02.435) 0:02:43.748 ******** 2025-03-22 23:20:00.231754 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:20:00.231767 | orchestrator | changed: [testbed-node-3] 2025-03-22 23:20:00.231779 | orchestrator | changed: [testbed-manager] 2025-03-22 23:20:00.231791 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:20:00.231804 | orchestrator | changed: [testbed-node-4] 2025-03-22 23:20:00.231822 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:20:00.231834 | orchestrator | changed: [testbed-node-5] 2025-03-22 23:20:00.231847 | orchestrator | 2025-03-22 23:20:00.231859 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 23:20:00.231872 | orchestrator | testbed-manager : ok=25  changed=15  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-03-22 23:20:00.231886 | orchestrator | testbed-node-0 : ok=21  changed=14  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-03-22 23:20:00.231898 | orchestrator | testbed-node-1 : ok=21  changed=15  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-03-22 23:20:00.231911 | orchestrator | testbed-node-2 : ok=21  changed=15  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-03-22 23:20:00.231923 | orchestrator | testbed-node-3 : ok=21  changed=15  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-03-22 23:20:00.231936 | orchestrator | testbed-node-4 : ok=21  changed=15  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-03-22 23:20:00.231948 | orchestrator | testbed-node-5 : ok=21  changed=15  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-03-22 23:20:00.231960 | orchestrator | 2025-03-22 23:20:00.231973 | orchestrator | 2025-03-22 23:20:00.231985 | orchestrator | TASKS RECAP ******************************************************************** 2025-03-22 23:20:00.231997 | orchestrator | Saturday 22 March 2025 23:19:58 +0000 (0:00:11.278) 0:02:55.026 ******** 2025-03-22 23:20:00.232010 | orchestrator | 
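The play above brings up the three baseline "common" containers on every host — fluentd, kolla_toolbox and cron — and the recap confirms that no host failed. As a quick cross-check on any individual node, the Docker daemon can be queried directly; the following is only an illustrative sketch (it assumes the Python docker SDK is installed and the local Docker socket is reachable, and it is not part of the deployment tooling shown in this log):

    # Minimal sketch, assuming the Python "docker" SDK is available and the local
    # Docker socket is reachable. Checks that the common containers named in the
    # play output above are running; everything beyond those names is illustrative.
    import docker

    EXPECTED = {"fluentd", "kolla_toolbox", "cron"}

    client = docker.from_env()
    running = {c.name for c in client.containers.list(filters={"status": "running"})}

    missing = EXPECTED - running
    if missing:
        raise SystemExit(f"common containers not running: {sorted(missing)}")
    print("all common containers are up:", ", ".join(sorted(EXPECTED)))
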
=============================================================================== 2025-03-22 23:20:00.232022 | orchestrator | common : Ensure fluentd image is present for label check --------------- 31.85s 2025-03-22 23:20:00.232034 | orchestrator | common : Restart kolla-toolbox container ------------------------------- 24.13s 2025-03-22 23:20:00.232047 | orchestrator | common : Restart cron container ---------------------------------------- 11.28s 2025-03-22 23:20:00.232059 | orchestrator | common : Restart fluentd container ------------------------------------- 10.66s 2025-03-22 23:20:00.232076 | orchestrator | common : Copying over config.json files for services -------------------- 9.80s 2025-03-22 23:20:00.232089 | orchestrator | common : Copying over td-agent.conf ------------------------------------- 9.12s 2025-03-22 23:20:00.232102 | orchestrator | service-cert-copy : common | Copying over extra CA certificates --------- 7.35s 2025-03-22 23:20:00.232114 | orchestrator | common : Copying over cron logrotate config file ------------------------ 6.02s 2025-03-22 23:20:00.232126 | orchestrator | common : Ensuring config directories exist ------------------------------ 5.61s 2025-03-22 23:20:00.232138 | orchestrator | common : Check common containers ---------------------------------------- 5.13s 2025-03-22 23:20:00.232151 | orchestrator | common : Fetch fluentd Docker image labels ------------------------------ 5.11s 2025-03-22 23:20:00.232163 | orchestrator | service-cert-copy : common | Copying over backend internal TLS certificate --- 3.93s 2025-03-22 23:20:00.232175 | orchestrator | common : Copy rabbitmq-env.conf to kolla toolbox ------------------------ 3.60s 2025-03-22 23:20:00.232188 | orchestrator | common : Ensuring config directories have correct owner and permission --- 3.53s 2025-03-22 23:20:00.232205 | orchestrator | common : Ensure RabbitMQ Erlang cookie exists --------------------------- 3.51s 2025-03-22 23:20:00.232324 | orchestrator | service-cert-copy : common | Copying over backend internal TLS key ------ 3.50s 2025-03-22 23:20:00.232340 | orchestrator | common : Copy rabbitmq erl_inetrc to kolla toolbox ---------------------- 3.15s 2025-03-22 23:20:00.232353 | orchestrator | common : Creating log volume -------------------------------------------- 2.73s 2025-03-22 23:20:00.232365 | orchestrator | common : include_tasks -------------------------------------------------- 2.60s 2025-03-22 23:20:00.232385 | orchestrator | common : Initializing toolbox container using normal user --------------- 2.44s 2025-03-22 23:20:00.232402 | orchestrator | 2025-03-22 23:20:00 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:20:00.235368 | orchestrator | 2025-03-22 23:20:00 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:20:00.238593 | orchestrator | 2025-03-22 23:20:00 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:20:03.300457 | orchestrator | 2025-03-22 23:20:00 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:20:03.300631 | orchestrator | 2025-03-22 23:20:03 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:20:03.304591 | orchestrator | 2025-03-22 23:20:03 | INFO  | Task 8b38a85e-a7dc-4cb7-9ade-ba633273fdc2 is in state STARTED 2025-03-22 23:20:03.305230 | orchestrator | 2025-03-22 23:20:03 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:20:03.306170 | orchestrator | 2025-03-22 
23:20:03 | INFO  | Task 49e89ba8-daf8-4d1b-aaf3-026339980caf is in state STARTED 2025-03-22 23:20:03.307205 | orchestrator | 2025-03-22 23:20:03 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:20:03.308180 | orchestrator | 2025-03-22 23:20:03 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:20:03.308291 | orchestrator | 2025-03-22 23:20:03 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:20:06.367580 | orchestrator | 2025-03-22 23:20:06 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:20:06.372779 | orchestrator | 2025-03-22 23:20:06 | INFO  | Task 8b38a85e-a7dc-4cb7-9ade-ba633273fdc2 is in state STARTED 2025-03-22 23:20:06.373830 | orchestrator | 2025-03-22 23:20:06 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:20:06.375013 | orchestrator | 2025-03-22 23:20:06 | INFO  | Task 49e89ba8-daf8-4d1b-aaf3-026339980caf is in state STARTED 2025-03-22 23:20:06.376161 | orchestrator | 2025-03-22 23:20:06 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:20:06.376909 | orchestrator | 2025-03-22 23:20:06 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:20:09.429676 | orchestrator | 2025-03-22 23:20:06 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:20:09.429807 | orchestrator | 2025-03-22 23:20:09 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:20:09.430195 | orchestrator | 2025-03-22 23:20:09 | INFO  | Task 8b38a85e-a7dc-4cb7-9ade-ba633273fdc2 is in state STARTED 2025-03-22 23:20:09.433120 | orchestrator | 2025-03-22 23:20:09 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:20:09.437026 | orchestrator | 2025-03-22 23:20:09 | INFO  | Task 49e89ba8-daf8-4d1b-aaf3-026339980caf is in state STARTED 2025-03-22 23:20:09.437639 | orchestrator | 2025-03-22 23:20:09 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:20:09.437669 | orchestrator | 2025-03-22 23:20:09 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:20:12.489909 | orchestrator | 2025-03-22 23:20:09 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:20:12.490191 | orchestrator | 2025-03-22 23:20:12 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:20:12.490283 | orchestrator | 2025-03-22 23:20:12 | INFO  | Task 8b38a85e-a7dc-4cb7-9ade-ba633273fdc2 is in state STARTED 2025-03-22 23:20:12.491188 | orchestrator | 2025-03-22 23:20:12 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:20:12.494294 | orchestrator | 2025-03-22 23:20:12 | INFO  | Task 49e89ba8-daf8-4d1b-aaf3-026339980caf is in state STARTED 2025-03-22 23:20:15.551938 | orchestrator | 2025-03-22 23:20:12 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:20:15.552049 | orchestrator | 2025-03-22 23:20:12 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:20:15.552068 | orchestrator | 2025-03-22 23:20:12 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:20:15.552101 | orchestrator | 2025-03-22 23:20:15 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:20:15.552821 | orchestrator | 2025-03-22 23:20:15 | INFO  | Task 8b38a85e-a7dc-4cb7-9ade-ba633273fdc2 is in state STARTED 2025-03-22 23:20:15.553608 | 
orchestrator | 2025-03-22 23:20:15 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:20:15.554981 | orchestrator | 2025-03-22 23:20:15 | INFO  | Task 49e89ba8-daf8-4d1b-aaf3-026339980caf is in state STARTED 2025-03-22 23:20:15.555660 | orchestrator | 2025-03-22 23:20:15 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:20:15.557789 | orchestrator | 2025-03-22 23:20:15 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:20:18.590677 | orchestrator | 2025-03-22 23:20:15 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:20:18.590798 | orchestrator | 2025-03-22 23:20:18 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:20:18.596799 | orchestrator | 2025-03-22 23:20:18 | INFO  | Task 8b38a85e-a7dc-4cb7-9ade-ba633273fdc2 is in state STARTED 2025-03-22 23:20:18.598310 | orchestrator | 2025-03-22 23:20:18 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:20:18.600010 | orchestrator | 2025-03-22 23:20:18 | INFO  | Task 49e89ba8-daf8-4d1b-aaf3-026339980caf is in state STARTED 2025-03-22 23:20:18.603443 | orchestrator | 2025-03-22 23:20:18 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:20:18.610695 | orchestrator | 2025-03-22 23:20:18 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:20:21.661129 | orchestrator | 2025-03-22 23:20:18 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:20:21.661259 | orchestrator | 2025-03-22 23:20:21 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:20:21.661944 | orchestrator | 2025-03-22 23:20:21 | INFO  | Task 8b38a85e-a7dc-4cb7-9ade-ba633273fdc2 is in state STARTED 2025-03-22 23:20:21.662257 | orchestrator | 2025-03-22 23:20:21 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:20:21.663633 | orchestrator | 2025-03-22 23:20:21 | INFO  | Task 49e89ba8-daf8-4d1b-aaf3-026339980caf is in state STARTED 2025-03-22 23:20:21.664817 | orchestrator | 2025-03-22 23:20:21 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:20:21.668167 | orchestrator | 2025-03-22 23:20:21 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:20:24.731697 | orchestrator | 2025-03-22 23:20:21 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:20:24.731837 | orchestrator | 2025-03-22 23:20:24 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:20:24.733784 | orchestrator | 2025-03-22 23:20:24 | INFO  | Task 8b38a85e-a7dc-4cb7-9ade-ba633273fdc2 is in state STARTED 2025-03-22 23:20:24.734422 | orchestrator | 2025-03-22 23:20:24 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:20:24.735905 | orchestrator | 2025-03-22 23:20:24 | INFO  | Task 49e89ba8-daf8-4d1b-aaf3-026339980caf is in state STARTED 2025-03-22 23:20:24.742376 | orchestrator | 2025-03-22 23:20:24 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:20:24.745925 | orchestrator | 2025-03-22 23:20:24 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:20:27.830977 | orchestrator | 2025-03-22 23:20:24 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:20:27.831097 | orchestrator | 2025-03-22 23:20:27 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 
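The interleaved "Task … is in state STARTED" / "Wait 1 second(s) until the next check" lines are the OSISM wrapper waiting on the deploy tasks it queued and re-reading their state until each one reports SUCCESS. The underlying pattern is a plain polling loop; the sketch below is a generic illustration of that idea, not the actual OSISM client code — get_task_state is a hypothetical stand-in for whatever call returns the state string for a task id:

    # Illustrative poll loop only. `get_task_state` is a hypothetical callable
    # returning "STARTED", "SUCCESS", etc. for a task id; the interval mirrors
    # the one-second wait printed in the log above.
    import time

    def wait_for_tasks(task_ids, get_task_state, interval=1.0):
        pending = set(task_ids)
        while pending:
            for task_id in sorted(pending):
                state = get_task_state(task_id)
                print(f"Task {task_id} is in state {state}")
                if state == "SUCCESS":
                    pending.discard(task_id)
            if pending:
                print(f"Wait {int(interval)} second(s) until the next check")
                time.sleep(interval)
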
2025-03-22 23:20:27.832002 | orchestrator | 2025-03-22 23:20:27 | INFO  | Task 8b38a85e-a7dc-4cb7-9ade-ba633273fdc2 is in state SUCCESS 2025-03-22 23:20:27.835617 | orchestrator | 2025-03-22 23:20:27 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:20:27.836796 | orchestrator | 2025-03-22 23:20:27 | INFO  | Task 49e89ba8-daf8-4d1b-aaf3-026339980caf is in state STARTED 2025-03-22 23:20:27.838863 | orchestrator | 2025-03-22 23:20:27 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:20:27.840802 | orchestrator | 2025-03-22 23:20:27 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:20:30.912365 | orchestrator | 2025-03-22 23:20:27 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:20:30.912544 | orchestrator | 2025-03-22 23:20:30 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:20:30.917722 | orchestrator | 2025-03-22 23:20:30 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:20:30.919736 | orchestrator | 2025-03-22 23:20:30 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:20:30.922154 | orchestrator | 2025-03-22 23:20:30 | INFO  | Task 49e89ba8-daf8-4d1b-aaf3-026339980caf is in state STARTED 2025-03-22 23:20:30.922631 | orchestrator | 2025-03-22 23:20:30 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:20:30.922666 | orchestrator | 2025-03-22 23:20:30 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:20:33.972976 | orchestrator | 2025-03-22 23:20:30 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:20:33.973100 | orchestrator | 2025-03-22 23:20:33 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:20:33.975638 | orchestrator | 2025-03-22 23:20:33 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:20:33.975820 | orchestrator | 2025-03-22 23:20:33 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:20:33.977620 | orchestrator | 2025-03-22 23:20:33 | INFO  | Task 49e89ba8-daf8-4d1b-aaf3-026339980caf is in state STARTED 2025-03-22 23:20:33.979600 | orchestrator | 2025-03-22 23:20:33 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:20:33.981686 | orchestrator | 2025-03-22 23:20:33 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:20:37.068489 | orchestrator | 2025-03-22 23:20:33 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:20:37.068652 | orchestrator | 2025-03-22 23:20:37 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:20:37.069279 | orchestrator | 2025-03-22 23:20:37 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:20:37.069316 | orchestrator | 2025-03-22 23:20:37 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:20:37.070099 | orchestrator | 2025-03-22 23:20:37 | INFO  | Task 49e89ba8-daf8-4d1b-aaf3-026339980caf is in state STARTED 2025-03-22 23:20:37.071654 | orchestrator | 2025-03-22 23:20:37 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:20:37.075686 | orchestrator | 2025-03-22 23:20:37 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:20:40.120291 | orchestrator | 2025-03-22 23:20:37 | INFO  | Wait 1 second(s) 
until the next check 2025-03-22 23:20:40.120418 | orchestrator | 2025-03-22 23:20:40 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:20:40.120760 | orchestrator | 2025-03-22 23:20:40 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:20:40.120789 | orchestrator | 2025-03-22 23:20:40 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:20:40.120810 | orchestrator | 2025-03-22 23:20:40 | INFO  | Task 49e89ba8-daf8-4d1b-aaf3-026339980caf is in state STARTED 2025-03-22 23:20:40.121792 | orchestrator | 2025-03-22 23:20:40 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:20:40.122564 | orchestrator | 2025-03-22 23:20:40 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:20:40.122742 | orchestrator | 2025-03-22 23:20:40 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:20:43.183248 | orchestrator | 2025-03-22 23:20:43 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:20:43.184409 | orchestrator | 2025-03-22 23:20:43 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:20:43.187193 | orchestrator | 2025-03-22 23:20:43 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:20:43.188196 | orchestrator | 2025-03-22 23:20:43 | INFO  | Task 49e89ba8-daf8-4d1b-aaf3-026339980caf is in state STARTED 2025-03-22 23:20:43.188900 | orchestrator | 2025-03-22 23:20:43 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:20:43.190199 | orchestrator | 2025-03-22 23:20:43 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:20:46.255668 | orchestrator | 2025-03-22 23:20:43 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:20:46.255785 | orchestrator | 2025-03-22 23:20:46 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:20:46.256827 | orchestrator | 2025-03-22 23:20:46 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:20:46.258230 | orchestrator | 2025-03-22 23:20:46 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:20:46.259248 | orchestrator | 2025-03-22 23:20:46 | INFO  | Task 49e89ba8-daf8-4d1b-aaf3-026339980caf is in state STARTED 2025-03-22 23:20:46.260523 | orchestrator | 2025-03-22 23:20:46 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:20:46.261859 | orchestrator | 2025-03-22 23:20:46 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:20:49.321155 | orchestrator | 2025-03-22 23:20:46 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:20:49.321300 | orchestrator | 2025-03-22 23:20:49 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:20:49.322575 | orchestrator | 2025-03-22 23:20:49 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:20:49.322617 | orchestrator | 2025-03-22 23:20:49 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:20:49.325156 | orchestrator | 2025-03-22 23:20:49.325189 | orchestrator | 2025-03-22 23:20:49.325205 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-03-22 23:20:49.325219 | orchestrator | 2025-03-22 23:20:49.325233 | orchestrator | TASK [Group hosts based on Kolla action] 
*************************************** 2025-03-22 23:20:49.325248 | orchestrator | Saturday 22 March 2025 23:20:06 +0000 (0:00:00.504) 0:00:00.504 ******** 2025-03-22 23:20:49.325262 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:20:49.325278 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:20:49.325292 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:20:49.325306 | orchestrator | 2025-03-22 23:20:49.325320 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-03-22 23:20:49.325334 | orchestrator | Saturday 22 March 2025 23:20:07 +0000 (0:00:00.685) 0:00:01.190 ******** 2025-03-22 23:20:49.325349 | orchestrator | ok: [testbed-node-0] => (item=enable_memcached_True) 2025-03-22 23:20:49.325363 | orchestrator | ok: [testbed-node-1] => (item=enable_memcached_True) 2025-03-22 23:20:49.325377 | orchestrator | ok: [testbed-node-2] => (item=enable_memcached_True) 2025-03-22 23:20:49.325391 | orchestrator | 2025-03-22 23:20:49.325405 | orchestrator | PLAY [Apply role memcached] **************************************************** 2025-03-22 23:20:49.325419 | orchestrator | 2025-03-22 23:20:49.325433 | orchestrator | TASK [memcached : include_tasks] *********************************************** 2025-03-22 23:20:49.325447 | orchestrator | Saturday 22 March 2025 23:20:07 +0000 (0:00:00.722) 0:00:01.912 ******** 2025-03-22 23:20:49.325461 | orchestrator | included: /ansible/roles/memcached/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:20:49.325476 | orchestrator | 2025-03-22 23:20:49.325490 | orchestrator | TASK [memcached : Ensuring config directories exist] *************************** 2025-03-22 23:20:49.325526 | orchestrator | Saturday 22 March 2025 23:20:09 +0000 (0:00:01.339) 0:00:03.252 ******** 2025-03-22 23:20:49.325542 | orchestrator | changed: [testbed-node-2] => (item=memcached) 2025-03-22 23:20:49.325557 | orchestrator | changed: [testbed-node-1] => (item=memcached) 2025-03-22 23:20:49.325571 | orchestrator | changed: [testbed-node-0] => (item=memcached) 2025-03-22 23:20:49.325584 | orchestrator | 2025-03-22 23:20:49.325598 | orchestrator | TASK [memcached : Copying over config.json files for services] ***************** 2025-03-22 23:20:49.325612 | orchestrator | Saturday 22 March 2025 23:20:11 +0000 (0:00:01.866) 0:00:05.119 ******** 2025-03-22 23:20:49.325626 | orchestrator | changed: [testbed-node-2] => (item=memcached) 2025-03-22 23:20:49.325640 | orchestrator | changed: [testbed-node-0] => (item=memcached) 2025-03-22 23:20:49.325654 | orchestrator | changed: [testbed-node-1] => (item=memcached) 2025-03-22 23:20:49.325667 | orchestrator | 2025-03-22 23:20:49.325681 | orchestrator | TASK [memcached : Check memcached container] *********************************** 2025-03-22 23:20:49.325695 | orchestrator | Saturday 22 March 2025 23:20:13 +0000 (0:00:02.786) 0:00:07.906 ******** 2025-03-22 23:20:49.325709 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:20:49.325740 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:20:49.325754 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:20:49.325768 | orchestrator | 2025-03-22 23:20:49.325782 | orchestrator | RUNNING HANDLER [memcached : Restart memcached container] ********************** 2025-03-22 23:20:49.325796 | orchestrator | Saturday 22 March 2025 23:20:16 +0000 (0:00:02.893) 0:00:10.799 ******** 2025-03-22 23:20:49.325809 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:20:49.325823 | orchestrator | changed: 
[testbed-node-2] 2025-03-22 23:20:49.325837 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:20:49.325851 | orchestrator | 2025-03-22 23:20:49.325870 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 23:20:49.325884 | orchestrator | testbed-node-0 : ok=7  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:20:49.325900 | orchestrator | testbed-node-1 : ok=7  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:20:49.325928 | orchestrator | testbed-node-2 : ok=7  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:20:49.325942 | orchestrator | 2025-03-22 23:20:49.325956 | orchestrator | 2025-03-22 23:20:49.325970 | orchestrator | TASKS RECAP ******************************************************************** 2025-03-22 23:20:49.325984 | orchestrator | Saturday 22 March 2025 23:20:23 +0000 (0:00:07.021) 0:00:17.821 ******** 2025-03-22 23:20:49.325997 | orchestrator | =============================================================================== 2025-03-22 23:20:49.326011 | orchestrator | memcached : Restart memcached container --------------------------------- 7.03s 2025-03-22 23:20:49.326078 | orchestrator | memcached : Check memcached container ----------------------------------- 2.89s 2025-03-22 23:20:49.326093 | orchestrator | memcached : Copying over config.json files for services ----------------- 2.79s 2025-03-22 23:20:49.326107 | orchestrator | memcached : Ensuring config directories exist --------------------------- 1.87s 2025-03-22 23:20:49.326121 | orchestrator | memcached : include_tasks ----------------------------------------------- 1.34s 2025-03-22 23:20:49.326135 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.72s 2025-03-22 23:20:49.326148 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.69s 2025-03-22 23:20:49.326162 | orchestrator | 2025-03-22 23:20:49.326176 | orchestrator | 2025-03-22 23:20:49.326191 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-03-22 23:20:49.326204 | orchestrator | 2025-03-22 23:20:49.326218 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-03-22 23:20:49.326232 | orchestrator | Saturday 22 March 2025 23:20:05 +0000 (0:00:00.591) 0:00:00.591 ******** 2025-03-22 23:20:49.326246 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:20:49.326260 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:20:49.326274 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:20:49.326288 | orchestrator | 2025-03-22 23:20:49.326302 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-03-22 23:20:49.326326 | orchestrator | Saturday 22 March 2025 23:20:06 +0000 (0:00:00.782) 0:00:01.374 ******** 2025-03-22 23:20:49.326341 | orchestrator | ok: [testbed-node-0] => (item=enable_redis_True) 2025-03-22 23:20:49.326355 | orchestrator | ok: [testbed-node-1] => (item=enable_redis_True) 2025-03-22 23:20:49.326369 | orchestrator | ok: [testbed-node-2] => (item=enable_redis_True) 2025-03-22 23:20:49.326383 | orchestrator | 2025-03-22 23:20:49.326397 | orchestrator | PLAY [Apply role redis] ******************************************************** 2025-03-22 23:20:49.326411 | orchestrator | 2025-03-22 23:20:49.326424 | orchestrator | TASK [redis : include_tasks] 
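The memcached play above finishes with each of the three controller nodes reporting ok=7 changed=4 and the memcached container restarted. If a functional probe of the service itself is wanted, memcached answers a plain-text "version" command on its port; the snippet below is only a hedged example (the host address is a placeholder, 11211 is memcached's default port, and no such probe is part of the play shown here):

    # Hedged example: ask memcached for its version over a raw TCP socket.
    # HOST is a placeholder address; 11211 is the conventional memcached port.
    import socket

    HOST, PORT = "192.0.2.10", 11211

    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        sock.sendall(b"version\r\n")
        reply = sock.recv(1024).decode(errors="replace").strip()  # e.g. "VERSION 1.6.x"

    print(reply)
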
*************************************************** 2025-03-22 23:20:49.326439 | orchestrator | Saturday 22 March 2025 23:20:07 +0000 (0:00:00.839) 0:00:02.214 ******** 2025-03-22 23:20:49.326452 | orchestrator | included: /ansible/roles/redis/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:20:49.326466 | orchestrator | 2025-03-22 23:20:49.326480 | orchestrator | TASK [redis : Ensuring config directories exist] ******************************* 2025-03-22 23:20:49.326494 | orchestrator | Saturday 22 March 2025 23:20:08 +0000 (0:00:01.089) 0:00:03.303 ******** 2025-03-22 23:20:49.326527 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis:6.0.16.20241206', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-03-22 23:20:49.326549 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis:6.0.16.20241206', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-03-22 23:20:49.326573 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis:6.0.16.20241206', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-03-22 23:20:49.326589 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis-sentinel:6.0.16.20241206', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-03-22 23:20:49.326604 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis-sentinel:6.0.16.20241206', 'volumes': 
['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-03-22 23:20:49.326634 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis-sentinel:6.0.16.20241206', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-03-22 23:20:49.326649 | orchestrator | 2025-03-22 23:20:49.326664 | orchestrator | TASK [redis : Copying over default config.json files] ************************** 2025-03-22 23:20:49.326677 | orchestrator | Saturday 22 March 2025 23:20:11 +0000 (0:00:02.940) 0:00:06.244 ******** 2025-03-22 23:20:49.326691 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis:6.0.16.20241206', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-03-22 23:20:49.326706 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis:6.0.16.20241206', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-03-22 23:20:49.326727 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis:6.0.16.20241206', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-03-22 23:20:49.326742 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/redis-sentinel:6.0.16.20241206', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-03-22 23:20:49.326757 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis-sentinel:6.0.16.20241206', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-03-22 23:20:49.326781 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis-sentinel:6.0.16.20241206', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-03-22 23:20:49.326796 | orchestrator | 2025-03-22 23:20:49.326810 | orchestrator | TASK [redis : Copying over redis config files] ********************************* 2025-03-22 23:20:49.326824 | orchestrator | Saturday 22 March 2025 23:20:15 +0000 (0:00:03.949) 0:00:10.194 ******** 2025-03-22 23:20:49.326838 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis:6.0.16.20241206', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-03-22 23:20:49.326859 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis:6.0.16.20241206', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-03-22 23:20:49.326873 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 
'image': 'registry.osism.tech/kolla/release/redis:6.0.16.20241206', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-03-22 23:20:49.326888 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis-sentinel:6.0.16.20241206', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-03-22 23:20:49.326903 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis-sentinel:6.0.16.20241206', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-03-22 23:20:49.326924 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis-sentinel:6.0.16.20241206', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-03-22 23:20:49.326939 | orchestrator | 2025-03-22 23:20:49.326953 | orchestrator | TASK [redis : Check redis containers] ****************************************** 2025-03-22 23:20:49.326966 | orchestrator | Saturday 22 March 2025 23:20:20 +0000 (0:00:04.879) 0:00:15.073 ******** 2025-03-22 23:20:49.326980 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis:6.0.16.20241206', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-03-22 
23:20:49.327001 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis:6.0.16.20241206', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-03-22 23:20:49.327016 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis:6.0.16.20241206', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-03-22 23:20:49.327030 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis-sentinel:6.0.16.20241206', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-03-22 23:20:49.327045 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis-sentinel:6.0.16.20241206', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-03-22 23:20:49.327065 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/redis-sentinel:6.0.16.20241206', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-03-22 23:20:49.331202 | orchestrator | 2025-03-22 23:20:49.331312 | orchestrator | TASK [redis : Flush handlers] 
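The redis and redis-sentinel container definitions above carry healthchecks of the form healthcheck_listen redis-server 6379 and healthcheck_listen redis-sentinel 26379, which — going by the name — verify that the named process is listening on its port. A similar liveness probe can be made from outside the container with Redis' inline PING command; again a sketch only (placeholder host, standard ports, and it assumes no AUTH is configured):

    # Hedged sketch: send an inline "PING" to redis (6379) and sentinel (26379)
    # and expect "+PONG". HOST is a placeholder; with requirepass configured the
    # server would answer -NOAUTH instead.
    import socket

    HOST = "192.0.2.11"

    def ping(port: int) -> bool:
        with socket.create_connection((HOST, port), timeout=5) as sock:
            sock.sendall(b"PING\r\n")
            return sock.recv(64).startswith(b"+PONG")

    for port in (6379, 26379):
        print(port, "ok" if ping(port) else "unexpected reply")
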
************************************************** 2025-03-22 23:20:49.331334 | orchestrator | Saturday 22 March 2025 23:20:23 +0000 (0:00:02.742) 0:00:17.816 ******** 2025-03-22 23:20:49.331349 | orchestrator | 2025-03-22 23:20:49.331364 | orchestrator | TASK [redis : Flush handlers] ************************************************** 2025-03-22 23:20:49.331378 | orchestrator | Saturday 22 March 2025 23:20:23 +0000 (0:00:00.110) 0:00:17.926 ******** 2025-03-22 23:20:49.331434 | orchestrator | 2025-03-22 23:20:49.331449 | orchestrator | TASK [redis : Flush handlers] ************************************************** 2025-03-22 23:20:49.331463 | orchestrator | Saturday 22 March 2025 23:20:23 +0000 (0:00:00.134) 0:00:18.060 ******** 2025-03-22 23:20:49.331477 | orchestrator | 2025-03-22 23:20:49.331491 | orchestrator | RUNNING HANDLER [redis : Restart redis container] ****************************** 2025-03-22 23:20:49.331532 | orchestrator | Saturday 22 March 2025 23:20:23 +0000 (0:00:00.337) 0:00:18.398 ******** 2025-03-22 23:20:49.331548 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:20:49.331563 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:20:49.331577 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:20:49.331591 | orchestrator | 2025-03-22 23:20:49.331605 | orchestrator | RUNNING HANDLER [redis : Restart redis-sentinel container] ********************* 2025-03-22 23:20:49.331619 | orchestrator | Saturday 22 March 2025 23:20:35 +0000 (0:00:11.682) 0:00:30.080 ******** 2025-03-22 23:20:49.331633 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:20:49.331647 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:20:49.331661 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:20:49.331675 | orchestrator | 2025-03-22 23:20:49.331689 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 23:20:49.331704 | orchestrator | testbed-node-0 : ok=9  changed=6  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:20:49.331720 | orchestrator | testbed-node-1 : ok=9  changed=6  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:20:49.331734 | orchestrator | testbed-node-2 : ok=9  changed=6  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:20:49.331748 | orchestrator | 2025-03-22 23:20:49.331764 | orchestrator | 2025-03-22 23:20:49.331780 | orchestrator | TASKS RECAP ******************************************************************** 2025-03-22 23:20:49.331796 | orchestrator | Saturday 22 March 2025 23:20:45 +0000 (0:00:10.531) 0:00:40.612 ******** 2025-03-22 23:20:49.331811 | orchestrator | =============================================================================== 2025-03-22 23:20:49.331826 | orchestrator | redis : Restart redis container ---------------------------------------- 11.68s 2025-03-22 23:20:49.331841 | orchestrator | redis : Restart redis-sentinel container ------------------------------- 10.53s 2025-03-22 23:20:49.331856 | orchestrator | redis : Copying over redis config files --------------------------------- 4.88s 2025-03-22 23:20:49.331872 | orchestrator | redis : Copying over default config.json files -------------------------- 3.95s 2025-03-22 23:20:49.331887 | orchestrator | redis : Ensuring config directories exist ------------------------------- 2.94s 2025-03-22 23:20:49.331903 | orchestrator | redis : Check redis containers ------------------------------------------ 2.74s 2025-03-22 23:20:49.331929 | orchestrator | redis : 
include_tasks --------------------------------------------------- 1.09s 2025-03-22 23:20:49.331958 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.84s 2025-03-22 23:20:49.331981 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.78s 2025-03-22 23:20:49.332005 | orchestrator | redis : Flush handlers -------------------------------------------------- 0.58s 2025-03-22 23:20:49.332030 | orchestrator | 2025-03-22 23:20:49 | INFO  | Task 49e89ba8-daf8-4d1b-aaf3-026339980caf is in state SUCCESS 2025-03-22 23:20:49.332073 | orchestrator | 2025-03-22 23:20:49 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:20:49.333770 | orchestrator | 2025-03-22 23:20:49 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:20:52.389192 | orchestrator | 2025-03-22 23:20:49 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:20:52.389323 | orchestrator | 2025-03-22 23:20:52 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:20:52.394239 | orchestrator | 2025-03-22 23:20:52 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:20:52.397233 | orchestrator | 2025-03-22 23:20:52 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:20:52.398973 | orchestrator | 2025-03-22 23:20:52 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:20:52.403658 | orchestrator | 2025-03-22 23:20:52 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:20:55.482463 | orchestrator | 2025-03-22 23:20:52 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:20:55.482642 | orchestrator | 2025-03-22 23:20:55 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:20:55.485248 | orchestrator | 2025-03-22 23:20:55 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:20:55.486413 | orchestrator | 2025-03-22 23:20:55 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:20:55.491157 | orchestrator | 2025-03-22 23:20:55 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:20:55.491732 | orchestrator | 2025-03-22 23:20:55 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:20:55.494770 | orchestrator | 2025-03-22 23:20:55 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:20:58.556025 | orchestrator | 2025-03-22 23:20:58 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:20:58.556783 | orchestrator | 2025-03-22 23:20:58 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:20:58.559210 | orchestrator | 2025-03-22 23:20:58 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:20:58.559982 | orchestrator | 2025-03-22 23:20:58 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:20:58.560753 | orchestrator | 2025-03-22 23:20:58 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:21:01.598825 | orchestrator | 2025-03-22 23:20:58 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:21:01.598955 | orchestrator | 2025-03-22 23:21:01 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:21:01.607148 | orchestrator | 2025-03-22 23:21:01 | INFO  | Task 
9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:21:01.609537 | orchestrator | 2025-03-22 23:21:01 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:21:01.609569 | orchestrator | 2025-03-22 23:21:01 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:21:04.659163 | orchestrator | 2025-03-22 23:21:01 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:21:04.659271 | orchestrator | 2025-03-22 23:21:01 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:21:04.659306 | orchestrator | 2025-03-22 23:21:04 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:21:04.664384 | orchestrator | 2025-03-22 23:21:04 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:21:04.671494 | orchestrator | 2025-03-22 23:21:04 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:21:04.675694 | orchestrator | 2025-03-22 23:21:04 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:21:04.676867 | orchestrator | 2025-03-22 23:21:04 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:21:07.722879 | orchestrator | 2025-03-22 23:21:04 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:21:07.723051 | orchestrator | 2025-03-22 23:21:07 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:21:07.724078 | orchestrator | 2025-03-22 23:21:07 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:21:07.728351 | orchestrator | 2025-03-22 23:21:07 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:21:07.730563 | orchestrator | 2025-03-22 23:21:07 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:21:10.790371 | orchestrator | 2025-03-22 23:21:07 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:21:10.790477 | orchestrator | 2025-03-22 23:21:07 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:21:10.790537 | orchestrator | 2025-03-22 23:21:10 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:21:10.790949 | orchestrator | 2025-03-22 23:21:10 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:21:10.792160 | orchestrator | 2025-03-22 23:21:10 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:21:10.793303 | orchestrator | 2025-03-22 23:21:10 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:21:10.794894 | orchestrator | 2025-03-22 23:21:10 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:21:13.834942 | orchestrator | 2025-03-22 23:21:10 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:21:13.835112 | orchestrator | 2025-03-22 23:21:13 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:21:13.838001 | orchestrator | 2025-03-22 23:21:13 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:21:13.839217 | orchestrator | 2025-03-22 23:21:13 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:21:13.839255 | orchestrator | 2025-03-22 23:21:13 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:21:13.840643 | orchestrator | 2025-03-22 
23:21:13 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:21:16.893484 | orchestrator | 2025-03-22 23:21:13 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:21:16.893652 | orchestrator | 2025-03-22 23:21:16 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:21:16.894623 | orchestrator | 2025-03-22 23:21:16 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:21:16.896849 | orchestrator | 2025-03-22 23:21:16 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:21:16.898187 | orchestrator | 2025-03-22 23:21:16 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:21:16.901757 | orchestrator | 2025-03-22 23:21:16 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:21:16.902389 | orchestrator | 2025-03-22 23:21:16 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:21:19.976729 | orchestrator | 2025-03-22 23:21:19 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:21:19.978605 | orchestrator | 2025-03-22 23:21:19 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:21:19.981342 | orchestrator | 2025-03-22 23:21:19 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:21:19.984788 | orchestrator | 2025-03-22 23:21:19 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:21:19.987671 | orchestrator | 2025-03-22 23:21:19 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:21:23.046731 | orchestrator | 2025-03-22 23:21:19 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:21:23.046867 | orchestrator | 2025-03-22 23:21:23 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:21:23.048967 | orchestrator | 2025-03-22 23:21:23 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:21:23.050760 | orchestrator | 2025-03-22 23:21:23 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:21:23.062115 | orchestrator | 2025-03-22 23:21:23 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:21:23.064169 | orchestrator | 2025-03-22 23:21:23 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:21:26.123363 | orchestrator | 2025-03-22 23:21:23 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:21:26.123487 | orchestrator | 2025-03-22 23:21:26 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:21:26.123729 | orchestrator | 2025-03-22 23:21:26 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:21:26.125677 | orchestrator | 2025-03-22 23:21:26 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:21:26.126980 | orchestrator | 2025-03-22 23:21:26 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:21:26.128487 | orchestrator | 2025-03-22 23:21:26 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:21:26.128793 | orchestrator | 2025-03-22 23:21:26 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:21:29.191698 | orchestrator | 2025-03-22 23:21:29 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:21:29.197034 | orchestrator | 2025-03-22 
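The repeated "is in state STARTED" and "Wait 1 second(s) until the next check" lines in this stretch of the log come from a polling loop over the listed task IDs until each reaches a terminal state. The sketch below reproduces that pattern for illustration only; get_task_state is a hypothetical callback, not an actual osism API.

    # Illustrative wait-loop pattern matching the messages in this log.
    # get_task_state(task_id) is a placeholder for whatever returns the
    # current Celery-style state string (STARTED, SUCCESS, ...).
    import time

    def wait_for_tasks(task_ids, get_task_state, interval=1):
        pending = set(task_ids)
        while pending:
            for task_id in sorted(pending):
                state = get_task_state(task_id)
                print(f"Task {task_id} is in state {state}")
                if state in ("SUCCESS", "FAILURE"):
                    pending.discard(task_id)
            if pending:
                print(f"Wait {interval} second(s) until the next check")
                time.sleep(interval)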
23:21:29 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:21:29.197079 | orchestrator | 2025-03-22 23:21:29 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:21:29.197701 | orchestrator | 2025-03-22 23:21:29 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:21:29.201325 | orchestrator | 2025-03-22 23:21:29 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:21:32.252311 | orchestrator | 2025-03-22 23:21:29 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:21:32.252459 | orchestrator | 2025-03-22 23:21:32 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:21:32.254161 | orchestrator | 2025-03-22 23:21:32 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:21:32.254664 | orchestrator | 2025-03-22 23:21:32 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:21:32.258595 | orchestrator | 2025-03-22 23:21:32 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:21:32.258849 | orchestrator | 2025-03-22 23:21:32 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:21:35.308222 | orchestrator | 2025-03-22 23:21:32 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:21:35.308353 | orchestrator | 2025-03-22 23:21:35 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:21:35.309543 | orchestrator | 2025-03-22 23:21:35 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:21:35.310712 | orchestrator | 2025-03-22 23:21:35 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:21:35.310747 | orchestrator | 2025-03-22 23:21:35 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:21:35.311362 | orchestrator | 2025-03-22 23:21:35 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:21:38.354725 | orchestrator | 2025-03-22 23:21:35 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:21:38.354850 | orchestrator | 2025-03-22 23:21:38 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:21:38.356350 | orchestrator | 2025-03-22 23:21:38 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:21:38.358072 | orchestrator | 2025-03-22 23:21:38 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:21:38.359283 | orchestrator | 2025-03-22 23:21:38 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:21:38.362171 | orchestrator | 2025-03-22 23:21:38 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:21:41.411248 | orchestrator | 2025-03-22 23:21:38 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:21:41.411340 | orchestrator | 2025-03-22 23:21:41 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:21:41.413443 | orchestrator | 2025-03-22 23:21:41 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:21:41.416390 | orchestrator | 2025-03-22 23:21:41 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:21:41.418933 | orchestrator | 2025-03-22 23:21:41 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state STARTED 2025-03-22 23:21:41.421144 | 
orchestrator | 2025-03-22 23:21:41 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:21:41.421483 | orchestrator | 2025-03-22 23:21:41 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:21:44.477375 | orchestrator | 2025-03-22 23:21:44 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:21:44.484458 | orchestrator | 2025-03-22 23:21:44 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:21:44.485569 | orchestrator | 2025-03-22 23:21:44 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:21:44.485653 | orchestrator | 2025-03-22 23:21:44 | INFO  | Task 44796b49-f79e-4763-a794-74b3819747d6 is in state SUCCESS 2025-03-22 23:21:44.485682 | orchestrator | 2025-03-22 23:21:44.485698 | orchestrator | 2025-03-22 23:21:44.485713 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-03-22 23:21:44.485728 | orchestrator | 2025-03-22 23:21:44.485742 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-03-22 23:21:44.485756 | orchestrator | Saturday 22 March 2025 23:20:06 +0000 (0:00:00.554) 0:00:00.554 ******** 2025-03-22 23:21:44.485771 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:21:44.485792 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:21:44.485807 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:21:44.485831 | orchestrator | ok: [testbed-node-3] 2025-03-22 23:21:44.485846 | orchestrator | ok: [testbed-node-4] 2025-03-22 23:21:44.485860 | orchestrator | ok: [testbed-node-5] 2025-03-22 23:21:44.485874 | orchestrator | 2025-03-22 23:21:44.485888 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-03-22 23:21:44.485903 | orchestrator | Saturday 22 March 2025 23:20:07 +0000 (0:00:01.306) 0:00:01.862 ******** 2025-03-22 23:21:44.485917 | orchestrator | ok: [testbed-node-0] => (item=enable_openvswitch_True_enable_ovs_dpdk_False) 2025-03-22 23:21:44.485931 | orchestrator | ok: [testbed-node-1] => (item=enable_openvswitch_True_enable_ovs_dpdk_False) 2025-03-22 23:21:44.485970 | orchestrator | ok: [testbed-node-2] => (item=enable_openvswitch_True_enable_ovs_dpdk_False) 2025-03-22 23:21:44.485985 | orchestrator | ok: [testbed-node-3] => (item=enable_openvswitch_True_enable_ovs_dpdk_False) 2025-03-22 23:21:44.486001 | orchestrator | ok: [testbed-node-4] => (item=enable_openvswitch_True_enable_ovs_dpdk_False) 2025-03-22 23:21:44.486062 | orchestrator | ok: [testbed-node-5] => (item=enable_openvswitch_True_enable_ovs_dpdk_False) 2025-03-22 23:21:44.486081 | orchestrator | 2025-03-22 23:21:44.486098 | orchestrator | PLAY [Apply role openvswitch] ************************************************** 2025-03-22 23:21:44.486114 | orchestrator | 2025-03-22 23:21:44.486135 | orchestrator | TASK [openvswitch : include_tasks] ********************************************* 2025-03-22 23:21:44.486150 | orchestrator | Saturday 22 March 2025 23:20:09 +0000 (0:00:01.812) 0:00:03.675 ******** 2025-03-22 23:21:44.486166 | orchestrator | included: /ansible/roles/openvswitch/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-03-22 23:21:44.486183 | orchestrator | 2025-03-22 23:21:44.486198 | orchestrator | TASK [module-load : Load modules] ********************************************** 2025-03-22 23:21:44.486213 | orchestrator | Saturday 22 March 2025 
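The module-load tasks recorded just below load the openvswitch kernel module on every node and then persist it via modules-load.d. As a rough illustration of what that amounts to on a host, assuming the conventional modprobe plus systemd-modules-load drop-in mechanism rather than the role's actual implementation:

    # Sketch of the effect of the module-load tasks below: load the
    # openvswitch kernel module now and persist it across reboots with a
    # /etc/modules-load.d drop-in. This mirrors the standard mechanism;
    # it is not the kolla-ansible module-load role itself, and it needs
    # root privileges to run.
    import pathlib
    import subprocess

    def load_and_persist(module="openvswitch"):
        subprocess.run(["modprobe", module], check=True)        # load now
        conf = pathlib.Path(f"/etc/modules-load.d/{module}.conf")
        conf.write_text(f"{module}\n")                          # load on boot

    # e.g. load_and_persist()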
23:20:12 +0000 (0:00:03.451) 0:00:07.126 ******** 2025-03-22 23:21:44.486229 | orchestrator | changed: [testbed-node-0] => (item=openvswitch) 2025-03-22 23:21:44.486245 | orchestrator | changed: [testbed-node-1] => (item=openvswitch) 2025-03-22 23:21:44.486260 | orchestrator | changed: [testbed-node-2] => (item=openvswitch) 2025-03-22 23:21:44.486275 | orchestrator | changed: [testbed-node-3] => (item=openvswitch) 2025-03-22 23:21:44.486291 | orchestrator | changed: [testbed-node-4] => (item=openvswitch) 2025-03-22 23:21:44.486307 | orchestrator | changed: [testbed-node-5] => (item=openvswitch) 2025-03-22 23:21:44.486322 | orchestrator | 2025-03-22 23:21:44.486337 | orchestrator | TASK [module-load : Persist modules via modules-load.d] ************************ 2025-03-22 23:21:44.486353 | orchestrator | Saturday 22 March 2025 23:20:15 +0000 (0:00:02.510) 0:00:09.636 ******** 2025-03-22 23:21:44.486368 | orchestrator | changed: [testbed-node-2] => (item=openvswitch) 2025-03-22 23:21:44.486390 | orchestrator | changed: [testbed-node-0] => (item=openvswitch) 2025-03-22 23:21:44.486406 | orchestrator | changed: [testbed-node-1] => (item=openvswitch) 2025-03-22 23:21:44.486421 | orchestrator | changed: [testbed-node-3] => (item=openvswitch) 2025-03-22 23:21:44.486437 | orchestrator | changed: [testbed-node-4] => (item=openvswitch) 2025-03-22 23:21:44.486451 | orchestrator | changed: [testbed-node-5] => (item=openvswitch) 2025-03-22 23:21:44.486465 | orchestrator | 2025-03-22 23:21:44.486479 | orchestrator | TASK [module-load : Drop module persistence] *********************************** 2025-03-22 23:21:44.486493 | orchestrator | Saturday 22 March 2025 23:20:17 +0000 (0:00:02.479) 0:00:12.116 ******** 2025-03-22 23:21:44.486573 | orchestrator | skipping: [testbed-node-0] => (item=openvswitch)  2025-03-22 23:21:44.486594 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:21:44.486610 | orchestrator | skipping: [testbed-node-1] => (item=openvswitch)  2025-03-22 23:21:44.486624 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:21:44.486637 | orchestrator | skipping: [testbed-node-2] => (item=openvswitch)  2025-03-22 23:21:44.486651 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:21:44.486665 | orchestrator | skipping: [testbed-node-3] => (item=openvswitch)  2025-03-22 23:21:44.486678 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:21:44.486692 | orchestrator | skipping: [testbed-node-4] => (item=openvswitch)  2025-03-22 23:21:44.486706 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:21:44.486720 | orchestrator | skipping: [testbed-node-5] => (item=openvswitch)  2025-03-22 23:21:44.486733 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:21:44.486747 | orchestrator | 2025-03-22 23:21:44.486761 | orchestrator | TASK [openvswitch : Create /run/openvswitch directory on host] ***************** 2025-03-22 23:21:44.486775 | orchestrator | Saturday 22 March 2025 23:20:21 +0000 (0:00:03.859) 0:00:15.976 ******** 2025-03-22 23:21:44.486800 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:21:44.486814 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:21:44.486828 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:21:44.486842 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:21:44.486856 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:21:44.486869 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:21:44.486883 | orchestrator | 2025-03-22 23:21:44.486896 | orchestrator | TASK [openvswitch : Ensuring config 
directories exist] ************************* 2025-03-22 23:21:44.486910 | orchestrator | Saturday 22 March 2025 23:20:23 +0000 (0:00:01.378) 0:00:17.354 ******** 2025-03-22 23:21:44.486937 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-22 23:21:44.486959 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-22 23:21:44.486975 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-22 23:21:44.486991 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487006 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': 
['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487037 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487059 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487074 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487089 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487103 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 
'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487118 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487146 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487166 | orchestrator | 2025-03-22 23:21:44.487181 | orchestrator | TASK [openvswitch : Copying over config.json files for services] *************** 2025-03-22 23:21:44.487195 | orchestrator | Saturday 22 March 2025 23:20:28 +0000 (0:00:05.063) 0:00:22.418 ******** 2025-03-22 23:21:44.487210 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487225 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487239 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487254 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487275 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487324 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487342 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 
'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487357 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487371 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487393 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487424 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl 
version'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487440 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487455 | orchestrator | 2025-03-22 23:21:44.487470 | orchestrator | TASK [openvswitch : Copying over start-ovs file for openvswitch-vswitchd] ****** 2025-03-22 23:21:44.487484 | orchestrator | Saturday 22 March 2025 23:20:33 +0000 (0:00:05.839) 0:00:28.258 ******** 2025-03-22 23:21:44.487498 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:21:44.487539 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:21:44.487555 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:21:44.487569 | orchestrator | changed: [testbed-node-3] 2025-03-22 23:21:44.487583 | orchestrator | changed: [testbed-node-4] 2025-03-22 23:21:44.487597 | orchestrator | changed: [testbed-node-5] 2025-03-22 23:21:44.487611 | orchestrator | 2025-03-22 23:21:44.487625 | orchestrator | TASK [openvswitch : Copying over start-ovsdb-server files for openvswitch-db-server] *** 2025-03-22 23:21:44.487639 | orchestrator | Saturday 22 March 2025 23:20:39 +0000 (0:00:05.936) 0:00:34.194 ******** 2025-03-22 23:21:44.487653 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:21:44.487667 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:21:44.487680 | orchestrator | changed: [testbed-node-3] 2025-03-22 23:21:44.487694 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:21:44.487708 | orchestrator | changed: [testbed-node-4] 2025-03-22 23:21:44.487722 | orchestrator | changed: [testbed-node-5] 2025-03-22 23:21:44.487736 | orchestrator | 2025-03-22 23:21:44.487750 | orchestrator | TASK [openvswitch : Copying over ovs-vsctl wrapper] **************************** 2025-03-22 23:21:44.487764 | orchestrator | Saturday 22 March 2025 23:20:42 +0000 (0:00:02.847) 0:00:37.042 ******** 2025-03-22 23:21:44.487777 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:21:44.487806 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:21:44.487820 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:21:44.487834 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:21:44.487847 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:21:44.487861 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:21:44.487875 | orchestrator | 2025-03-22 23:21:44.487889 | orchestrator | TASK [openvswitch : Check openvswitch containers] ****************************** 2025-03-22 23:21:44.487903 | orchestrator | Saturday 22 March 2025 23:20:45 +0000 (0:00:02.366) 0:00:39.408 ******** 2025-03-22 23:21:44.487917 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': 
['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487933 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487954 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487980 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-22 23:21:44.487996 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-22 23:21:44.488023 | orchestrator | changed: [testbed-node-4] => (item={'key': 
'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-22 23:21:44.488048 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-22 23:21:44.488084 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-22 23:21:44.488166 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-22 23:21:44.488186 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/release/openvswitch-db-server:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 
'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-03-22 23:21:44.488201 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-22 23:21:44.488225 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/release/openvswitch-vswitchd:3.3.0.20241206', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-03-22 23:21:44.488240 | orchestrator | 2025-03-22 23:21:44.488254 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2025-03-22 23:21:44.488268 | orchestrator | Saturday 22 March 2025 23:20:48 +0000 (0:00:03.545) 0:00:42.954 ******** 2025-03-22 23:21:44.488282 | orchestrator | 2025-03-22 23:21:44.488296 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2025-03-22 23:21:44.488310 | orchestrator | Saturday 22 March 2025 23:20:48 +0000 (0:00:00.216) 0:00:43.171 ******** 2025-03-22 23:21:44.488324 | orchestrator | 2025-03-22 23:21:44.488338 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2025-03-22 23:21:44.488352 | orchestrator | Saturday 22 March 2025 23:20:49 +0000 (0:00:00.689) 0:00:43.860 ******** 2025-03-22 23:21:44.488365 | orchestrator | 2025-03-22 23:21:44.488379 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2025-03-22 23:21:44.488393 | orchestrator | Saturday 22 March 2025 23:20:49 +0000 (0:00:00.317) 0:00:44.178 ******** 2025-03-22 23:21:44.488407 | orchestrator | 2025-03-22 23:21:44.488421 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2025-03-22 23:21:44.488435 | orchestrator | Saturday 22 March 2025 23:20:50 +0000 (0:00:00.403) 0:00:44.582 ******** 2025-03-22 23:21:44.488449 | orchestrator | 2025-03-22 23:21:44.488468 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2025-03-22 23:21:44.488483 | orchestrator | Saturday 22 March 2025 23:20:50 +0000 (0:00:00.196) 0:00:44.778 ******** 2025-03-22 23:21:44.488497 | orchestrator | 2025-03-22 23:21:44.488552 | orchestrator | RUNNING HANDLER [openvswitch : Restart openvswitch-db-server container] ******** 2025-03-22 23:21:44.488569 | orchestrator | Saturday 22 March 2025 23:20:50 +0000 
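The openvswitch container definitions above use "ovsdb-client list-dbs" and "ovs-appctl version" as their healthcheck commands. The snippet below runs the same probes by hand through docker exec against the container names taken from the log; doing it this way is just one convenient manual spot-check, not part of the deployment itself.

    # Manual spot-check mirroring the healthchecks defined above.
    # Container names (openvswitch_db, openvswitch_vswitchd) come from
    # the log output; running the probes via "docker exec" is an
    # assumption about a handy way to verify the same thing by hand.
    import subprocess

    CHECKS = {
        "openvswitch_db": ["ovsdb-client", "list-dbs"],
        "openvswitch_vswitchd": ["ovs-appctl", "version"],
    }

    def probe(container, command):
        result = subprocess.run(
            ["docker", "exec", container, *command],
            capture_output=True, text=True,
        )
        status = "ok" if result.returncode == 0 else "failed"
        print(f"{container}: {status} ({' '.join(command)})")

    if __name__ == "__main__":
        for name, cmd in CHECKS.items():
            probe(name, cmd)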
(0:00:00.391) 0:00:45.170 ******** 2025-03-22 23:21:44.488583 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:21:44.488597 | orchestrator | changed: [testbed-node-3] 2025-03-22 23:21:44.488612 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:21:44.488626 | orchestrator | changed: [testbed-node-4] 2025-03-22 23:21:44.488640 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:21:44.488653 | orchestrator | changed: [testbed-node-5] 2025-03-22 23:21:44.488667 | orchestrator | 2025-03-22 23:21:44.488682 | orchestrator | RUNNING HANDLER [openvswitch : Waiting for openvswitch_db service to be ready] *** 2025-03-22 23:21:44.488696 | orchestrator | Saturday 22 March 2025 23:21:02 +0000 (0:00:11.670) 0:00:56.840 ******** 2025-03-22 23:21:44.488717 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:21:44.488732 | orchestrator | ok: [testbed-node-3] 2025-03-22 23:21:44.488746 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:21:44.488760 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:21:44.488774 | orchestrator | ok: [testbed-node-4] 2025-03-22 23:21:44.488787 | orchestrator | ok: [testbed-node-5] 2025-03-22 23:21:44.488813 | orchestrator | 2025-03-22 23:21:44.488827 | orchestrator | RUNNING HANDLER [openvswitch : Restart openvswitch-vswitchd container] ********* 2025-03-22 23:21:44.488841 | orchestrator | Saturday 22 March 2025 23:21:06 +0000 (0:00:03.547) 0:01:00.388 ******** 2025-03-22 23:21:44.488856 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:21:44.488869 | orchestrator | changed: [testbed-node-5] 2025-03-22 23:21:44.488883 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:21:44.488897 | orchestrator | changed: [testbed-node-4] 2025-03-22 23:21:44.488912 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:21:44.488934 | orchestrator | changed: [testbed-node-3] 2025-03-22 23:21:44.488950 | orchestrator | 2025-03-22 23:21:44.488964 | orchestrator | TASK [openvswitch : Set system-id, hostname and hw-offload] ******************** 2025-03-22 23:21:44.488979 | orchestrator | Saturday 22 March 2025 23:21:17 +0000 (0:00:11.286) 0:01:11.675 ******** 2025-03-22 23:21:44.488994 | orchestrator | changed: [testbed-node-0] => (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-0'}) 2025-03-22 23:21:44.489008 | orchestrator | changed: [testbed-node-1] => (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-1'}) 2025-03-22 23:21:44.489022 | orchestrator | changed: [testbed-node-2] => (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-2'}) 2025-03-22 23:21:44.489037 | orchestrator | changed: [testbed-node-3] => (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-3'}) 2025-03-22 23:21:44.489051 | orchestrator | changed: [testbed-node-4] => (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-4'}) 2025-03-22 23:21:44.489066 | orchestrator | changed: [testbed-node-5] => (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-5'}) 2025-03-22 23:21:44.489080 | orchestrator | changed: [testbed-node-0] => (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-0'}) 2025-03-22 23:21:44.489094 | orchestrator | changed: [testbed-node-1] => (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-1'}) 2025-03-22 23:21:44.489108 | orchestrator | changed: [testbed-node-2] => (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-2'}) 2025-03-22 23:21:44.489122 | orchestrator | changed: 
[testbed-node-3] => (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-3'}) 2025-03-22 23:21:44.489136 | orchestrator | changed: [testbed-node-4] => (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-4'}) 2025-03-22 23:21:44.489151 | orchestrator | changed: [testbed-node-5] => (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-5'}) 2025-03-22 23:21:44.489165 | orchestrator | ok: [testbed-node-0] => (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'}) 2025-03-22 23:21:44.489179 | orchestrator | ok: [testbed-node-1] => (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'}) 2025-03-22 23:21:44.489193 | orchestrator | ok: [testbed-node-2] => (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'}) 2025-03-22 23:21:44.489208 | orchestrator | ok: [testbed-node-3] => (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'}) 2025-03-22 23:21:44.489222 | orchestrator | ok: [testbed-node-4] => (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'}) 2025-03-22 23:21:44.489236 | orchestrator | ok: [testbed-node-5] => (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'}) 2025-03-22 23:21:44.489251 | orchestrator | 2025-03-22 23:21:44.489265 | orchestrator | TASK [openvswitch : Ensuring OVS bridge is properly setup] ********************* 2025-03-22 23:21:44.489279 | orchestrator | Saturday 22 March 2025 23:21:26 +0000 (0:00:09.510) 0:01:21.185 ******** 2025-03-22 23:21:44.489293 | orchestrator | skipping: [testbed-node-3] => (item=br-ex)  2025-03-22 23:21:44.489308 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:21:44.489323 | orchestrator | skipping: [testbed-node-4] => (item=br-ex)  2025-03-22 23:21:44.489337 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:21:44.489359 | orchestrator | skipping: [testbed-node-5] => (item=br-ex)  2025-03-22 23:21:44.489373 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:21:44.489387 | orchestrator | changed: [testbed-node-0] => (item=br-ex) 2025-03-22 23:21:44.489401 | orchestrator | changed: [testbed-node-2] => (item=br-ex) 2025-03-22 23:21:44.489416 | orchestrator | changed: [testbed-node-1] => (item=br-ex) 2025-03-22 23:21:44.489430 | orchestrator | 2025-03-22 23:21:44.489444 | orchestrator | TASK [openvswitch : Ensuring OVS ports are properly setup] ********************* 2025-03-22 23:21:44.489458 | orchestrator | Saturday 22 March 2025 23:21:29 +0000 (0:00:02.779) 0:01:23.965 ******** 2025-03-22 23:21:44.489472 | orchestrator | skipping: [testbed-node-3] => (item=['br-ex', 'vxlan0'])  2025-03-22 23:21:44.489486 | orchestrator | skipping: [testbed-node-3] 2025-03-22 23:21:44.489501 | orchestrator | skipping: [testbed-node-4] => (item=['br-ex', 'vxlan0'])  2025-03-22 23:21:44.489539 | orchestrator | skipping: [testbed-node-4] 2025-03-22 23:21:44.489555 | orchestrator | skipping: [testbed-node-5] => (item=['br-ex', 'vxlan0'])  2025-03-22 23:21:44.489568 | orchestrator | skipping: [testbed-node-5] 2025-03-22 23:21:44.489583 | orchestrator | changed: [testbed-node-0] => (item=['br-ex', 'vxlan0']) 2025-03-22 23:21:44.489604 | orchestrator | changed: [testbed-node-1] => (item=['br-ex', 'vxlan0']) 2025-03-22 23:21:44.489717 | orchestrator | changed: [testbed-node-2] => (item=['br-ex', 'vxlan0']) 2025-03-22 23:21:44.489735 | orchestrator | 2025-03-22 23:21:44.489749 | orchestrator | RUNNING 
HANDLER [openvswitch : Restart openvswitch-vswitchd container] ********* 2025-03-22 23:21:44.489763 | orchestrator | Saturday 22 March 2025 23:21:34 +0000 (0:00:04.888) 0:01:28.853 ******** 2025-03-22 23:21:44.489777 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:21:44.489790 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:21:44.489804 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:21:44.489819 | orchestrator | changed: [testbed-node-3] 2025-03-22 23:21:44.489833 | orchestrator | changed: [testbed-node-4] 2025-03-22 23:21:44.489847 | orchestrator | changed: [testbed-node-5] 2025-03-22 23:21:44.489861 | orchestrator | 2025-03-22 23:21:44.489875 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 23:21:44.489889 | orchestrator | testbed-node-0 : ok=17  changed=13  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-03-22 23:21:44.489905 | orchestrator | testbed-node-1 : ok=17  changed=13  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-03-22 23:21:44.489919 | orchestrator | testbed-node-2 : ok=17  changed=13  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-03-22 23:21:44.489934 | orchestrator | testbed-node-3 : ok=15  changed=11  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 2025-03-22 23:21:44.489947 | orchestrator | testbed-node-4 : ok=15  changed=11  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 2025-03-22 23:21:44.489968 | orchestrator | testbed-node-5 : ok=15  changed=11  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 2025-03-22 23:21:44.489982 | orchestrator | 2025-03-22 23:21:44.489996 | orchestrator | 2025-03-22 23:21:44.490010 | orchestrator | TASKS RECAP ******************************************************************** 2025-03-22 23:21:44.490056 | orchestrator | Saturday 22 March 2025 23:21:43 +0000 (0:00:09.119) 0:01:37.973 ******** 2025-03-22 23:21:44.490070 | orchestrator | =============================================================================== 2025-03-22 23:21:44.490084 | orchestrator | openvswitch : Restart openvswitch-vswitchd container ------------------- 20.41s 2025-03-22 23:21:44.490098 | orchestrator | openvswitch : Restart openvswitch-db-server container ------------------ 11.67s 2025-03-22 23:21:44.490112 | orchestrator | openvswitch : Set system-id, hostname and hw-offload -------------------- 9.51s 2025-03-22 23:21:44.490135 | orchestrator | openvswitch : Copying over start-ovs file for openvswitch-vswitchd ------ 5.94s 2025-03-22 23:21:44.490149 | orchestrator | openvswitch : Copying over config.json files for services --------------- 5.84s 2025-03-22 23:21:44.490163 | orchestrator | openvswitch : Ensuring config directories exist ------------------------- 5.06s 2025-03-22 23:21:44.490176 | orchestrator | openvswitch : Ensuring OVS ports are properly setup --------------------- 4.89s 2025-03-22 23:21:44.490190 | orchestrator | module-load : Drop module persistence ----------------------------------- 3.86s 2025-03-22 23:21:44.490204 | orchestrator | openvswitch : Waiting for openvswitch_db service to be ready ------------ 3.55s 2025-03-22 23:21:44.490218 | orchestrator | openvswitch : Check openvswitch containers ------------------------------ 3.55s 2025-03-22 23:21:44.490232 | orchestrator | openvswitch : include_tasks --------------------------------------------- 3.45s 2025-03-22 23:21:44.490246 | orchestrator | openvswitch : Copying over start-ovsdb-server files for openvswitch-db-server --- 2.85s 2025-03-22 
23:21:44.490265 | orchestrator | openvswitch : Ensuring OVS bridge is properly setup --------------------- 2.78s 2025-03-22 23:21:44.490279 | orchestrator | module-load : Load modules ---------------------------------------------- 2.51s 2025-03-22 23:21:44.490293 | orchestrator | module-load : Persist modules via modules-load.d ------------------------ 2.48s 2025-03-22 23:21:44.490307 | orchestrator | openvswitch : Copying over ovs-vsctl wrapper ---------------------------- 2.37s 2025-03-22 23:21:44.490321 | orchestrator | openvswitch : Flush Handlers -------------------------------------------- 2.22s 2025-03-22 23:21:44.490335 | orchestrator | Group hosts based on enabled services ----------------------------------- 1.81s 2025-03-22 23:21:44.490349 | orchestrator | openvswitch : Create /run/openvswitch directory on host ----------------- 1.38s 2025-03-22 23:21:44.490363 | orchestrator | Group hosts based on Kolla action --------------------------------------- 1.31s 2025-03-22 23:21:44.490383 | orchestrator | 2025-03-22 23:21:44 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:21:47.535095 | orchestrator | 2025-03-22 23:21:44 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:21:47.535223 | orchestrator | 2025-03-22 23:21:47 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:21:47.535669 | orchestrator | 2025-03-22 23:21:47 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:21:47.536926 | orchestrator | 2025-03-22 23:21:47 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:21:47.537344 | orchestrator | 2025-03-22 23:21:47 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:21:47.538578 | orchestrator | 2025-03-22 23:21:47 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:21:50.592312 | orchestrator | 2025-03-22 23:21:47 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:21:50.592450 | orchestrator | 2025-03-22 23:21:50 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:21:50.593400 | orchestrator | 2025-03-22 23:21:50 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:21:50.594289 | orchestrator | 2025-03-22 23:21:50 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:21:50.597304 | orchestrator | 2025-03-22 23:21:50 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:21:53.640721 | orchestrator | 2025-03-22 23:21:50 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:21:53.640835 | orchestrator | 2025-03-22 23:21:50 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:21:53.640870 | orchestrator | 2025-03-22 23:21:53 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:21:53.641115 | orchestrator | 2025-03-22 23:21:53 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:21:53.641973 | orchestrator | 2025-03-22 23:21:53 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:21:53.643141 | orchestrator | 2025-03-22 23:21:53 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:21:53.645286 | orchestrator | 2025-03-22 23:21:53 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:21:56.702952 | orchestrator | 2025-03-22 
23:21:53 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:21:56.703089 | orchestrator | 2025-03-22 23:21:56 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:21:56.706326 | orchestrator | 2025-03-22 23:21:56 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:21:56.710871 | orchestrator | 2025-03-22 23:21:56 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:21:56.711694 | orchestrator | 2025-03-22 23:21:56 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:21:56.714740 | orchestrator | 2025-03-22 23:21:56 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:21:59.771339 | orchestrator | 2025-03-22 23:21:56 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:21:59.771464 | orchestrator | 2025-03-22 23:21:59 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:21:59.774881 | orchestrator | 2025-03-22 23:21:59 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:21:59.774922 | orchestrator | 2025-03-22 23:21:59 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:21:59.778179 | orchestrator | 2025-03-22 23:21:59 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:21:59.780825 | orchestrator | 2025-03-22 23:21:59 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:22:02.830815 | orchestrator | 2025-03-22 23:21:59 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:22:02.830962 | orchestrator | 2025-03-22 23:22:02 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:22:02.831801 | orchestrator | 2025-03-22 23:22:02 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:22:02.832682 | orchestrator | 2025-03-22 23:22:02 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:22:02.832714 | orchestrator | 2025-03-22 23:22:02 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:22:02.833614 | orchestrator | 2025-03-22 23:22:02 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:22:05.885211 | orchestrator | 2025-03-22 23:22:02 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:22:05.885321 | orchestrator | 2025-03-22 23:22:05 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:22:05.885359 | orchestrator | 2025-03-22 23:22:05 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:22:05.886087 | orchestrator | 2025-03-22 23:22:05 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:22:05.887245 | orchestrator | 2025-03-22 23:22:05 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:22:05.887751 | orchestrator | 2025-03-22 23:22:05 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:22:05.887863 | orchestrator | 2025-03-22 23:22:05 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:22:08.950948 | orchestrator | 2025-03-22 23:22:08 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:22:08.951428 | orchestrator | 2025-03-22 23:22:08 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:22:08.951466 | orchestrator | 2025-03-22 
23:22:08 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:22:08.953615 | orchestrator | 2025-03-22 23:22:08 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:22:08.955542 | orchestrator | 2025-03-22 23:22:08 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:22:12.022900 | orchestrator | 2025-03-22 23:22:08 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:22:12.023027 | orchestrator | 2025-03-22 23:22:12 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:22:12.023996 | orchestrator | 2025-03-22 23:22:12 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:22:12.024029 | orchestrator | 2025-03-22 23:22:12 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:22:12.026200 | orchestrator | 2025-03-22 23:22:12 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:22:12.027444 | orchestrator | 2025-03-22 23:22:12 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:22:15.102111 | orchestrator | 2025-03-22 23:22:12 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:22:15.102259 | orchestrator | 2025-03-22 23:22:15 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:22:15.102410 | orchestrator | 2025-03-22 23:22:15 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:22:15.102434 | orchestrator | 2025-03-22 23:22:15 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:22:15.102449 | orchestrator | 2025-03-22 23:22:15 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:22:15.102463 | orchestrator | 2025-03-22 23:22:15 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:22:15.102484 | orchestrator | 2025-03-22 23:22:15 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:22:18.141204 | orchestrator | 2025-03-22 23:22:18 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:22:18.144571 | orchestrator | 2025-03-22 23:22:18 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:22:18.147753 | orchestrator | 2025-03-22 23:22:18 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:22:18.149806 | orchestrator | 2025-03-22 23:22:18 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:22:18.151624 | orchestrator | 2025-03-22 23:22:18 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:22:18.151839 | orchestrator | 2025-03-22 23:22:18 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:22:21.209072 | orchestrator | 2025-03-22 23:22:21 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:22:21.210081 | orchestrator | 2025-03-22 23:22:21 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:22:21.211416 | orchestrator | 2025-03-22 23:22:21 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:22:21.213818 | orchestrator | 2025-03-22 23:22:21 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:22:24.261862 | orchestrator | 2025-03-22 23:22:21 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:22:24.261996 | 
orchestrator | 2025-03-22 23:22:21 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:22:24.262081 | orchestrator | 2025-03-22 23:22:24 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:22:24.265046 | orchestrator | 2025-03-22 23:22:24 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:22:24.265198 | orchestrator | 2025-03-22 23:22:24 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:22:24.265236 | orchestrator | 2025-03-22 23:22:24 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:22:24.269775 | orchestrator | 2025-03-22 23:22:24 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:22:27.310632 | orchestrator | 2025-03-22 23:22:24 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:22:27.310761 | orchestrator | 2025-03-22 23:22:27 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:22:27.311570 | orchestrator | 2025-03-22 23:22:27 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:22:27.315952 | orchestrator | 2025-03-22 23:22:27 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:22:27.317666 | orchestrator | 2025-03-22 23:22:27 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:22:27.320275 | orchestrator | 2025-03-22 23:22:27 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:22:30.374163 | orchestrator | 2025-03-22 23:22:27 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:22:30.374293 | orchestrator | 2025-03-22 23:22:30 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:22:30.374765 | orchestrator | 2025-03-22 23:22:30 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:22:30.376009 | orchestrator | 2025-03-22 23:22:30 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:22:30.377102 | orchestrator | 2025-03-22 23:22:30 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:22:30.378150 | orchestrator | 2025-03-22 23:22:30 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:22:30.378228 | orchestrator | 2025-03-22 23:22:30 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:22:33.420747 | orchestrator | 2025-03-22 23:22:33 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:22:33.421259 | orchestrator | 2025-03-22 23:22:33 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:22:33.422225 | orchestrator | 2025-03-22 23:22:33 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:22:33.422992 | orchestrator | 2025-03-22 23:22:33 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:22:33.423562 | orchestrator | 2025-03-22 23:22:33 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:22:36.478352 | orchestrator | 2025-03-22 23:22:33 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:22:36.478475 | orchestrator | 2025-03-22 23:22:36 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:22:36.480133 | orchestrator | 2025-03-22 23:22:36 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:22:36.481908 | 
orchestrator | 2025-03-22 23:22:36 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:22:36.483405 | orchestrator | 2025-03-22 23:22:36 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:22:36.485390 | orchestrator | 2025-03-22 23:22:36 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:22:36.486101 | orchestrator | 2025-03-22 23:22:36 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:22:39.550161 | orchestrator | 2025-03-22 23:22:39 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:22:39.558779 | orchestrator | 2025-03-22 23:22:39 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:22:39.560749 | orchestrator | 2025-03-22 23:22:39 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:22:39.560785 | orchestrator | 2025-03-22 23:22:39 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:22:39.566559 | orchestrator | 2025-03-22 23:22:39 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:22:42.624877 | orchestrator | 2025-03-22 23:22:39 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:22:42.625015 | orchestrator | 2025-03-22 23:22:42 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:22:42.629246 | orchestrator | 2025-03-22 23:22:42 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:22:42.630668 | orchestrator | 2025-03-22 23:22:42 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:22:42.633672 | orchestrator | 2025-03-22 23:22:42 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:22:42.638737 | orchestrator | 2025-03-22 23:22:42 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:22:42.639998 | orchestrator | 2025-03-22 23:22:42 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:22:45.697273 | orchestrator | 2025-03-22 23:22:45 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:22:45.699777 | orchestrator | 2025-03-22 23:22:45 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:22:45.701189 | orchestrator | 2025-03-22 23:22:45 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:22:45.703098 | orchestrator | 2025-03-22 23:22:45 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:22:45.704456 | orchestrator | 2025-03-22 23:22:45 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:22:45.704665 | orchestrator | 2025-03-22 23:22:45 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:22:48.749837 | orchestrator | 2025-03-22 23:22:48 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:22:48.752323 | orchestrator | 2025-03-22 23:22:48 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:22:48.754369 | orchestrator | 2025-03-22 23:22:48 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:22:48.755999 | orchestrator | 2025-03-22 23:22:48 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:22:48.758574 | orchestrator | 2025-03-22 23:22:48 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 
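For reference, the "openvswitch : Set system-id, hostname and hw-offload" items logged above map to plain ovs-vsctl calls against the Open_vSwitch table. The following is a minimal, illustrative Ansible sketch of that effect for a single node; it is not the kolla-ansible implementation, and the ansible.builtin.command wrapper and task names are assumptions:

# Illustrative only: mirrors the logged loop items for testbed-node-0.
- name: Set OVS system-id and hostname in external_ids (sketch)
  ansible.builtin.command: >
    ovs-vsctl set Open_vSwitch . {{ item.col }}:{{ item.name }}={{ item.value }}
  loop:
    - { col: external_ids, name: system-id, value: testbed-node-0 }
    - { col: external_ids, name: hostname, value: testbed-node-0 }

- name: Drop the hw-offload flag (the log shows state=absent for this item)
  ansible.builtin.command: ovs-vsctl remove Open_vSwitch . other_config hw-offload

The loop items are copied from the log output; the other nodes run the same pattern with their own hostnames.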
2025-03-22 23:22:51.807443 | orchestrator | 2025-03-22 23:22:48 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:22:51.807597 | orchestrator | 2025-03-22 23:22:51 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:22:51.809316 | orchestrator | 2025-03-22 23:22:51 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:22:51.809847 | orchestrator | 2025-03-22 23:22:51 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:22:51.811095 | orchestrator | 2025-03-22 23:22:51 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:22:51.823390 | orchestrator | 2025-03-22 23:22:51 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:22:54.864075 | orchestrator | 2025-03-22 23:22:51 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:22:54.864211 | orchestrator | 2025-03-22 23:22:54 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:22:54.864878 | orchestrator | 2025-03-22 23:22:54 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:22:54.869227 | orchestrator | 2025-03-22 23:22:54 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:22:54.870119 | orchestrator | 2025-03-22 23:22:54 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:22:54.870599 | orchestrator | 2025-03-22 23:22:54 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:22:54.870780 | orchestrator | 2025-03-22 23:22:54 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:22:57.927508 | orchestrator | 2025-03-22 23:22:57 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:22:57.927962 | orchestrator | 2025-03-22 23:22:57 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:22:57.929338 | orchestrator | 2025-03-22 23:22:57 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:22:57.929945 | orchestrator | 2025-03-22 23:22:57 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:22:57.930882 | orchestrator | 2025-03-22 23:22:57 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:23:00.970480 | orchestrator | 2025-03-22 23:22:57 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:23:00.970657 | orchestrator | 2025-03-22 23:23:00 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:23:00.970994 | orchestrator | 2025-03-22 23:23:00 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:23:00.971799 | orchestrator | 2025-03-22 23:23:00 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:23:00.972570 | orchestrator | 2025-03-22 23:23:00 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:23:00.973968 | orchestrator | 2025-03-22 23:23:00 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:23:00.974447 | orchestrator | 2025-03-22 23:23:00 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:23:04.025855 | orchestrator | 2025-03-22 23:23:04 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:23:04.026627 | orchestrator | 2025-03-22 23:23:04 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 
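The "Ensuring OVS bridge is properly setup" / "Ensuring OVS ports are properly setup" tasks above change only the network nodes (testbed-node-0 to testbed-node-2), attaching vxlan0 to br-ex, while the compute nodes skip both items. A minimal sketch of the same effect, assuming plain ovs-vsctl via ansible.builtin.command rather than the role's actual module:

# Illustrative only: bridge/port creation as reported for br-ex / vxlan0.
- name: Ensure the external bridge br-ex exists (sketch)
  ansible.builtin.command: ovs-vsctl --may-exist add-br br-ex

- name: Ensure vxlan0 is attached to br-ex (sketch)
  ansible.builtin.command: ovs-vsctl --may-exist add-port br-ex vxlan0

The --may-exist flag makes both calls safe to repeat on hosts where the bridge and port already exist.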
2025-03-22 23:23:04.034206 | orchestrator | 2025-03-22 23:23:04 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:23:04.036327 | orchestrator | 2025-03-22 23:23:04 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:23:04.039997 | orchestrator | 2025-03-22 23:23:04 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:23:04.040243 | orchestrator | 2025-03-22 23:23:04 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:23:07.090397 | orchestrator | 2025-03-22 23:23:07 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:23:07.094638 | orchestrator | 2025-03-22 23:23:07 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:23:10.142750 | orchestrator | 2025-03-22 23:23:07 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:23:10.142870 | orchestrator | 2025-03-22 23:23:07 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:23:10.142905 | orchestrator | 2025-03-22 23:23:07 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:23:10.142928 | orchestrator | 2025-03-22 23:23:07 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:23:10.142962 | orchestrator | 2025-03-22 23:23:10 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:23:10.143047 | orchestrator | 2025-03-22 23:23:10 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:23:10.143941 | orchestrator | 2025-03-22 23:23:10 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state STARTED 2025-03-22 23:23:10.144749 | orchestrator | 2025-03-22 23:23:10 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:23:10.146717 | orchestrator | 2025-03-22 23:23:10 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:23:10.147133 | orchestrator | 2025-03-22 23:23:10 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:23:13.192357 | orchestrator | 2025-03-22 23:23:13.192478 | orchestrator | 2025-03-22 23:23:13.192500 | orchestrator | PLAY [Set kolla_action_rabbitmq] *********************************************** 2025-03-22 23:23:13.192570 | orchestrator | 2025-03-22 23:23:13.192589 | orchestrator | TASK [Inform the user about the following task] ******************************** 2025-03-22 23:23:13.192604 | orchestrator | Saturday 22 March 2025 23:20:38 +0000 (0:00:01.143) 0:00:01.143 ******** 2025-03-22 23:23:13.192618 | orchestrator | ok: [localhost] => { 2025-03-22 23:23:13.192635 | orchestrator |  "msg": "The task 'Check RabbitMQ service' fails if the RabbitMQ service has not yet been deployed. This is fine." 2025-03-22 23:23:13.192649 | orchestrator | } 2025-03-22 23:23:13.192664 | orchestrator | 2025-03-22 23:23:13.192678 | orchestrator | TASK [Check RabbitMQ service] ************************************************** 2025-03-22 23:23:13.192692 | orchestrator | Saturday 22 March 2025 23:20:38 +0000 (0:00:00.180) 0:00:01.323 ******** 2025-03-22 23:23:13.192708 | orchestrator | fatal: [localhost]: FAILED! 
=> {"changed": false, "elapsed": 2, "msg": "Timeout when waiting for search string RabbitMQ Management in 192.168.16.9:15672"} 2025-03-22 23:23:13.192822 | orchestrator | ...ignoring 2025-03-22 23:23:13.192839 | orchestrator | 2025-03-22 23:23:13.192854 | orchestrator | TASK [Set kolla_action_rabbitmq = upgrade if RabbitMQ is already running] ****** 2025-03-22 23:23:13.192868 | orchestrator | Saturday 22 March 2025 23:20:41 +0000 (0:00:02.860) 0:00:04.183 ******** 2025-03-22 23:23:13.192882 | orchestrator | skipping: [localhost] 2025-03-22 23:23:13.192896 | orchestrator | 2025-03-22 23:23:13.192913 | orchestrator | TASK [Set kolla_action_rabbitmq = kolla_action_ng] ***************************** 2025-03-22 23:23:13.192928 | orchestrator | Saturday 22 March 2025 23:20:41 +0000 (0:00:00.127) 0:00:04.310 ******** 2025-03-22 23:23:13.192943 | orchestrator | ok: [localhost] 2025-03-22 23:23:13.192959 | orchestrator | 2025-03-22 23:23:13.192974 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-03-22 23:23:13.192989 | orchestrator | 2025-03-22 23:23:13.193005 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-03-22 23:23:13.193020 | orchestrator | Saturday 22 March 2025 23:20:41 +0000 (0:00:00.164) 0:00:04.475 ******** 2025-03-22 23:23:13.193062 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:23:13.193078 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:23:13.193094 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:23:13.193109 | orchestrator | 2025-03-22 23:23:13.193124 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-03-22 23:23:13.193140 | orchestrator | Saturday 22 March 2025 23:20:42 +0000 (0:00:00.518) 0:00:04.994 ******** 2025-03-22 23:23:13.193155 | orchestrator | ok: [testbed-node-0] => (item=enable_rabbitmq_True) 2025-03-22 23:23:13.193179 | orchestrator | ok: [testbed-node-1] => (item=enable_rabbitmq_True) 2025-03-22 23:23:13.193203 | orchestrator | ok: [testbed-node-2] => (item=enable_rabbitmq_True) 2025-03-22 23:23:13.193225 | orchestrator | 2025-03-22 23:23:13.193249 | orchestrator | PLAY [Apply role rabbitmq] ***************************************************** 2025-03-22 23:23:13.193272 | orchestrator | 2025-03-22 23:23:13.193294 | orchestrator | TASK [rabbitmq : include_tasks] ************************************************ 2025-03-22 23:23:13.193317 | orchestrator | Saturday 22 March 2025 23:20:42 +0000 (0:00:00.729) 0:00:05.723 ******** 2025-03-22 23:23:13.193343 | orchestrator | included: /ansible/roles/rabbitmq/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:23:13.193367 | orchestrator | 2025-03-22 23:23:13.193391 | orchestrator | TASK [rabbitmq : Get container facts] ****************************************** 2025-03-22 23:23:13.193406 | orchestrator | Saturday 22 March 2025 23:20:45 +0000 (0:00:02.232) 0:00:07.956 ******** 2025-03-22 23:23:13.193420 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:23:13.193434 | orchestrator | 2025-03-22 23:23:13.193448 | orchestrator | TASK [rabbitmq : Get current RabbitMQ version] ********************************* 2025-03-22 23:23:13.193466 | orchestrator | Saturday 22 March 2025 23:20:47 +0000 (0:00:01.824) 0:00:09.780 ******** 2025-03-22 23:23:13.193491 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:23:13.193553 | orchestrator | 2025-03-22 23:23:13.193582 | orchestrator | TASK [rabbitmq : Get new RabbitMQ version] 
************************************* 2025-03-22 23:23:13.193606 | orchestrator | Saturday 22 March 2025 23:20:47 +0000 (0:00:00.662) 0:00:10.443 ******** 2025-03-22 23:23:13.193628 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:23:13.193643 | orchestrator | 2025-03-22 23:23:13.193657 | orchestrator | TASK [rabbitmq : Check if running RabbitMQ is at most one version behind] ****** 2025-03-22 23:23:13.193679 | orchestrator | Saturday 22 March 2025 23:20:48 +0000 (0:00:00.903) 0:00:11.346 ******** 2025-03-22 23:23:13.193694 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:23:13.193708 | orchestrator | 2025-03-22 23:23:13.193721 | orchestrator | TASK [rabbitmq : Catch when RabbitMQ is being downgraded] ********************** 2025-03-22 23:23:13.193735 | orchestrator | Saturday 22 March 2025 23:20:49 +0000 (0:00:00.698) 0:00:12.045 ******** 2025-03-22 23:23:13.193749 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:23:13.193763 | orchestrator | 2025-03-22 23:23:13.193777 | orchestrator | TASK [rabbitmq : include_tasks] ************************************************ 2025-03-22 23:23:13.193791 | orchestrator | Saturday 22 March 2025 23:20:50 +0000 (0:00:00.832) 0:00:12.878 ******** 2025-03-22 23:23:13.193815 | orchestrator | included: /ansible/roles/rabbitmq/tasks/remove-ha-all-policy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:23:13.193829 | orchestrator | 2025-03-22 23:23:13.193843 | orchestrator | TASK [rabbitmq : Get container facts] ****************************************** 2025-03-22 23:23:13.193857 | orchestrator | Saturday 22 March 2025 23:20:51 +0000 (0:00:01.752) 0:00:14.630 ******** 2025-03-22 23:23:13.193871 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:23:13.193885 | orchestrator | 2025-03-22 23:23:13.193898 | orchestrator | TASK [rabbitmq : List RabbitMQ policies] *************************************** 2025-03-22 23:23:13.193912 | orchestrator | Saturday 22 March 2025 23:20:53 +0000 (0:00:01.654) 0:00:16.284 ******** 2025-03-22 23:23:13.193926 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:23:13.193940 | orchestrator | 2025-03-22 23:23:13.193954 | orchestrator | TASK [rabbitmq : Remove ha-all policy from RabbitMQ] *************************** 2025-03-22 23:23:13.193968 | orchestrator | Saturday 22 March 2025 23:20:54 +0000 (0:00:00.859) 0:00:17.143 ******** 2025-03-22 23:23:13.193993 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:23:13.194007 | orchestrator | 2025-03-22 23:23:13.194100 | orchestrator | TASK [rabbitmq : Ensuring config directories exist] **************************** 2025-03-22 23:23:13.194206 | orchestrator | Saturday 22 March 2025 23:20:55 +0000 (0:00:01.072) 0:00:18.216 ******** 2025-03-22 23:23:13.194235 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-03-22 23:23:13.194267 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-03-22 23:23:13.194292 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-03-22 23:23:13.194307 | orchestrator | 2025-03-22 23:23:13.194322 | orchestrator | TASK [rabbitmq : Copying over config.json files for services] ****************** 2025-03-22 23:23:13.194336 | orchestrator | Saturday 22 March 2025 23:20:56 +0000 (0:00:01.274) 0:00:19.491 ******** 2025-03-22 23:23:13.194365 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-03-22 23:23:13.194394 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-03-22 23:23:13.194409 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-03-22 23:23:13.194424 | orchestrator | 2025-03-22 23:23:13.194438 | orchestrator | TASK [rabbitmq : Copying over rabbitmq-env.conf] ******************************* 2025-03-22 23:23:13.194452 | orchestrator | Saturday 22 March 2025 23:20:58 +0000 (0:00:01.843) 0:00:21.334 ******** 2025-03-22 23:23:13.194466 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/rabbitmq-env.conf.j2) 2025-03-22 23:23:13.194481 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/rabbitmq-env.conf.j2) 2025-03-22 23:23:13.194495 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/rabbitmq-env.conf.j2) 2025-03-22 23:23:13.194509 | orchestrator | 2025-03-22 23:23:13.194554 | orchestrator | TASK [rabbitmq : Copying over rabbitmq.conf] *********************************** 2025-03-22 23:23:13.194569 | orchestrator | Saturday 22 March 2025 23:21:01 +0000 (0:00:02.844) 0:00:24.179 ******** 
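The earlier "Check RabbitMQ service" failure ("Timeout when waiting for search string RabbitMQ Management in 192.168.16.9:15672") is the expected pre-check on a fresh deployment, as the preceding "Inform the user" task states: only if the management endpoint already answers is kolla_action_rabbitmq switched to upgrade. A minimal sketch of such a probe with ansible.builtin.wait_for; the task name, the registered variable, and the 2-second timeout (inferred from the logged elapsed value) are assumptions:

# Illustrative only: probe the RabbitMQ management endpoint seen in the log.
- name: Check whether the RabbitMQ management UI already answers (sketch)
  ansible.builtin.wait_for:
    host: 192.168.16.9              # address taken from the log output
    port: 15672                     # rabbitmq_management port from the service definition
    search_regex: RabbitMQ Management
    timeout: 2                      # assumption, based on "elapsed": 2 in the log
  register: rabbitmq_probe          # hypothetical variable name
  ignore_errors: true               # mirrors the "...ignoring" in the log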
2025-03-22 23:23:13.194583 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/rabbitmq.conf.j2) 2025-03-22 23:23:13.194606 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/rabbitmq.conf.j2) 2025-03-22 23:23:13.194620 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/rabbitmq.conf.j2) 2025-03-22 23:23:13.194634 | orchestrator | 2025-03-22 23:23:13.194648 | orchestrator | TASK [rabbitmq : Copying over erl_inetrc] ************************************** 2025-03-22 23:23:13.194662 | orchestrator | Saturday 22 March 2025 23:21:05 +0000 (0:00:04.461) 0:00:28.640 ******** 2025-03-22 23:23:13.194676 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/erl_inetrc.j2) 2025-03-22 23:23:13.194690 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/erl_inetrc.j2) 2025-03-22 23:23:13.194703 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/erl_inetrc.j2) 2025-03-22 23:23:13.194717 | orchestrator | 2025-03-22 23:23:13.194739 | orchestrator | TASK [rabbitmq : Copying over advanced.config] ********************************* 2025-03-22 23:23:13.194753 | orchestrator | Saturday 22 March 2025 23:21:08 +0000 (0:00:02.991) 0:00:31.632 ******** 2025-03-22 23:23:13.194768 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/advanced.config.j2) 2025-03-22 23:23:13.194785 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/advanced.config.j2) 2025-03-22 23:23:13.194808 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/advanced.config.j2) 2025-03-22 23:23:13.194831 | orchestrator | 2025-03-22 23:23:13.194853 | orchestrator | TASK [rabbitmq : Copying over definitions.json] ******************************** 2025-03-22 23:23:13.194877 | orchestrator | Saturday 22 March 2025 23:21:12 +0000 (0:00:03.712) 0:00:35.345 ******** 2025-03-22 23:23:13.194902 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/definitions.json.j2) 2025-03-22 23:23:13.194924 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/definitions.json.j2) 2025-03-22 23:23:13.194939 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/definitions.json.j2) 2025-03-22 23:23:13.194953 | orchestrator | 2025-03-22 23:23:13.194966 | orchestrator | TASK [rabbitmq : Copying over enabled_plugins] ********************************* 2025-03-22 23:23:13.194980 | orchestrator | Saturday 22 March 2025 23:21:14 +0000 (0:00:01.665) 0:00:37.011 ******** 2025-03-22 23:23:13.194994 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/enabled_plugins.j2) 2025-03-22 23:23:13.195008 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/enabled_plugins.j2) 2025-03-22 23:23:13.195022 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/enabled_plugins.j2) 2025-03-22 23:23:13.195035 | orchestrator | 2025-03-22 23:23:13.195049 | orchestrator | TASK [rabbitmq : include_tasks] ************************************************ 2025-03-22 23:23:13.195070 | orchestrator | Saturday 22 March 2025 23:21:16 +0000 (0:00:02.066) 0:00:39.077 ******** 2025-03-22 23:23:13.195085 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:23:13.195099 | orchestrator | skipping: [testbed-node-1] 
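The "Copying over ..." tasks above (rabbitmq-env.conf, rabbitmq.conf, erl_inetrc, advanced.config, definitions.json, enabled_plugins) all follow the same template-rendering pattern. A minimal sketch, under the assumption that the rendered files land in /etc/kolla/rabbitmq, the host path mapped read-only into the container according to the volume list in the log; ownership, permissions, and notify handlers are omitted:

# Illustrative only: render the RabbitMQ templates listed in the log.
- name: Render RabbitMQ configuration templates (sketch)
  ansible.builtin.template:
    src: "{{ item }}"
    dest: "/etc/kolla/rabbitmq/{{ item | basename | regex_replace('\\.j2$', '') }}"
  loop:
    - /ansible/roles/rabbitmq/templates/rabbitmq-env.conf.j2
    - /ansible/roles/rabbitmq/templates/rabbitmq.conf.j2
    - /ansible/roles/rabbitmq/templates/erl_inetrc.j2
    - /ansible/roles/rabbitmq/templates/advanced.config.j2
    - /ansible/roles/rabbitmq/templates/definitions.json.j2
    - /ansible/roles/rabbitmq/templates/enabled_plugins.j2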
2025-03-22 23:23:13.195113 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:23:13.195127 | orchestrator | 2025-03-22 23:23:13.195140 | orchestrator | TASK [rabbitmq : Check rabbitmq containers] ************************************ 2025-03-22 23:23:13.195154 | orchestrator | Saturday 22 March 2025 23:21:16 +0000 (0:00:00.694) 0:00:39.772 ******** 2025-03-22 23:23:13.195169 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-03-22 23:23:13.195194 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-03-22 23:23:13.195220 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 
'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-03-22 23:23:13.195236 | orchestrator | 2025-03-22 23:23:13.195250 | orchestrator | TASK [rabbitmq : Creating rabbitmq volume] ************************************* 2025-03-22 23:23:13.195264 | orchestrator | Saturday 22 March 2025 23:21:19 +0000 (0:00:02.170) 0:00:41.942 ******** 2025-03-22 23:23:13.195277 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:23:13.195291 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:23:13.195305 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:23:13.195319 | orchestrator | 2025-03-22 23:23:13.195333 | orchestrator | TASK [rabbitmq : Running RabbitMQ bootstrap container] ************************* 2025-03-22 23:23:13.195347 | orchestrator | Saturday 22 March 2025 23:21:20 +0000 (0:00:01.524) 0:00:43.467 ******** 2025-03-22 23:23:13.195360 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:23:13.195374 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:23:13.195388 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:23:13.195402 | orchestrator | 2025-03-22 23:23:13.195415 | orchestrator | RUNNING HANDLER [rabbitmq : Restart rabbitmq container] ************************ 2025-03-22 23:23:13.195429 | orchestrator | Saturday 22 March 2025 23:21:27 +0000 (0:00:06.681) 0:00:50.149 ******** 2025-03-22 23:23:13.195443 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:23:13.195457 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:23:13.195470 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:23:13.195485 | orchestrator | 2025-03-22 23:23:13.195505 | orchestrator | PLAY [Restart rabbitmq services] *********************************************** 2025-03-22 23:23:13.195580 | orchestrator | 2025-03-22 23:23:13.195597 | orchestrator | TASK [rabbitmq : Get info on RabbitMQ container] ******************************* 2025-03-22 23:23:13.195611 | orchestrator | Saturday 22 March 2025 23:21:27 +0000 (0:00:00.439) 0:00:50.589 ******** 2025-03-22 23:23:13.195624 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:23:13.195639 | orchestrator | 2025-03-22 23:23:13.195652 | orchestrator | TASK [rabbitmq : Put RabbitMQ node into maintenance mode] ********************** 2025-03-22 23:23:13.195666 | orchestrator | Saturday 22 March 2025 23:21:28 +0000 (0:00:00.851) 0:00:51.440 ******** 2025-03-22 23:23:13.195680 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:23:13.195694 | orchestrator | 2025-03-22 23:23:13.195708 | orchestrator | TASK [rabbitmq : Restart rabbitmq container] *********************************** 2025-03-22 23:23:13.195721 | orchestrator | Saturday 22 March 2025 23:21:28 +0000 (0:00:00.280) 0:00:51.721 ******** 2025-03-22 23:23:13.195735 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:23:13.195749 | orchestrator | 2025-03-22 23:23:13.195763 | orchestrator | TASK [rabbitmq : Waiting for rabbitmq to start] ******************************** 2025-03-22 23:23:13.195777 | orchestrator | Saturday 22 March 2025 23:21:31 +0000 (0:00:02.602) 0:00:54.324 ******** 2025-03-22 23:23:13.195790 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:23:13.195804 | orchestrator | 2025-03-22 23:23:13.195818 | orchestrator | PLAY [Restart rabbitmq services] *********************************************** 2025-03-22 23:23:13.195832 | orchestrator | 2025-03-22 23:23:13.195845 | orchestrator | TASK [rabbitmq : Get info on RabbitMQ container] ******************************* 2025-03-22 
23:23:13.195859 | orchestrator | Saturday 22 March 2025 23:22:27 +0000 (0:00:55.848) 0:01:50.172 ******** 2025-03-22 23:23:13.195873 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:23:13.195887 | orchestrator | 2025-03-22 23:23:13.195900 | orchestrator | TASK [rabbitmq : Put RabbitMQ node into maintenance mode] ********************** 2025-03-22 23:23:13.195914 | orchestrator | Saturday 22 March 2025 23:22:28 +0000 (0:00:00.607) 0:01:50.780 ******** 2025-03-22 23:23:13.195928 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:23:13.195941 | orchestrator | 2025-03-22 23:23:13.195955 | orchestrator | TASK [rabbitmq : Restart rabbitmq container] *********************************** 2025-03-22 23:23:13.195969 | orchestrator | Saturday 22 March 2025 23:22:28 +0000 (0:00:00.266) 0:01:51.047 ******** 2025-03-22 23:23:13.195983 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:23:13.195997 | orchestrator | 2025-03-22 23:23:13.196010 | orchestrator | TASK [rabbitmq : Waiting for rabbitmq to start] ******************************** 2025-03-22 23:23:13.196024 | orchestrator | Saturday 22 March 2025 23:22:35 +0000 (0:00:06.871) 0:01:57.919 ******** 2025-03-22 23:23:13.196038 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:23:13.196052 | orchestrator | 2025-03-22 23:23:13.196065 | orchestrator | PLAY [Restart rabbitmq services] *********************************************** 2025-03-22 23:23:13.196082 | orchestrator | 2025-03-22 23:23:13.196104 | orchestrator | TASK [rabbitmq : Get info on RabbitMQ container] ******************************* 2025-03-22 23:23:13.196125 | orchestrator | Saturday 22 March 2025 23:22:46 +0000 (0:00:11.377) 0:02:09.296 ******** 2025-03-22 23:23:13.196146 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:23:13.196164 | orchestrator | 2025-03-22 23:23:13.196183 | orchestrator | TASK [rabbitmq : Put RabbitMQ node into maintenance mode] ********************** 2025-03-22 23:23:13.196203 | orchestrator | Saturday 22 March 2025 23:22:47 +0000 (0:00:00.554) 0:02:09.851 ******** 2025-03-22 23:23:13.196222 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:23:13.196240 | orchestrator | 2025-03-22 23:23:13.196267 | orchestrator | TASK [rabbitmq : Restart rabbitmq container] *********************************** 2025-03-22 23:23:13.196297 | orchestrator | Saturday 22 March 2025 23:22:47 +0000 (0:00:00.239) 0:02:10.091 ******** 2025-03-22 23:23:13.196320 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:23:13.196342 | orchestrator | 2025-03-22 23:23:13.196363 | orchestrator | TASK [rabbitmq : Waiting for rabbitmq to start] ******************************** 2025-03-22 23:23:13.196380 | orchestrator | Saturday 22 March 2025 23:22:50 +0000 (0:00:03.184) 0:02:13.275 ******** 2025-03-22 23:23:13.196402 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:23:13.196420 | orchestrator | 2025-03-22 23:23:13.196433 | orchestrator | PLAY [Apply rabbitmq post-configuration] *************************************** 2025-03-22 23:23:13.196445 | orchestrator | 2025-03-22 23:23:13.196458 | orchestrator | TASK [Include rabbitmq post-deploy.yml] **************************************** 2025-03-22 23:23:13.196470 | orchestrator | Saturday 22 March 2025 23:23:06 +0000 (0:00:16.232) 0:02:29.508 ******** 2025-03-22 23:23:13.196482 | orchestrator | included: rabbitmq for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:23:13.196494 | orchestrator | 2025-03-22 23:23:13.196507 | orchestrator | TASK [rabbitmq : Enable all stable feature flags] 
****************************** 2025-03-22 23:23:13.196545 | orchestrator | Saturday 22 March 2025 23:23:08 +0000 (0:00:01.461) 0:02:30.969 ******** 2025-03-22 23:23:13.196559 | orchestrator | [WARNING]: Could not match supplied host pattern, ignoring: 2025-03-22 23:23:13.196571 | orchestrator | enable_outward_rabbitmq_True 2025-03-22 23:23:13.196584 | orchestrator | [WARNING]: Could not match supplied host pattern, ignoring: 2025-03-22 23:23:13.196596 | orchestrator | outward_rabbitmq_restart 2025-03-22 23:23:13.196609 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:23:13.196621 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:23:13.196633 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:23:13.196645 | orchestrator | 2025-03-22 23:23:13.196658 | orchestrator | PLAY [Apply role rabbitmq (outward)] ******************************************* 2025-03-22 23:23:13.196670 | orchestrator | skipping: no hosts matched 2025-03-22 23:23:13.196682 | orchestrator | 2025-03-22 23:23:13.196694 | orchestrator | PLAY [Restart rabbitmq (outward) services] ************************************* 2025-03-22 23:23:13.196707 | orchestrator | skipping: no hosts matched 2025-03-22 23:23:13.196719 | orchestrator | 2025-03-22 23:23:13.196732 | orchestrator | PLAY [Apply rabbitmq (outward) post-configuration] ***************************** 2025-03-22 23:23:13.196744 | orchestrator | skipping: no hosts matched 2025-03-22 23:23:13.196756 | orchestrator | 2025-03-22 23:23:13.196768 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 23:23:13.196781 | orchestrator | localhost : ok=3  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=1  2025-03-22 23:23:13.196797 | orchestrator | testbed-node-0 : ok=23  changed=14  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 2025-03-22 23:23:13.196819 | orchestrator | testbed-node-1 : ok=21  changed=14  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 23:23:13.196840 | orchestrator | testbed-node-2 : ok=21  changed=14  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-03-22 23:23:13.196860 | orchestrator | 2025-03-22 23:23:13.196882 | orchestrator | 2025-03-22 23:23:13.196902 | orchestrator | TASKS RECAP ******************************************************************** 2025-03-22 23:23:13.196920 | orchestrator | Saturday 22 March 2025 23:23:11 +0000 (0:00:03.438) 0:02:34.408 ******** 2025-03-22 23:23:13.196933 | orchestrator | =============================================================================== 2025-03-22 23:23:13.196945 | orchestrator | rabbitmq : Waiting for rabbitmq to start ------------------------------- 83.46s 2025-03-22 23:23:13.196958 | orchestrator | rabbitmq : Restart rabbitmq container ---------------------------------- 12.66s 2025-03-22 23:23:13.196970 | orchestrator | rabbitmq : Running RabbitMQ bootstrap container ------------------------- 6.68s 2025-03-22 23:23:13.196983 | orchestrator | rabbitmq : Copying over rabbitmq.conf ----------------------------------- 4.46s 2025-03-22 23:23:13.196995 | orchestrator | rabbitmq : Copying over advanced.config --------------------------------- 3.71s 2025-03-22 23:23:13.197008 | orchestrator | rabbitmq : Enable all stable feature flags ------------------------------ 3.44s 2025-03-22 23:23:13.197020 | orchestrator | rabbitmq : Copying over erl_inetrc -------------------------------------- 2.99s 2025-03-22 23:23:13.197033 | orchestrator | Check RabbitMQ service -------------------------------------------------- 
2.86s 2025-03-22 23:23:13.197101 | orchestrator | rabbitmq : Copying over rabbitmq-env.conf ------------------------------- 2.84s 2025-03-22 23:23:13.197114 | orchestrator | rabbitmq : include_tasks ------------------------------------------------ 2.23s 2025-03-22 23:23:13.197126 | orchestrator | rabbitmq : Check rabbitmq containers ------------------------------------ 2.17s 2025-03-22 23:23:13.197139 | orchestrator | rabbitmq : Copying over enabled_plugins --------------------------------- 2.07s 2025-03-22 23:23:13.197151 | orchestrator | rabbitmq : Get info on RabbitMQ container ------------------------------- 2.01s 2025-03-22 23:23:13.197163 | orchestrator | rabbitmq : Copying over config.json files for services ------------------ 1.84s 2025-03-22 23:23:13.197176 | orchestrator | rabbitmq : Get container facts ------------------------------------------ 1.82s 2025-03-22 23:23:13.197194 | orchestrator | rabbitmq : include_tasks ------------------------------------------------ 1.75s 2025-03-22 23:23:13.197207 | orchestrator | rabbitmq : Copying over definitions.json -------------------------------- 1.67s 2025-03-22 23:23:13.197219 | orchestrator | rabbitmq : Get container facts ------------------------------------------ 1.65s 2025-03-22 23:23:13.197232 | orchestrator | rabbitmq : Creating rabbitmq volume ------------------------------------- 1.52s 2025-03-22 23:23:13.197244 | orchestrator | Include rabbitmq post-deploy.yml ---------------------------------------- 1.46s 2025-03-22 23:23:13.197264 | orchestrator | 2025-03-22 23:23:13 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:23:13.197375 | orchestrator | 2025-03-22 23:23:13 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:23:13.197392 | orchestrator | 2025-03-22 23:23:13 | INFO  | Task 9bf6ced9-dc9d-4ee3-96f9-4682d51fbb9f is in state SUCCESS 2025-03-22 23:23:13.197404 | orchestrator | 2025-03-22 23:23:13 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:23:13.197421 | orchestrator | 2025-03-22 23:23:13 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:23:16.247129 | orchestrator | 2025-03-22 23:23:13 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:23:16.247268 | orchestrator | 2025-03-22 23:23:16 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:23:16.247665 | orchestrator | 2025-03-22 23:23:16 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:23:16.247697 | orchestrator | 2025-03-22 23:23:16 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:23:16.248387 | orchestrator | 2025-03-22 23:23:16 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:23:19.306976 | orchestrator | 2025-03-22 23:23:16 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:23:19.307115 | orchestrator | 2025-03-22 23:23:19 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:23:19.312185 | orchestrator | 2025-03-22 23:23:19 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:23:19.317131 | orchestrator | 2025-03-22 23:23:19 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:23:19.323117 | orchestrator | 2025-03-22 23:23:19 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:23:22.377908 | orchestrator | 2025-03-22 23:23:19 | 
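
The interleaved osism status lines above and below come from a plain poll-and-sleep loop over the launched task IDs. A minimal sketch of that pattern; the `get_task_state` callable is hypothetical, since the real client API is not visible in this log:

```python
# Minimal sketch of the status polling visible in this log. The
# get_task_state() helper is hypothetical; only the observable behaviour
# (log each task's state, sleep, repeat until a terminal state) is taken
# from the output above.
import logging
import time

logging.basicConfig(format="%(asctime)s | %(levelname)s | %(message)s", level=logging.INFO)

TERMINAL_STATES = {"SUCCESS", "FAILURE"}


def wait_for_tasks(task_ids, get_task_state, interval: float = 1.0) -> None:
    pending = set(task_ids)
    while pending:
        for task_id in sorted(pending):
            state = get_task_state(task_id)  # e.g. STARTED, SUCCESS, FAILURE
            logging.info("Task %s is in state %s", task_id, state)
            if state in TERMINAL_STATES:
                pending.discard(task_id)
        if pending:
            logging.info("Wait %d second(s) until the next check", int(interval))
            time.sleep(interval)
```
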
INFO  | Wait 1 second(s) until the next check 2025-03-22 23:23:22.378094 | orchestrator | 2025-03-22 23:23:22 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:23:22.380341 | orchestrator | 2025-03-22 23:23:22 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:23:22.384609 | orchestrator | 2025-03-22 23:23:22 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:23:22.387217 | orchestrator | 2025-03-22 23:23:22 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:23:25.438456 | orchestrator | 2025-03-22 23:23:22 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:23:25.438665 | orchestrator | 2025-03-22 23:23:25 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:23:25.440610 | orchestrator | 2025-03-22 23:23:25 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:23:25.441654 | orchestrator | 2025-03-22 23:23:25 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:23:25.445892 | orchestrator | 2025-03-22 23:23:25 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:23:28.488834 | orchestrator | 2025-03-22 23:23:25 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:23:28.488968 | orchestrator | 2025-03-22 23:23:28 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:23:28.489922 | orchestrator | 2025-03-22 23:23:28 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:23:28.494874 | orchestrator | 2025-03-22 23:23:28 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:23:28.496747 | orchestrator | 2025-03-22 23:23:28 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:23:28.497464 | orchestrator | 2025-03-22 23:23:28 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:23:31.540154 | orchestrator | 2025-03-22 23:23:31 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:23:31.541777 | orchestrator | 2025-03-22 23:23:31 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:23:31.542865 | orchestrator | 2025-03-22 23:23:31 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:23:31.544472 | orchestrator | 2025-03-22 23:23:31 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:23:34.596639 | orchestrator | 2025-03-22 23:23:31 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:23:34.596788 | orchestrator | 2025-03-22 23:23:34 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:23:34.600450 | orchestrator | 2025-03-22 23:23:34 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:23:34.603764 | orchestrator | 2025-03-22 23:23:34 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:23:34.604955 | orchestrator | 2025-03-22 23:23:34 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:23:34.605142 | orchestrator | 2025-03-22 23:23:34 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:23:37.655197 | orchestrator | 2025-03-22 23:23:37 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:23:37.657373 | orchestrator | 2025-03-22 23:23:37 | INFO  | Task 
c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:23:37.662558 | orchestrator | 2025-03-22 23:23:37 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:23:37.667111 | orchestrator | 2025-03-22 23:23:37 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:23:40.715051 | orchestrator | 2025-03-22 23:23:37 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:23:40.715181 | orchestrator | 2025-03-22 23:23:40 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:23:40.716266 | orchestrator | 2025-03-22 23:23:40 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:23:40.717597 | orchestrator | 2025-03-22 23:23:40 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:23:40.719399 | orchestrator | 2025-03-22 23:23:40 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:23:43.763563 | orchestrator | 2025-03-22 23:23:40 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:23:43.763697 | orchestrator | 2025-03-22 23:23:43 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:23:43.764836 | orchestrator | 2025-03-22 23:23:43 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:23:43.767410 | orchestrator | 2025-03-22 23:23:43 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:23:43.768817 | orchestrator | 2025-03-22 23:23:43 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:23:43.769049 | orchestrator | 2025-03-22 23:23:43 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:23:46.840285 | orchestrator | 2025-03-22 23:23:46 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:23:46.840863 | orchestrator | 2025-03-22 23:23:46 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:23:46.844861 | orchestrator | 2025-03-22 23:23:46 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:23:46.845683 | orchestrator | 2025-03-22 23:23:46 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:23:46.846280 | orchestrator | 2025-03-22 23:23:46 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:23:49.918791 | orchestrator | 2025-03-22 23:23:49 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:23:49.919601 | orchestrator | 2025-03-22 23:23:49 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:23:49.924874 | orchestrator | 2025-03-22 23:23:49 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:23:49.927396 | orchestrator | 2025-03-22 23:23:49 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:23:52.975965 | orchestrator | 2025-03-22 23:23:49 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:23:52.976103 | orchestrator | 2025-03-22 23:23:52 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:23:52.978181 | orchestrator | 2025-03-22 23:23:52 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:23:52.978219 | orchestrator | 2025-03-22 23:23:52 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:23:52.981891 | orchestrator | 2025-03-22 23:23:52 | INFO  | Task 
1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:23:56.018486 | orchestrator | 2025-03-22 23:23:52 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:23:56.018603 | orchestrator | 2025-03-22 23:23:56 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:23:56.019968 | orchestrator | 2025-03-22 23:23:56 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:23:56.022855 | orchestrator | 2025-03-22 23:23:56 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:23:56.024332 | orchestrator | 2025-03-22 23:23:56 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:23:56.024573 | orchestrator | 2025-03-22 23:23:56 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:23:59.082694 | orchestrator | 2025-03-22 23:23:59 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:23:59.087543 | orchestrator | 2025-03-22 23:23:59 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:23:59.090949 | orchestrator | 2025-03-22 23:23:59 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:23:59.091083 | orchestrator | 2025-03-22 23:23:59 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:24:02.138155 | orchestrator | 2025-03-22 23:23:59 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:24:02.138277 | orchestrator | 2025-03-22 23:24:02 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:24:02.139184 | orchestrator | 2025-03-22 23:24:02 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:24:02.140413 | orchestrator | 2025-03-22 23:24:02 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:24:02.141836 | orchestrator | 2025-03-22 23:24:02 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:24:05.197850 | orchestrator | 2025-03-22 23:24:02 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:24:05.198090 | orchestrator | 2025-03-22 23:24:05 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:24:05.198186 | orchestrator | 2025-03-22 23:24:05 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:24:05.201498 | orchestrator | 2025-03-22 23:24:05 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:24:05.205014 | orchestrator | 2025-03-22 23:24:05 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:24:08.253588 | orchestrator | 2025-03-22 23:24:05 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:24:08.253713 | orchestrator | 2025-03-22 23:24:08 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:24:08.255237 | orchestrator | 2025-03-22 23:24:08 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:24:08.257107 | orchestrator | 2025-03-22 23:24:08 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:24:08.258808 | orchestrator | 2025-03-22 23:24:08 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:24:11.310836 | orchestrator | 2025-03-22 23:24:08 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:24:11.310951 | orchestrator | 2025-03-22 23:24:11 | INFO  | Task 
dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:24:11.312744 | orchestrator | 2025-03-22 23:24:11 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:24:11.312793 | orchestrator | 2025-03-22 23:24:11 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:24:11.314812 | orchestrator | 2025-03-22 23:24:11 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:24:14.363113 | orchestrator | 2025-03-22 23:24:11 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:24:14.363253 | orchestrator | 2025-03-22 23:24:14 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:24:14.364472 | orchestrator | 2025-03-22 23:24:14 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:24:14.366194 | orchestrator | 2025-03-22 23:24:14 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:24:14.366893 | orchestrator | 2025-03-22 23:24:14 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:24:17.422635 | orchestrator | 2025-03-22 23:24:14 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:24:17.422772 | orchestrator | 2025-03-22 23:24:17 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:24:17.427906 | orchestrator | 2025-03-22 23:24:17 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:24:17.431291 | orchestrator | 2025-03-22 23:24:17 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:24:17.433792 | orchestrator | 2025-03-22 23:24:17 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:24:20.505987 | orchestrator | 2025-03-22 23:24:17 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:24:20.506245 | orchestrator | 2025-03-22 23:24:20 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:24:20.508255 | orchestrator | 2025-03-22 23:24:20 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:24:20.509829 | orchestrator | 2025-03-22 23:24:20 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:24:20.513097 | orchestrator | 2025-03-22 23:24:20 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:24:20.513239 | orchestrator | 2025-03-22 23:24:20 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:24:23.552476 | orchestrator | 2025-03-22 23:24:23 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:24:23.553567 | orchestrator | 2025-03-22 23:24:23 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:24:23.554500 | orchestrator | 2025-03-22 23:24:23 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:24:23.555285 | orchestrator | 2025-03-22 23:24:23 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:24:26.604047 | orchestrator | 2025-03-22 23:24:23 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:24:26.604851 | orchestrator | 2025-03-22 23:24:26 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:24:26.608036 | orchestrator | 2025-03-22 23:24:26 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:24:26.608083 | orchestrator | 2025-03-22 23:24:26 | INFO  | Task 
4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:24:26.609753 | orchestrator | 2025-03-22 23:24:26 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:24:26.610101 | orchestrator | 2025-03-22 23:24:26 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:24:29.654585 | orchestrator | 2025-03-22 23:24:29 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:24:29.655382 | orchestrator | 2025-03-22 23:24:29 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:24:29.656013 | orchestrator | 2025-03-22 23:24:29 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:24:29.657902 | orchestrator | 2025-03-22 23:24:29 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:24:29.661342 | orchestrator | 2025-03-22 23:24:29 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:24:32.703954 | orchestrator | 2025-03-22 23:24:32 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:24:32.708927 | orchestrator | 2025-03-22 23:24:32 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state STARTED 2025-03-22 23:24:32.709003 | orchestrator | 2025-03-22 23:24:32 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:24:32.709966 | orchestrator | 2025-03-22 23:24:32 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:24:35.769175 | orchestrator | 2025-03-22 23:24:32 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:24:35.769313 | orchestrator | 2025-03-22 23:24:35 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:24:35.769724 | orchestrator | 2025-03-22 23:24:35 | INFO  | Task c86e1475-d9ba-4d0e-9128-5567bde924fe is in state SUCCESS 2025-03-22 23:24:35.771280 | orchestrator | 2025-03-22 23:24:35.771325 | orchestrator | 2025-03-22 23:24:35.771344 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-03-22 23:24:35.771369 | orchestrator | 2025-03-22 23:24:35.771394 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-03-22 23:24:35.771419 | orchestrator | Saturday 22 March 2025 23:21:50 +0000 (0:00:00.286) 0:00:00.286 ******** 2025-03-22 23:24:35.771444 | orchestrator | ok: [testbed-node-3] 2025-03-22 23:24:35.771471 | orchestrator | ok: [testbed-node-4] 2025-03-22 23:24:35.771494 | orchestrator | ok: [testbed-node-5] 2025-03-22 23:24:35.771716 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:24:35.771743 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:24:35.771757 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:24:35.771773 | orchestrator | 2025-03-22 23:24:35.771788 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-03-22 23:24:35.771803 | orchestrator | Saturday 22 March 2025 23:21:51 +0000 (0:00:00.890) 0:00:01.176 ******** 2025-03-22 23:24:35.771817 | orchestrator | ok: [testbed-node-3] => (item=enable_ovn_True) 2025-03-22 23:24:35.771832 | orchestrator | ok: [testbed-node-4] => (item=enable_ovn_True) 2025-03-22 23:24:35.771847 | orchestrator | ok: [testbed-node-5] => (item=enable_ovn_True) 2025-03-22 23:24:35.771861 | orchestrator | ok: [testbed-node-0] => (item=enable_ovn_True) 2025-03-22 23:24:35.771877 | orchestrator | ok: [testbed-node-1] => (item=enable_ovn_True) 2025-03-22 23:24:35.771892 | 
orchestrator | ok: [testbed-node-2] => (item=enable_ovn_True) 2025-03-22 23:24:35.771908 | orchestrator | 2025-03-22 23:24:35.771923 | orchestrator | PLAY [Apply role ovn-controller] *********************************************** 2025-03-22 23:24:35.771938 | orchestrator | 2025-03-22 23:24:35.771953 | orchestrator | TASK [ovn-controller : include_tasks] ****************************************** 2025-03-22 23:24:35.771969 | orchestrator | Saturday 22 March 2025 23:21:53 +0000 (0:00:01.930) 0:00:03.107 ******** 2025-03-22 23:24:35.771985 | orchestrator | included: /ansible/roles/ovn-controller/tasks/deploy.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:24:35.772002 | orchestrator | 2025-03-22 23:24:35.772018 | orchestrator | TASK [ovn-controller : Ensuring config directories exist] ********************** 2025-03-22 23:24:35.772034 | orchestrator | Saturday 22 March 2025 23:21:56 +0000 (0:00:02.376) 0:00:05.484 ******** 2025-03-22 23:24:35.772050 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772069 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772109 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772126 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772142 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772188 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 
'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772204 | orchestrator | 2025-03-22 23:24:35.772220 | orchestrator | TASK [ovn-controller : Copying over config.json files for services] ************ 2025-03-22 23:24:35.772234 | orchestrator | Saturday 22 March 2025 23:21:57 +0000 (0:00:01.675) 0:00:07.159 ******** 2025-03-22 23:24:35.772257 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772271 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772285 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772300 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772314 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772336 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772351 | orchestrator | 2025-03-22 23:24:35.772365 | orchestrator | 
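
The config.json files copied above follow kolla's generic container-config layout: a start command plus a list of files to install into the container at startup, read from the /var/lib/kolla/config_files mount listed in the volumes above. The rendered content is not shown in the log, so the dict below is only an illustrative shape with placeholder values:

```python
# Illustrative shape of a kolla config.json for the ovn-controller
# container. The real file is rendered by the role from templates and may
# differ; the command string and file entry here are placeholders.
import json

config = {
    "command": "/usr/bin/ovn-controller ...",  # placeholder; rendered by the role
    "config_files": [
        {
            "source": "/var/lib/kolla/config_files/some.conf",  # placeholder
            "dest": "/etc/ovn/some.conf",                       # placeholder
            "owner": "root",
            "perm": "0600",
        }
    ],
}

print(json.dumps(config, indent=4))
```
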
TASK [ovn-controller : Ensuring systemd override directory exists] ************* 2025-03-22 23:24:35.772379 | orchestrator | Saturday 22 March 2025 23:22:01 +0000 (0:00:03.427) 0:00:10.587 ******** 2025-03-22 23:24:35.772393 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772408 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772438 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772453 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772467 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772482 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772496 | orchestrator | 2025-03-22 23:24:35.772510 | orchestrator | TASK [ovn-controller : Copying over systemd override] ************************** 2025-03-22 23:24:35.772560 | orchestrator | Saturday 22 March 2025 23:22:02 +0000 (0:00:01.743) 0:00:12.330 ******** 2025-03-22 23:24:35.772576 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772590 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772604 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772618 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772632 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772659 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772674 | orchestrator | 2025-03-22 23:24:35.772688 | orchestrator | TASK [ovn-controller : Check ovn-controller containers] ************************ 2025-03-22 23:24:35.772703 | orchestrator | Saturday 22 March 2025 23:22:06 +0000 (0:00:03.378) 0:00:15.709 ******** 2025-03-22 23:24:35.772717 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772731 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 
'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772752 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772766 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772780 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772794 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-controller:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.772808 | orchestrator | 2025-03-22 23:24:35.772822 | orchestrator | TASK [ovn-controller : Create br-int bridge on OpenvSwitch] ******************** 2025-03-22 23:24:35.772836 | orchestrator | Saturday 22 March 2025 23:22:08 +0000 (0:00:02.059) 0:00:17.769 ******** 2025-03-22 23:24:35.772850 | orchestrator | changed: [testbed-node-3] 2025-03-22 23:24:35.772865 | orchestrator | changed: [testbed-node-4] 2025-03-22 23:24:35.772879 | orchestrator | changed: [testbed-node-5] 2025-03-22 23:24:35.772893 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:24:35.772907 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:24:35.772921 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:24:35.772935 | orchestrator | 2025-03-22 23:24:35.772949 | orchestrator | TASK [ovn-controller : Configure OVN in OVSDB] ********************************* 2025-03-22 23:24:35.772963 | orchestrator | Saturday 22 March 2025 23:22:12 +0000 (0:00:04.186) 0:00:21.955 ******** 2025-03-22 23:24:35.772977 | orchestrator | changed: [testbed-node-3] => (item={'name': 'ovn-encap-ip', 'value': '192.168.16.13'}) 2025-03-22 23:24:35.772991 | orchestrator | changed: [testbed-node-4] => (item={'name': 'ovn-encap-ip', 'value': '192.168.16.14'}) 2025-03-22 23:24:35.773005 | orchestrator | 
changed: [testbed-node-5] => (item={'name': 'ovn-encap-ip', 'value': '192.168.16.15'}) 2025-03-22 23:24:35.773024 | orchestrator | changed: [testbed-node-0] => (item={'name': 'ovn-encap-ip', 'value': '192.168.16.10'}) 2025-03-22 23:24:35.773039 | orchestrator | changed: [testbed-node-2] => (item={'name': 'ovn-encap-ip', 'value': '192.168.16.12'}) 2025-03-22 23:24:35.773053 | orchestrator | changed: [testbed-node-1] => (item={'name': 'ovn-encap-ip', 'value': '192.168.16.11'}) 2025-03-22 23:24:35.773067 | orchestrator | changed: [testbed-node-3] => (item={'name': 'ovn-encap-type', 'value': 'geneve'}) 2025-03-22 23:24:35.773081 | orchestrator | changed: [testbed-node-4] => (item={'name': 'ovn-encap-type', 'value': 'geneve'}) 2025-03-22 23:24:35.773095 | orchestrator | changed: [testbed-node-5] => (item={'name': 'ovn-encap-type', 'value': 'geneve'}) 2025-03-22 23:24:35.773109 | orchestrator | changed: [testbed-node-0] => (item={'name': 'ovn-encap-type', 'value': 'geneve'}) 2025-03-22 23:24:35.773129 | orchestrator | changed: [testbed-node-2] => (item={'name': 'ovn-encap-type', 'value': 'geneve'}) 2025-03-22 23:24:35.773149 | orchestrator | changed: [testbed-node-1] => (item={'name': 'ovn-encap-type', 'value': 'geneve'}) 2025-03-22 23:24:35.773163 | orchestrator | changed: [testbed-node-3] => (item={'name': 'ovn-remote', 'value': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}) 2025-03-22 23:24:35.773179 | orchestrator | changed: [testbed-node-4] => (item={'name': 'ovn-remote', 'value': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}) 2025-03-22 23:24:35.773193 | orchestrator | changed: [testbed-node-5] => (item={'name': 'ovn-remote', 'value': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}) 2025-03-22 23:24:35.773208 | orchestrator | changed: [testbed-node-0] => (item={'name': 'ovn-remote', 'value': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}) 2025-03-22 23:24:35.773222 | orchestrator | changed: [testbed-node-2] => (item={'name': 'ovn-remote', 'value': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}) 2025-03-22 23:24:35.773236 | orchestrator | changed: [testbed-node-1] => (item={'name': 'ovn-remote', 'value': 'tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642'}) 2025-03-22 23:24:35.773250 | orchestrator | changed: [testbed-node-3] => (item={'name': 'ovn-remote-probe-interval', 'value': '60000'}) 2025-03-22 23:24:35.773266 | orchestrator | changed: [testbed-node-4] => (item={'name': 'ovn-remote-probe-interval', 'value': '60000'}) 2025-03-22 23:24:35.773280 | orchestrator | changed: [testbed-node-5] => (item={'name': 'ovn-remote-probe-interval', 'value': '60000'}) 2025-03-22 23:24:35.773294 | orchestrator | changed: [testbed-node-0] => (item={'name': 'ovn-remote-probe-interval', 'value': '60000'}) 2025-03-22 23:24:35.773308 | orchestrator | changed: [testbed-node-2] => (item={'name': 'ovn-remote-probe-interval', 'value': '60000'}) 2025-03-22 23:24:35.773322 | orchestrator | changed: [testbed-node-3] => (item={'name': 'ovn-openflow-probe-interval', 'value': '60'}) 2025-03-22 23:24:35.773341 | orchestrator | changed: [testbed-node-1] => (item={'name': 'ovn-remote-probe-interval', 'value': '60000'}) 2025-03-22 23:24:35.773355 | orchestrator | changed: [testbed-node-4] => (item={'name': 'ovn-openflow-probe-interval', 'value': '60'}) 2025-03-22 23:24:35.773369 | orchestrator | changed: [testbed-node-5] => (item={'name': 'ovn-openflow-probe-interval', 'value': 
'60'}) 2025-03-22 23:24:35.773383 | orchestrator | changed: [testbed-node-2] => (item={'name': 'ovn-openflow-probe-interval', 'value': '60'}) 2025-03-22 23:24:35.773397 | orchestrator | changed: [testbed-node-0] => (item={'name': 'ovn-openflow-probe-interval', 'value': '60'}) 2025-03-22 23:24:35.773411 | orchestrator | changed: [testbed-node-3] => (item={'name': 'ovn-monitor-all', 'value': False}) 2025-03-22 23:24:35.773425 | orchestrator | changed: [testbed-node-1] => (item={'name': 'ovn-openflow-probe-interval', 'value': '60'}) 2025-03-22 23:24:35.773439 | orchestrator | changed: [testbed-node-4] => (item={'name': 'ovn-monitor-all', 'value': False}) 2025-03-22 23:24:35.773453 | orchestrator | changed: [testbed-node-5] => (item={'name': 'ovn-monitor-all', 'value': False}) 2025-03-22 23:24:35.773467 | orchestrator | changed: [testbed-node-2] => (item={'name': 'ovn-monitor-all', 'value': False}) 2025-03-22 23:24:35.773481 | orchestrator | changed: [testbed-node-0] => (item={'name': 'ovn-monitor-all', 'value': False}) 2025-03-22 23:24:35.773495 | orchestrator | ok: [testbed-node-3] => (item={'name': 'ovn-bridge-mappings', 'value': 'physnet1:br-ex', 'state': 'absent'}) 2025-03-22 23:24:35.773509 | orchestrator | changed: [testbed-node-1] => (item={'name': 'ovn-monitor-all', 'value': False}) 2025-03-22 23:24:35.773553 | orchestrator | ok: [testbed-node-4] => (item={'name': 'ovn-bridge-mappings', 'value': 'physnet1:br-ex', 'state': 'absent'}) 2025-03-22 23:24:35.773582 | orchestrator | ok: [testbed-node-5] => (item={'name': 'ovn-bridge-mappings', 'value': 'physnet1:br-ex', 'state': 'absent'}) 2025-03-22 23:24:35.773614 | orchestrator | changed: [testbed-node-2] => (item={'name': 'ovn-bridge-mappings', 'value': 'physnet1:br-ex', 'state': 'present'}) 2025-03-22 23:24:35.773635 | orchestrator | changed: [testbed-node-0] => (item={'name': 'ovn-bridge-mappings', 'value': 'physnet1:br-ex', 'state': 'present'}) 2025-03-22 23:24:35.773650 | orchestrator | changed: [testbed-node-3] => (item={'name': 'ovn-chassis-mac-mappings', 'value': 'physnet1:52:54:00:89:18:56', 'state': 'present'}) 2025-03-22 23:24:35.773665 | orchestrator | changed: [testbed-node-1] => (item={'name': 'ovn-bridge-mappings', 'value': 'physnet1:br-ex', 'state': 'present'}) 2025-03-22 23:24:35.773679 | orchestrator | changed: [testbed-node-4] => (item={'name': 'ovn-chassis-mac-mappings', 'value': 'physnet1:52:54:00:2f:fa:44', 'state': 'present'}) 2025-03-22 23:24:35.773693 | orchestrator | changed: [testbed-node-5] => (item={'name': 'ovn-chassis-mac-mappings', 'value': 'physnet1:52:54:00:71:3a:c3', 'state': 'present'}) 2025-03-22 23:24:35.773708 | orchestrator | ok: [testbed-node-2] => (item={'name': 'ovn-chassis-mac-mappings', 'value': 'physnet1:52:54:00:29:4a:9b', 'state': 'absent'}) 2025-03-22 23:24:35.773722 | orchestrator | ok: [testbed-node-0] => (item={'name': 'ovn-chassis-mac-mappings', 'value': 'physnet1:52:54:00:52:c1:40', 'state': 'absent'}) 2025-03-22 23:24:35.773736 | orchestrator | ok: [testbed-node-3] => (item={'name': 'ovn-cms-options', 'value': '', 'state': 'absent'}) 2025-03-22 23:24:35.773750 | orchestrator | ok: [testbed-node-1] => (item={'name': 'ovn-chassis-mac-mappings', 'value': 'physnet1:52:54:00:33:12:50', 'state': 'absent'}) 2025-03-22 23:24:35.773764 | orchestrator | ok: [testbed-node-4] => (item={'name': 'ovn-cms-options', 'value': '', 'state': 'absent'}) 2025-03-22 23:24:35.773778 | orchestrator | ok: [testbed-node-5] => (item={'name': 'ovn-cms-options', 'value': '', 'state': 'absent'}) 2025-03-22 
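
The "Configure OVN in OVSDB" items above (a few more follow below) amount to per-chassis external_ids settings in Open vSwitch, applied after br-int is created. A sketch of the equivalent ovs-vsctl calls for testbed-node-0, with values copied from the log; this illustrates what the role writes, it is not the role's implementation:

```python
# Equivalent ovs-vsctl settings for one chassis, illustrating what the
# "Configure OVN in OVSDB" task above writes. Values are copied from the
# log for testbed-node-0; the bridge-mapping and cms-options keys appear
# only on the gateway chassis in this run.
import subprocess

external_ids = {
    "ovn-encap-ip": "192.168.16.10",
    "ovn-encap-type": "geneve",
    "ovn-remote": "tcp:192.168.16.10:6642,tcp:192.168.16.11:6642,tcp:192.168.16.12:6642",
    "ovn-remote-probe-interval": "60000",
    "ovn-openflow-probe-interval": "60",
    "ovn-monitor-all": "false",
    "ovn-bridge-mappings": "physnet1:br-ex",
    "ovn-cms-options": "enable-chassis-as-gw,availability-zones=nova",
}

# The integration bridge is created first ("Create br-int bridge on OpenvSwitch").
subprocess.run(["ovs-vsctl", "--may-exist", "add-br", "br-int"], check=True)

for key, value in external_ids.items():
    # Quote the value so strings containing ':' or ',' are passed intact.
    subprocess.run(
        ["ovs-vsctl", "set", "Open_vSwitch", ".", f'external_ids:{key}="{value}"'],
        check=True,
    )
```
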
23:24:35.773791 | orchestrator | changed: [testbed-node-2] => (item={'name': 'ovn-cms-options', 'value': 'enable-chassis-as-gw,availability-zones=nova', 'state': 'present'}) 2025-03-22 23:24:35.773806 | orchestrator | changed: [testbed-node-0] => (item={'name': 'ovn-cms-options', 'value': 'enable-chassis-as-gw,availability-zones=nova', 'state': 'present'}) 2025-03-22 23:24:35.773820 | orchestrator | changed: [testbed-node-1] => (item={'name': 'ovn-cms-options', 'value': 'enable-chassis-as-gw,availability-zones=nova', 'state': 'present'}) 2025-03-22 23:24:35.773834 | orchestrator | 2025-03-22 23:24:35.773848 | orchestrator | TASK [ovn-controller : Flush handlers] ***************************************** 2025-03-22 23:24:35.773862 | orchestrator | Saturday 22 March 2025 23:22:34 +0000 (0:00:22.268) 0:00:44.223 ******** 2025-03-22 23:24:35.773876 | orchestrator | 2025-03-22 23:24:35.773890 | orchestrator | TASK [ovn-controller : Flush handlers] ***************************************** 2025-03-22 23:24:35.773904 | orchestrator | Saturday 22 March 2025 23:22:34 +0000 (0:00:00.115) 0:00:44.339 ******** 2025-03-22 23:24:35.773918 | orchestrator | 2025-03-22 23:24:35.773932 | orchestrator | TASK [ovn-controller : Flush handlers] ***************************************** 2025-03-22 23:24:35.773946 | orchestrator | Saturday 22 March 2025 23:22:35 +0000 (0:00:00.586) 0:00:44.925 ******** 2025-03-22 23:24:35.773959 | orchestrator | 2025-03-22 23:24:35.773973 | orchestrator | TASK [ovn-controller : Flush handlers] ***************************************** 2025-03-22 23:24:35.773987 | orchestrator | Saturday 22 March 2025 23:22:35 +0000 (0:00:00.085) 0:00:45.011 ******** 2025-03-22 23:24:35.774001 | orchestrator | 2025-03-22 23:24:35.774074 | orchestrator | TASK [ovn-controller : Flush handlers] ***************************************** 2025-03-22 23:24:35.774093 | orchestrator | Saturday 22 March 2025 23:22:35 +0000 (0:00:00.168) 0:00:45.180 ******** 2025-03-22 23:24:35.774107 | orchestrator | 2025-03-22 23:24:35.774121 | orchestrator | TASK [ovn-controller : Flush handlers] ***************************************** 2025-03-22 23:24:35.774135 | orchestrator | Saturday 22 March 2025 23:22:35 +0000 (0:00:00.060) 0:00:45.240 ******** 2025-03-22 23:24:35.774149 | orchestrator | 2025-03-22 23:24:35.774163 | orchestrator | RUNNING HANDLER [ovn-controller : Reload systemd config] *********************** 2025-03-22 23:24:35.774177 | orchestrator | Saturday 22 March 2025 23:22:36 +0000 (0:00:00.278) 0:00:45.519 ******** 2025-03-22 23:24:35.774199 | orchestrator | ok: [testbed-node-4] 2025-03-22 23:24:35.774213 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:24:35.774227 | orchestrator | ok: [testbed-node-5] 2025-03-22 23:24:35.774241 | orchestrator | ok: [testbed-node-3] 2025-03-22 23:24:35.774255 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:24:35.774269 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:24:35.774282 | orchestrator | 2025-03-22 23:24:35.774297 | orchestrator | RUNNING HANDLER [ovn-controller : Restart ovn-controller container] ************ 2025-03-22 23:24:35.774310 | orchestrator | Saturday 22 March 2025 23:22:37 +0000 (0:00:01.875) 0:00:47.395 ******** 2025-03-22 23:24:35.774324 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:24:35.774339 | orchestrator | changed: [testbed-node-5] 2025-03-22 23:24:35.774352 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:24:35.774366 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:24:35.774380 | orchestrator | changed: 
[testbed-node-3] 2025-03-22 23:24:35.774394 | orchestrator | changed: [testbed-node-4] 2025-03-22 23:24:35.774408 | orchestrator | 2025-03-22 23:24:35.774422 | orchestrator | PLAY [Apply role ovn-db] ******************************************************* 2025-03-22 23:24:35.774436 | orchestrator | 2025-03-22 23:24:35.774451 | orchestrator | TASK [ovn-db : include_tasks] ************************************************** 2025-03-22 23:24:35.774465 | orchestrator | Saturday 22 March 2025 23:22:55 +0000 (0:00:17.797) 0:01:05.192 ******** 2025-03-22 23:24:35.774479 | orchestrator | included: /ansible/roles/ovn-db/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:24:35.774493 | orchestrator | 2025-03-22 23:24:35.774507 | orchestrator | TASK [ovn-db : include_tasks] ************************************************** 2025-03-22 23:24:35.774580 | orchestrator | Saturday 22 March 2025 23:22:56 +0000 (0:00:00.923) 0:01:06.115 ******** 2025-03-22 23:24:35.774598 | orchestrator | included: /ansible/roles/ovn-db/tasks/lookup_cluster.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:24:35.774612 | orchestrator | 2025-03-22 23:24:35.774633 | orchestrator | TASK [ovn-db : Checking for any existing OVN DB container volumes] ************* 2025-03-22 23:24:35.774648 | orchestrator | Saturday 22 March 2025 23:22:57 +0000 (0:00:01.118) 0:01:07.234 ******** 2025-03-22 23:24:35.774662 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:24:35.774676 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:24:35.774690 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:24:35.774704 | orchestrator | 2025-03-22 23:24:35.774718 | orchestrator | TASK [ovn-db : Divide hosts by their OVN NB volume availability] *************** 2025-03-22 23:24:35.774738 | orchestrator | Saturday 22 March 2025 23:22:58 +0000 (0:00:01.027) 0:01:08.261 ******** 2025-03-22 23:24:35.774796 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:24:35.774812 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:24:35.774826 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:24:35.774840 | orchestrator | 2025-03-22 23:24:35.774854 | orchestrator | TASK [ovn-db : Divide hosts by their OVN SB volume availability] *************** 2025-03-22 23:24:35.774868 | orchestrator | Saturday 22 March 2025 23:22:59 +0000 (0:00:00.650) 0:01:08.912 ******** 2025-03-22 23:24:35.774882 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:24:35.774895 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:24:35.774909 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:24:35.774923 | orchestrator | 2025-03-22 23:24:35.774937 | orchestrator | TASK [ovn-db : Establish whether the OVN NB cluster has already existed] ******* 2025-03-22 23:24:35.774951 | orchestrator | Saturday 22 March 2025 23:23:00 +0000 (0:00:00.745) 0:01:09.658 ******** 2025-03-22 23:24:35.774965 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:24:35.774979 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:24:35.774992 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:24:35.775006 | orchestrator | 2025-03-22 23:24:35.775020 | orchestrator | TASK [ovn-db : Establish whether the OVN SB cluster has already existed] ******* 2025-03-22 23:24:35.775034 | orchestrator | Saturday 22 March 2025 23:23:00 +0000 (0:00:00.550) 0:01:10.209 ******** 2025-03-22 23:24:35.775051 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:24:35.775072 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:24:35.775126 | orchestrator | ok: [testbed-node-2] 2025-03-22 
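
The lookup steps above, and the liveness and leader checks that follow, only matter when a cluster already exists; in this run they skip because the DB volumes were just created. As a rough illustration, a port-liveness probe of the NB/SB databases could look like the sketch below, assuming the conventional OVN ports 6641 (NB) and 6642 (SB, which also appears in ovn-remote earlier):

```python
# Rough illustration of an OVN DB port-liveness probe, similar in spirit
# to the skipped "Check OVN NB/SB service port liveness" tasks above.
# Ports 6641 (northbound) and 6642 (southbound) are the conventional
# defaults; the hosts are the controller addresses from this deployment.
import socket

OVN_DB_HOSTS = ["192.168.16.10", "192.168.16.11", "192.168.16.12"]
OVN_DB_PORTS = {"nb": 6641, "sb": 6642}


def db_port_alive(host: str, port: int, timeout: float = 3.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


for host in OVN_DB_HOSTS:
    for name, port in OVN_DB_PORTS.items():
        status = "up" if db_port_alive(host, port) else "down"
        print(f"{host} OVN {name.upper()} DB ({port}): {status}")
```
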
23:24:35.775141 | orchestrator | 2025-03-22 23:24:35.775153 | orchestrator | TASK [ovn-db : Check if running on all OVN NB DB hosts] ************************ 2025-03-22 23:24:35.775166 | orchestrator | Saturday 22 March 2025 23:23:01 +0000 (0:00:00.533) 0:01:10.742 ******** 2025-03-22 23:24:35.775178 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:24:35.775191 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:24:35.775203 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:24:35.775216 | orchestrator | 2025-03-22 23:24:35.775229 | orchestrator | TASK [ovn-db : Check OVN NB service port liveness] ***************************** 2025-03-22 23:24:35.775241 | orchestrator | Saturday 22 March 2025 23:23:01 +0000 (0:00:00.375) 0:01:11.118 ******** 2025-03-22 23:24:35.775253 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:24:35.775272 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:24:35.775284 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:24:35.775296 | orchestrator | 2025-03-22 23:24:35.775309 | orchestrator | TASK [ovn-db : Divide hosts by their OVN NB service port liveness] ************* 2025-03-22 23:24:35.775321 | orchestrator | Saturday 22 March 2025 23:23:02 +0000 (0:00:00.583) 0:01:11.702 ******** 2025-03-22 23:24:35.775333 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:24:35.775346 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:24:35.775358 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:24:35.775370 | orchestrator | 2025-03-22 23:24:35.775383 | orchestrator | TASK [ovn-db : Get OVN NB database information] ******************************** 2025-03-22 23:24:35.775395 | orchestrator | Saturday 22 March 2025 23:23:02 +0000 (0:00:00.500) 0:01:12.202 ******** 2025-03-22 23:24:35.775407 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:24:35.775420 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:24:35.775432 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:24:35.775444 | orchestrator | 2025-03-22 23:24:35.775457 | orchestrator | TASK [ovn-db : Divide hosts by their OVN NB leader/follower role] ************** 2025-03-22 23:24:35.775469 | orchestrator | Saturday 22 March 2025 23:23:03 +0000 (0:00:00.336) 0:01:12.539 ******** 2025-03-22 23:24:35.775481 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:24:35.775494 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:24:35.775506 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:24:35.775574 | orchestrator | 2025-03-22 23:24:35.775590 | orchestrator | TASK [ovn-db : Fail on existing OVN NB cluster with no leader] ***************** 2025-03-22 23:24:35.775603 | orchestrator | Saturday 22 March 2025 23:23:03 +0000 (0:00:00.528) 0:01:13.067 ******** 2025-03-22 23:24:35.775616 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:24:35.775628 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:24:35.775641 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:24:35.775653 | orchestrator | 2025-03-22 23:24:35.775665 | orchestrator | TASK [ovn-db : Check if running on all OVN SB DB hosts] ************************ 2025-03-22 23:24:35.775676 | orchestrator | Saturday 22 March 2025 23:23:04 +0000 (0:00:00.652) 0:01:13.720 ******** 2025-03-22 23:24:35.775686 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:24:35.775696 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:24:35.775706 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:24:35.775716 | orchestrator | 2025-03-22 23:24:35.775726 | 
orchestrator | TASK [ovn-db : Check OVN SB service port liveness] ***************************** 2025-03-22 23:24:35.775736 | orchestrator | Saturday 22 March 2025 23:23:05 +0000 (0:00:00.823) 0:01:14.543 ******** 2025-03-22 23:24:35.775746 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:24:35.775756 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:24:35.775766 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:24:35.775776 | orchestrator | 2025-03-22 23:24:35.775787 | orchestrator | TASK [ovn-db : Divide hosts by their OVN SB service port liveness] ************* 2025-03-22 23:24:35.775797 | orchestrator | Saturday 22 March 2025 23:23:05 +0000 (0:00:00.592) 0:01:15.135 ******** 2025-03-22 23:24:35.775807 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:24:35.775817 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:24:35.775827 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:24:35.775844 | orchestrator | 2025-03-22 23:24:35.775854 | orchestrator | TASK [ovn-db : Get OVN SB database information] ******************************** 2025-03-22 23:24:35.775864 | orchestrator | Saturday 22 March 2025 23:23:06 +0000 (0:00:01.180) 0:01:16.316 ******** 2025-03-22 23:24:35.775874 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:24:35.775885 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:24:35.775894 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:24:35.775904 | orchestrator | 2025-03-22 23:24:35.775920 | orchestrator | TASK [ovn-db : Divide hosts by their OVN SB leader/follower role] ************** 2025-03-22 23:24:35.775931 | orchestrator | Saturday 22 March 2025 23:23:08 +0000 (0:00:01.198) 0:01:17.514 ******** 2025-03-22 23:24:35.775942 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:24:35.775952 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:24:35.775962 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:24:35.775972 | orchestrator | 2025-03-22 23:24:35.775982 | orchestrator | TASK [ovn-db : Fail on existing OVN SB cluster with no leader] ***************** 2025-03-22 23:24:35.775992 | orchestrator | Saturday 22 March 2025 23:23:09 +0000 (0:00:01.480) 0:01:18.995 ******** 2025-03-22 23:24:35.776001 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:24:35.776011 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:24:35.776022 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:24:35.776031 | orchestrator | 2025-03-22 23:24:35.776042 | orchestrator | TASK [ovn-db : include_tasks] ************************************************** 2025-03-22 23:24:35.776056 | orchestrator | Saturday 22 March 2025 23:23:10 +0000 (0:00:00.731) 0:01:19.726 ******** 2025-03-22 23:24:35.776067 | orchestrator | included: /ansible/roles/ovn-db/tasks/bootstrap-initial.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:24:35.776077 | orchestrator | 2025-03-22 23:24:35.776087 | orchestrator | TASK [ovn-db : Set bootstrap args fact for NB (new cluster)] ******************* 2025-03-22 23:24:35.776097 | orchestrator | Saturday 22 March 2025 23:23:11 +0000 (0:00:01.392) 0:01:21.118 ******** 2025-03-22 23:24:35.776107 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:24:35.776117 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:24:35.776127 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:24:35.776137 | orchestrator | 2025-03-22 23:24:35.776147 | orchestrator | TASK [ovn-db : Set bootstrap args fact for SB (new cluster)] ******************* 2025-03-22 23:24:35.776158 | orchestrator 
| Saturday 22 March 2025 23:23:12 +0000 (0:00:00.978) 0:01:22.097 ******** 2025-03-22 23:24:35.776168 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:24:35.776178 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:24:35.776188 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:24:35.776198 | orchestrator | 2025-03-22 23:24:35.776208 | orchestrator | TASK [ovn-db : Check NB cluster status] **************************************** 2025-03-22 23:24:35.776218 | orchestrator | Saturday 22 March 2025 23:23:13 +0000 (0:00:00.523) 0:01:22.620 ******** 2025-03-22 23:24:35.776229 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:24:35.776239 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:24:35.776249 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:24:35.776259 | orchestrator | 2025-03-22 23:24:35.776269 | orchestrator | TASK [ovn-db : Check SB cluster status] **************************************** 2025-03-22 23:24:35.776279 | orchestrator | Saturday 22 March 2025 23:23:14 +0000 (0:00:00.912) 0:01:23.533 ******** 2025-03-22 23:24:35.776289 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:24:35.776299 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:24:35.776309 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:24:35.776319 | orchestrator | 2025-03-22 23:24:35.776329 | orchestrator | TASK [ovn-db : Remove an old node with the same ip address as the new node in NB DB] *** 2025-03-22 23:24:35.776344 | orchestrator | Saturday 22 March 2025 23:23:15 +0000 (0:00:01.153) 0:01:24.687 ******** 2025-03-22 23:24:35.776354 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:24:35.776364 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:24:35.776374 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:24:35.776385 | orchestrator | 2025-03-22 23:24:35.776395 | orchestrator | TASK [ovn-db : Remove an old node with the same ip address as the new node in SB DB] *** 2025-03-22 23:24:35.776411 | orchestrator | Saturday 22 March 2025 23:23:15 +0000 (0:00:00.732) 0:01:25.419 ******** 2025-03-22 23:24:35.776421 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:24:35.776431 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:24:35.776441 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:24:35.776451 | orchestrator | 2025-03-22 23:24:35.776461 | orchestrator | TASK [ovn-db : Set bootstrap args fact for NB (new member)] ******************** 2025-03-22 23:24:35.776471 | orchestrator | Saturday 22 March 2025 23:23:16 +0000 (0:00:00.752) 0:01:26.172 ******** 2025-03-22 23:24:35.776482 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:24:35.776496 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:24:35.776506 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:24:35.776532 | orchestrator | 2025-03-22 23:24:35.776544 | orchestrator | TASK [ovn-db : Set bootstrap args fact for SB (new member)] ******************** 2025-03-22 23:24:35.776554 | orchestrator | Saturday 22 March 2025 23:23:17 +0000 (0:00:00.981) 0:01:27.153 ******** 2025-03-22 23:24:35.776564 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:24:35.776574 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:24:35.776584 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:24:35.776594 | orchestrator | 2025-03-22 23:24:35.776604 | orchestrator | TASK [ovn-db : Ensuring config directories exist] ****************************** 2025-03-22 23:24:35.776614 | orchestrator | Saturday 22 March 2025 23:23:18 +0000 (0:00:00.940) 
0:01:28.094 ******** 2025-03-22 23:24:35.776625 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.776637 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.776654 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.776670 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.776680 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.776691 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.776706 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.776717 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 
2025-03-22 23:24:35.776727 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.776738 | orchestrator | 2025-03-22 23:24:35.776748 | orchestrator | TASK [ovn-db : Copying over config.json files for services] ******************** 2025-03-22 23:24:35.776758 | orchestrator | Saturday 22 March 2025 23:23:20 +0000 (0:00:02.271) 0:01:30.366 ******** 2025-03-22 23:24:35.776768 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.776779 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.776789 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.776804 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.776819 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.776829 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.776845 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.776855 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.776865 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.776876 | orchestrator | 2025-03-22 23:24:35.776886 | orchestrator | TASK [ovn-db : Check ovn containers] ******************************************* 2025-03-22 23:24:35.776897 | orchestrator | Saturday 22 March 2025 23:23:27 +0000 (0:00:06.387) 0:01:36.753 ******** 2025-03-22 23:24:35.776907 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.776920 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.776931 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.776946 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.776957 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.776967 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.776983 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.776993 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.777008 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.777018 | orchestrator | 2025-03-22 23:24:35.777028 | orchestrator | TASK [ovn-db : Flush handlers] ************************************************* 2025-03-22 23:24:35.777038 | orchestrator | Saturday 22 March 2025 23:23:30 +0000 (0:00:03.222) 0:01:39.975 ******** 2025-03-22 23:24:35.777049 | orchestrator | 2025-03-22 23:24:35.777059 | orchestrator | TASK [ovn-db : Flush handlers] ************************************************* 2025-03-22 23:24:35.777069 | orchestrator | Saturday 22 March 2025 23:23:30 +0000 (0:00:00.084) 0:01:40.060 ******** 2025-03-22 23:24:35.777079 | orchestrator | 2025-03-22 23:24:35.777089 | orchestrator | TASK [ovn-db : Flush handlers] ************************************************* 2025-03-22 23:24:35.777099 | orchestrator | Saturday 22 March 2025 23:23:30 +0000 (0:00:00.059) 0:01:40.120 ******** 2025-03-22 23:24:35.777110 | orchestrator | 2025-03-22 23:24:35.777120 | orchestrator | RUNNING HANDLER [ovn-db : Restart ovn-nb-db container] ************************* 2025-03-22 23:24:35.777130 | orchestrator | Saturday 22 March 2025 23:23:30 +0000 (0:00:00.238) 0:01:40.359 ******** 2025-03-22 23:24:35.777140 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:24:35.777150 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:24:35.777160 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:24:35.777170 | orchestrator | 2025-03-22 23:24:35.777180 | orchestrator | RUNNING HANDLER [ovn-db : Restart ovn-sb-db container] ************************* 2025-03-22 23:24:35.777194 | orchestrator | Saturday 22 March 2025 23:23:38 +0000 (0:00:07.633) 0:01:47.992 
******** 2025-03-22 23:24:35.777205 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:24:35.777215 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:24:35.777225 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:24:35.777235 | orchestrator | 2025-03-22 23:24:35.777245 | orchestrator | RUNNING HANDLER [ovn-db : Restart ovn-northd container] ************************ 2025-03-22 23:24:35.777255 | orchestrator | Saturday 22 March 2025 23:23:41 +0000 (0:00:03.161) 0:01:51.154 ******** 2025-03-22 23:24:35.777265 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:24:35.777275 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:24:35.777285 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:24:35.777296 | orchestrator | 2025-03-22 23:24:35.777306 | orchestrator | TASK [ovn-db : Wait for leader election] *************************************** 2025-03-22 23:24:35.777316 | orchestrator | Saturday 22 March 2025 23:23:49 +0000 (0:00:07.840) 0:01:58.994 ******** 2025-03-22 23:24:35.777326 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:24:35.777336 | orchestrator | 2025-03-22 23:24:35.777346 | orchestrator | TASK [ovn-db : Get OVN_Northbound cluster leader] ****************************** 2025-03-22 23:24:35.777365 | orchestrator | Saturday 22 March 2025 23:23:49 +0000 (0:00:00.285) 0:01:59.280 ******** 2025-03-22 23:24:35.777375 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:24:35.777385 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:24:35.777395 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:24:35.777406 | orchestrator | 2025-03-22 23:24:35.777420 | orchestrator | TASK [ovn-db : Configure OVN NB connection settings] *************************** 2025-03-22 23:24:35.777430 | orchestrator | Saturday 22 March 2025 23:23:51 +0000 (0:00:01.237) 0:02:00.518 ******** 2025-03-22 23:24:35.777441 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:24:35.777451 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:24:35.777461 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:24:35.777471 | orchestrator | 2025-03-22 23:24:35.777481 | orchestrator | TASK [ovn-db : Get OVN_Southbound cluster leader] ****************************** 2025-03-22 23:24:35.777491 | orchestrator | Saturday 22 March 2025 23:23:51 +0000 (0:00:00.633) 0:02:01.152 ******** 2025-03-22 23:24:35.777501 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:24:35.777511 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:24:35.777541 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:24:35.777558 | orchestrator | 2025-03-22 23:24:35.777575 | orchestrator | TASK [ovn-db : Configure OVN SB connection settings] *************************** 2025-03-22 23:24:35.777586 | orchestrator | Saturday 22 March 2025 23:23:52 +0000 (0:00:01.102) 0:02:02.254 ******** 2025-03-22 23:24:35.777596 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:24:35.777607 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:24:35.777617 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:24:35.777627 | orchestrator | 2025-03-22 23:24:35.777637 | orchestrator | TASK [ovn-db : Wait for ovn-nb-db] ********************************************* 2025-03-22 23:24:35.777647 | orchestrator | Saturday 22 March 2025 23:23:53 +0000 (0:00:00.688) 0:02:02.943 ******** 2025-03-22 23:24:35.777657 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:24:35.777667 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:24:35.777677 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:24:35.777687 | orchestrator | 
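
Note: in the run above, "Get OVN_Northbound cluster leader" succeeds on all three nodes, while "Configure OVN NB connection settings" reports changed only on testbed-node-0 and is skipped on the followers, i.e. the connection settings are applied on whichever node currently holds RAFT leadership. A minimal sketch of how that leadership can be checked by hand against the ovn_nb_db container deployed here — the use of docker exec and the ctl-socket path inside the container are assumptions and may differ per image:

    import subprocess

    # Assumed location of the OVN NB ovsdb-server control socket inside the
    # kolla ovn_nb_db container; adjust to the image actually in use.
    NB_CTL_SOCKET = "/var/run/ovn/ovnnb_db.ctl"

    def nb_is_leader(container: str = "ovn_nb_db") -> bool:
        """Return True if this node's NB ovsdb-server holds the RAFT leader role."""
        out = subprocess.run(
            ["docker", "exec", container,
             "ovs-appctl", "-t", NB_CTL_SOCKET,
             "cluster/status", "OVN_Northbound"],
            capture_output=True, text=True, check=True,
        ).stdout
        # cluster/status prints a "Role: leader" / "Role: follower" line
        return any(line.strip() == "Role: leader" for line in out.splitlines())

    if __name__ == "__main__":
        print("leader" if nb_is_leader() else "follower")

The same check against ovn_sb_db (database name OVN_Southbound) explains the matching skip pattern of the "Configure OVN SB connection settings" task below.
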
2025-03-22 23:24:35.777697 | orchestrator | TASK [ovn-db : Wait for ovn-sb-db] ********************************************* 2025-03-22 23:24:35.777707 | orchestrator | Saturday 22 March 2025 23:23:54 +0000 (0:00:01.220) 0:02:04.164 ******** 2025-03-22 23:24:35.777717 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:24:35.777727 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:24:35.777738 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:24:35.777747 | orchestrator | 2025-03-22 23:24:35.777758 | orchestrator | TASK [ovn-db : Unset bootstrap args fact] ************************************** 2025-03-22 23:24:35.777768 | orchestrator | Saturday 22 March 2025 23:23:55 +0000 (0:00:00.849) 0:02:05.014 ******** 2025-03-22 23:24:35.777778 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:24:35.777788 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:24:35.777797 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:24:35.777807 | orchestrator | 2025-03-22 23:24:35.777817 | orchestrator | TASK [ovn-db : Ensuring config directories exist] ****************************** 2025-03-22 23:24:35.777828 | orchestrator | Saturday 22 March 2025 23:23:56 +0000 (0:00:00.503) 0:02:05.517 ******** 2025-03-22 23:24:35.777838 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.777848 | orchestrator | ok: [testbed-node-1] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.777859 | orchestrator | ok: [testbed-node-2] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.777875 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.777886 | orchestrator | ok: [testbed-node-1] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.777896 | orchestrator | ok: [testbed-node-2] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.777912 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.777922 | orchestrator | ok: [testbed-node-1] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.777933 | orchestrator | ok: [testbed-node-2] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.777943 | orchestrator | 2025-03-22 23:24:35.777953 | orchestrator | TASK [ovn-db : Copying over config.json files for services] ******************** 2025-03-22 23:24:35.777963 | orchestrator | Saturday 22 March 2025 23:23:57 +0000 (0:00:01.706) 0:02:07.224 ******** 2025-03-22 23:24:35.777974 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.777984 | orchestrator | ok: [testbed-node-1] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.777999 | orchestrator | ok: [testbed-node-2] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.778037 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.778050 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.778061 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.778077 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.778088 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.778099 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.778109 | orchestrator | 2025-03-22 23:24:35.778119 | orchestrator | TASK [ovn-db : Check ovn containers] ******************************************* 2025-03-22 23:24:35.778129 | orchestrator | Saturday 22 March 2025 23:24:02 +0000 (0:00:04.831) 0:02:12.056 ******** 2025-03-22 23:24:35.778140 | orchestrator | ok: [testbed-node-1] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.778150 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-northd', 'value': {'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.778166 | orchestrator | ok: [testbed-node-2] => (item={'key': 'ovn-northd', 'value': 
{'container_name': 'ovn_northd', 'group': 'ovn-northd', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-northd:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-northd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.778176 | orchestrator | ok: [testbed-node-1] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.778191 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.778201 | orchestrator | ok: [testbed-node-2] => (item={'key': 'ovn-nb-db', 'value': {'container_name': 'ovn_nb_db', 'group': 'ovn-nb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-nb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-nb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_nb_db:/var/lib/openvswitch/ovn-nb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.778215 | orchestrator | ok: [testbed-node-1] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.778231 | orchestrator | ok: [testbed-node-2] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.778242 | orchestrator | ok: [testbed-node-0] => (item={'key': 'ovn-sb-db', 'value': {'container_name': 'ovn_sb_db', 'group': 'ovn-sb-db', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/ovn-sb-db-server:24.3.4.20241206', 'volumes': ['/etc/kolla/ovn-sb-db/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'ovn_sb_db:/var/lib/openvswitch/ovn-sb/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-03-22 23:24:35.778252 | orchestrator | 2025-03-22 23:24:35.778262 | orchestrator | TASK [ovn-db : Flush handlers] ************************************************* 2025-03-22 23:24:35.778272 | orchestrator | Saturday 22 March 2025 23:24:05 +0000 (0:00:03.295) 0:02:15.351 ******** 2025-03-22 23:24:35.778282 | orchestrator | 2025-03-22 23:24:35.778292 | orchestrator | TASK [ovn-db : Flush handlers] ************************************************* 2025-03-22 23:24:35.778303 | orchestrator | Saturday 22 March 2025 23:24:06 +0000 
(0:00:00.343) 0:02:15.695 ******** 2025-03-22 23:24:35.778313 | orchestrator | 2025-03-22 23:24:35.778323 | orchestrator | TASK [ovn-db : Flush handlers] ************************************************* 2025-03-22 23:24:35.778333 | orchestrator | Saturday 22 March 2025 23:24:06 +0000 (0:00:00.067) 0:02:15.763 ******** 2025-03-22 23:24:35.778343 | orchestrator | 2025-03-22 23:24:35.778353 | orchestrator | RUNNING HANDLER [ovn-db : Restart ovn-nb-db container] ************************* 2025-03-22 23:24:35.778364 | orchestrator | Saturday 22 March 2025 23:24:06 +0000 (0:00:00.070) 0:02:15.834 ******** 2025-03-22 23:24:35.778379 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:24:35.778389 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:24:35.778399 | orchestrator | 2025-03-22 23:24:35.778409 | orchestrator | RUNNING HANDLER [ovn-db : Restart ovn-sb-db container] ************************* 2025-03-22 23:24:35.778419 | orchestrator | Saturday 22 March 2025 23:24:13 +0000 (0:00:07.004) 0:02:22.838 ******** 2025-03-22 23:24:35.778429 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:24:35.778439 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:24:35.778449 | orchestrator | 2025-03-22 23:24:35.778459 | orchestrator | RUNNING HANDLER [ovn-db : Restart ovn-northd container] ************************ 2025-03-22 23:24:35.778470 | orchestrator | Saturday 22 March 2025 23:24:19 +0000 (0:00:06.473) 0:02:29.312 ******** 2025-03-22 23:24:35.778480 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:24:35.778490 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:24:35.778500 | orchestrator | 2025-03-22 23:24:35.778510 | orchestrator | TASK [ovn-db : Wait for leader election] *************************************** 2025-03-22 23:24:35.778560 | orchestrator | Saturday 22 March 2025 23:24:27 +0000 (0:00:07.319) 0:02:36.631 ******** 2025-03-22 23:24:35.778572 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:24:35.778582 | orchestrator | 2025-03-22 23:24:35.778593 | orchestrator | TASK [ovn-db : Get OVN_Northbound cluster leader] ****************************** 2025-03-22 23:24:35.778603 | orchestrator | Saturday 22 March 2025 23:24:27 +0000 (0:00:00.189) 0:02:36.821 ******** 2025-03-22 23:24:35.778613 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:24:35.778623 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:24:35.778633 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:24:35.778643 | orchestrator | 2025-03-22 23:24:35.778654 | orchestrator | TASK [ovn-db : Configure OVN NB connection settings] *************************** 2025-03-22 23:24:35.778664 | orchestrator | Saturday 22 March 2025 23:24:28 +0000 (0:00:00.885) 0:02:37.707 ******** 2025-03-22 23:24:35.778674 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:24:35.778684 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:24:35.778692 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:24:35.778701 | orchestrator | 2025-03-22 23:24:35.778709 | orchestrator | TASK [ovn-db : Get OVN_Southbound cluster leader] ****************************** 2025-03-22 23:24:35.778718 | orchestrator | Saturday 22 March 2025 23:24:28 +0000 (0:00:00.708) 0:02:38.415 ******** 2025-03-22 23:24:35.778726 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:24:35.778735 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:24:35.778744 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:24:35.778758 | orchestrator | 2025-03-22 23:24:35.778767 | orchestrator | TASK [ovn-db : Configure OVN SB connection 
settings] *************************** 2025-03-22 23:24:35.778776 | orchestrator | Saturday 22 March 2025 23:24:30 +0000 (0:00:01.390) 0:02:39.805 ******** 2025-03-22 23:24:35.778784 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:24:35.778793 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:24:35.778802 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:24:35.778811 | orchestrator | 2025-03-22 23:24:35.778820 | orchestrator | TASK [ovn-db : Wait for ovn-nb-db] ********************************************* 2025-03-22 23:24:35.778828 | orchestrator | Saturday 22 March 2025 23:24:31 +0000 (0:00:00.814) 0:02:40.620 ******** 2025-03-22 23:24:35.778837 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:24:35.778845 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:24:35.778853 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:24:35.778862 | orchestrator | 2025-03-22 23:24:35.778870 | orchestrator | TASK [ovn-db : Wait for ovn-sb-db] ********************************************* 2025-03-22 23:24:35.778879 | orchestrator | Saturday 22 March 2025 23:24:32 +0000 (0:00:00.928) 0:02:41.549 ******** 2025-03-22 23:24:35.778887 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:24:35.778896 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:24:35.778904 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:24:35.778913 | orchestrator | 2025-03-22 23:24:35.778921 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 23:24:35.778930 | orchestrator | testbed-node-0 : ok=44  changed=18  unreachable=0 failed=0 skipped=20  rescued=0 ignored=0 2025-03-22 23:24:35.778943 | orchestrator | testbed-node-1 : ok=43  changed=18  unreachable=0 failed=0 skipped=22  rescued=0 ignored=0 2025-03-22 23:24:35.778957 | orchestrator | testbed-node-2 : ok=43  changed=18  unreachable=0 failed=0 skipped=22  rescued=0 ignored=0 2025-03-22 23:24:38.815850 | orchestrator | testbed-node-3 : ok=12  changed=8  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:24:38.816000 | orchestrator | testbed-node-4 : ok=12  changed=8  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:24:38.816020 | orchestrator | testbed-node-5 : ok=12  changed=8  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-03-22 23:24:38.816035 | orchestrator | 2025-03-22 23:24:38.816050 | orchestrator | 2025-03-22 23:24:38.816065 | orchestrator | TASKS RECAP ******************************************************************** 2025-03-22 23:24:38.816080 | orchestrator | Saturday 22 March 2025 23:24:33 +0000 (0:00:01.532) 0:02:43.081 ******** 2025-03-22 23:24:38.816095 | orchestrator | =============================================================================== 2025-03-22 23:24:38.816109 | orchestrator | ovn-controller : Configure OVN in OVSDB -------------------------------- 22.27s 2025-03-22 23:24:38.816123 | orchestrator | ovn-controller : Restart ovn-controller container ---------------------- 17.80s 2025-03-22 23:24:38.816137 | orchestrator | ovn-db : Restart ovn-northd container ---------------------------------- 15.16s 2025-03-22 23:24:38.816151 | orchestrator | ovn-db : Restart ovn-nb-db container ----------------------------------- 14.64s 2025-03-22 23:24:38.816164 | orchestrator | ovn-db : Restart ovn-sb-db container ------------------------------------ 9.63s 2025-03-22 23:24:38.816178 | orchestrator | ovn-db : Copying over config.json files for services -------------------- 6.39s 2025-03-22 23:24:38.816192 | orchestrator | ovn-db : 
Copying over config.json files for services -------------------- 4.83s 2025-03-22 23:24:38.816206 | orchestrator | ovn-controller : Create br-int bridge on OpenvSwitch -------------------- 4.19s 2025-03-22 23:24:38.816229 | orchestrator | ovn-controller : Copying over config.json files for services ------------ 3.43s 2025-03-22 23:24:38.816244 | orchestrator | ovn-controller : Copying over systemd override -------------------------- 3.38s 2025-03-22 23:24:38.816258 | orchestrator | ovn-db : Check ovn containers ------------------------------------------- 3.30s 2025-03-22 23:24:38.816272 | orchestrator | ovn-db : Check ovn containers ------------------------------------------- 3.22s 2025-03-22 23:24:38.816285 | orchestrator | ovn-controller : include_tasks ------------------------------------------ 2.38s 2025-03-22 23:24:38.816300 | orchestrator | ovn-db : Ensuring config directories exist ------------------------------ 2.27s 2025-03-22 23:24:38.816314 | orchestrator | ovn-controller : Check ovn-controller containers ------------------------ 2.06s 2025-03-22 23:24:38.816327 | orchestrator | Group hosts based on enabled services ----------------------------------- 1.93s 2025-03-22 23:24:38.816341 | orchestrator | ovn-controller : Reload systemd config ---------------------------------- 1.88s 2025-03-22 23:24:38.816355 | orchestrator | ovn-controller : Ensuring systemd override directory exists ------------- 1.74s 2025-03-22 23:24:38.816370 | orchestrator | ovn-db : Ensuring config directories exist ------------------------------ 1.71s 2025-03-22 23:24:38.816386 | orchestrator | ovn-controller : Ensuring config directories exist ---------------------- 1.68s 2025-03-22 23:24:38.816402 | orchestrator | 2025-03-22 23:24:35 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:24:38.816417 | orchestrator | 2025-03-22 23:24:35 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:24:38.816433 | orchestrator | 2025-03-22 23:24:35 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:24:38.816464 | orchestrator | 2025-03-22 23:24:38 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:24:38.816682 | orchestrator | 2025-03-22 23:24:38 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:24:38.818772 | orchestrator | 2025-03-22 23:24:38 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:24:41.870693 | orchestrator | 2025-03-22 23:24:38 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:24:41.870830 | orchestrator | 2025-03-22 23:24:41 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:24:41.873798 | orchestrator | 2025-03-22 23:24:41 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:24:41.873832 | orchestrator | 2025-03-22 23:24:41 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:24:44.931159 | orchestrator | 2025-03-22 23:24:41 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:24:44.931291 | orchestrator | 2025-03-22 23:24:44 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:24:44.937169 | orchestrator | 2025-03-22 23:24:44 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:24:44.938365 | orchestrator | 2025-03-22 23:24:44 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:24:47.993582 | 
orchestrator | 2025-03-22 23:24:44 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:24:47.993702 | orchestrator | 2025-03-22 23:24:47 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:24:48.000135 | orchestrator | 2025-03-22 23:24:47 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:24:48.006854 | orchestrator | 2025-03-22 23:24:48 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:24:48.010913 | orchestrator | 2025-03-22 23:24:48 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:24:51.077915 | orchestrator | 2025-03-22 23:24:51 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:24:51.078350 | orchestrator | 2025-03-22 23:24:51 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:24:51.079734 | orchestrator | 2025-03-22 23:24:51 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:24:54.142124 | orchestrator | 2025-03-22 23:24:51 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:24:54.142270 | orchestrator | 2025-03-22 23:24:54 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:24:54.144449 | orchestrator | 2025-03-22 23:24:54 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:24:54.144708 | orchestrator | 2025-03-22 23:24:54 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:24:54.144741 | orchestrator | 2025-03-22 23:24:54 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:24:57.199422 | orchestrator | 2025-03-22 23:24:57 | INFO  | Task ef8d6985-e3b3-4b68-bb95-4064cec21081 is in state STARTED 2025-03-22 23:24:57.201207 | orchestrator | 2025-03-22 23:24:57 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:24:57.203736 | orchestrator | 2025-03-22 23:24:57 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:24:57.207699 | orchestrator | 2025-03-22 23:24:57 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:24:57.209798 | orchestrator | 2025-03-22 23:24:57 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:25:00.289323 | orchestrator | 2025-03-22 23:25:00 | INFO  | Task ef8d6985-e3b3-4b68-bb95-4064cec21081 is in state STARTED 2025-03-22 23:25:00.290389 | orchestrator | 2025-03-22 23:25:00 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:25:00.293170 | orchestrator | 2025-03-22 23:25:00 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:25:00.294662 | orchestrator | 2025-03-22 23:25:00 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:25:00.294802 | orchestrator | 2025-03-22 23:25:00 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:25:03.355837 | orchestrator | 2025-03-22 23:25:03 | INFO  | Task ef8d6985-e3b3-4b68-bb95-4064cec21081 is in state STARTED 2025-03-22 23:25:03.356667 | orchestrator | 2025-03-22 23:25:03 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:25:03.358149 | orchestrator | 2025-03-22 23:25:03 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED 2025-03-22 23:25:03.360609 | orchestrator | 2025-03-22 23:25:03 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:25:06.420193 | orchestrator | 2025-03-22 
23:25:03 | INFO  | Wait 1 second(s) until the next check
2025-03-22 23:25:06.420316 | orchestrator | 2025-03-22 23:25:06 | INFO  | Task ef8d6985-e3b3-4b68-bb95-4064cec21081 is in state STARTED
2025-03-22 23:25:06.420710 | orchestrator | 2025-03-22 23:25:06 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED
2025-03-22 23:25:06.420738 | orchestrator | 2025-03-22 23:25:06 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state STARTED
2025-03-22 23:25:06.421735 | orchestrator | 2025-03-22 23:25:06 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED
2025-03-22 23:25:12.533563 | orchestrator | 2025-03-22 23:25:12 | INFO  | Task ef8d6985-e3b3-4b68-bb95-4064cec21081 is in state SUCCESS
[... the same check repeats roughly every three seconds from 23:25:15 through 23:29:11: the remaining tasks dae9416f-7b86-4a73-a36d-e40c86da8e45, 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 and 1624f071-cf95-4c64-9d5a-3c1b7bf4493d are each reported "is in state STARTED", followed by "Wait 1 second(s) until the next check" ...]
2025-03-22 23:29:14.151772 | orchestrator | 2025-03-22 23:29:11 | INFO  | Wait 1 second(s) until the next check
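Editor's note: the block above is the deploy CLI's task watcher. The Kolla deploy jobs were submitted asynchronously, and the client simply re-reads each task's state once per second until every task has left the running state. Below is a minimal, self-contained sketch of that kind of polling loop; get_task_state is a hypothetical stand-in for the real result-backend lookup (it is stubbed so the sketch terminates), and the interval and state names simply mirror what the log prints.

import time

# Hypothetical stand-in for the real lookup against the task result backend;
# stubbed here so the sketch is self-contained and terminates immediately.
def get_task_state(task_id: str) -> str:
    return "SUCCESS"

def wait_for_tasks(task_ids, check_interval=1.0):
    """Re-check every task until none of them is still running."""
    pending = set(task_ids)
    while pending:
        for task_id in sorted(pending):
            state = get_task_state(task_id)
            print(f"Task {task_id} is in state {state}")
            if state in ("SUCCESS", "FAILURE"):
                pending.discard(task_id)
        if pending:
            print(f"Wait {check_interval:.0f} second(s) until the next check")
            time.sleep(check_interval)

# The four task UUIDs visible in the log above.
wait_for_tasks([
    "ef8d6985-e3b3-4b68-bb95-4064cec21081",
    "dae9416f-7b86-4a73-a36d-e40c86da8e45",
    "4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2",
    "1624f071-cf95-4c64-9d5a-3c1b7bf4493d",
])

2025-03-22 23:29:14.151904 | orchestrator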
| 2025-03-22 23:29:14 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:29:14.165239 | orchestrator | 2025-03-22 23:29:14 | INFO  | Task a0acdba7-ece3-4524-8358-03c7dcbf9399 is in state STARTED 2025-03-22 23:29:14.165295 | orchestrator | 2025-03-22 23:29:14.165311 | orchestrator | None 2025-03-22 23:29:14.165326 | orchestrator | 2025-03-22 23:29:14.165341 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-03-22 23:29:14.165355 | orchestrator | 2025-03-22 23:29:14.165369 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-03-22 23:29:14.165384 | orchestrator | Saturday 22 March 2025 23:20:05 +0000 (0:00:00.642) 0:00:00.642 ******** 2025-03-22 23:29:14.165398 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:29:14.165414 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:29:14.165428 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:29:14.165469 | orchestrator | 2025-03-22 23:29:14.165484 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-03-22 23:29:14.165497 | orchestrator | Saturday 22 March 2025 23:20:06 +0000 (0:00:00.586) 0:00:01.229 ******** 2025-03-22 23:29:14.165513 | orchestrator | ok: [testbed-node-0] => (item=enable_loadbalancer_True) 2025-03-22 23:29:14.165527 | orchestrator | ok: [testbed-node-1] => (item=enable_loadbalancer_True) 2025-03-22 23:29:14.165566 | orchestrator | ok: [testbed-node-2] => (item=enable_loadbalancer_True) 2025-03-22 23:29:14.165581 | orchestrator | 2025-03-22 23:29:14.165595 | orchestrator | PLAY [Apply role loadbalancer] ************************************************* 2025-03-22 23:29:14.165608 | orchestrator | 2025-03-22 23:29:14.165622 | orchestrator | TASK [loadbalancer : include_tasks] ******************************************** 2025-03-22 23:29:14.165636 | orchestrator | Saturday 22 March 2025 23:20:06 +0000 (0:00:00.445) 0:00:01.674 ******** 2025-03-22 23:29:14.165650 | orchestrator | included: /ansible/roles/loadbalancer/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.165664 | orchestrator | 2025-03-22 23:29:14.165678 | orchestrator | TASK [loadbalancer : Check IPv6 support] *************************************** 2025-03-22 23:29:14.165692 | orchestrator | Saturday 22 March 2025 23:20:08 +0000 (0:00:01.532) 0:00:03.206 ******** 2025-03-22 23:29:14.165706 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:29:14.165719 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:29:14.165733 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:29:14.165747 | orchestrator | 2025-03-22 23:29:14.165761 | orchestrator | TASK [Setting sysctl values] *************************************************** 2025-03-22 23:29:14.165775 | orchestrator | Saturday 22 March 2025 23:20:10 +0000 (0:00:01.901) 0:00:05.108 ******** 2025-03-22 23:29:14.165789 | orchestrator | included: sysctl for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.165803 | orchestrator | 2025-03-22 23:29:14.165816 | orchestrator | TASK [sysctl : Check IPv6 support] ********************************************* 2025-03-22 23:29:14.165831 | orchestrator | Saturday 22 March 2025 23:20:11 +0000 (0:00:01.662) 0:00:06.771 ******** 2025-03-22 23:29:14.165846 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:29:14.165861 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:29:14.165876 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:29:14.165892 | 
orchestrator | 2025-03-22 23:29:14.165907 | orchestrator | TASK [sysctl : Setting sysctl values] ****************************************** 2025-03-22 23:29:14.165923 | orchestrator | Saturday 22 March 2025 23:20:13 +0000 (0:00:01.490) 0:00:08.261 ******** 2025-03-22 23:29:14.165937 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv6.ip_nonlocal_bind', 'value': 1}) 2025-03-22 23:29:14.165970 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv6.ip_nonlocal_bind', 'value': 1}) 2025-03-22 23:29:14.165986 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv6.ip_nonlocal_bind', 'value': 1}) 2025-03-22 23:29:14.166001 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.ip_nonlocal_bind', 'value': 1}) 2025-03-22 23:29:14.166068 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.ip_nonlocal_bind', 'value': 1}) 2025-03-22 23:29:14.166085 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.ip_nonlocal_bind', 'value': 1}) 2025-03-22 23:29:14.166101 | orchestrator | ok: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_retries2', 'value': 'KOLLA_UNSET'}) 2025-03-22 23:29:14.166118 | orchestrator | ok: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_retries2', 'value': 'KOLLA_UNSET'}) 2025-03-22 23:29:14.166134 | orchestrator | ok: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_retries2', 'value': 'KOLLA_UNSET'}) 2025-03-22 23:29:14.166149 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.unix.max_dgram_qlen', 'value': 128}) 2025-03-22 23:29:14.166165 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.unix.max_dgram_qlen', 'value': 128}) 2025-03-22 23:29:14.166181 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.unix.max_dgram_qlen', 'value': 128}) 2025-03-22 23:29:14.166206 | orchestrator | 2025-03-22 23:29:14.166221 | orchestrator | TASK [module-load : Load modules] ********************************************** 2025-03-22 23:29:14.166241 | orchestrator | Saturday 22 March 2025 23:20:16 +0000 (0:00:03.348) 0:00:11.610 ******** 2025-03-22 23:29:14.166255 | orchestrator | changed: [testbed-node-0] => (item=ip_vs) 2025-03-22 23:29:14.166269 | orchestrator | changed: [testbed-node-1] => (item=ip_vs) 2025-03-22 23:29:14.166283 | orchestrator | changed: [testbed-node-2] => (item=ip_vs) 2025-03-22 23:29:14.166297 | orchestrator | 2025-03-22 23:29:14.166311 | orchestrator | TASK [module-load : Persist modules via modules-load.d] ************************ 2025-03-22 23:29:14.166325 | orchestrator | Saturday 22 March 2025 23:20:17 +0000 (0:00:01.155) 0:00:12.766 ******** 2025-03-22 23:29:14.166339 | orchestrator | changed: [testbed-node-1] => (item=ip_vs) 2025-03-22 23:29:14.166358 | orchestrator | changed: [testbed-node-0] => (item=ip_vs) 2025-03-22 23:29:14.166372 | orchestrator | changed: [testbed-node-2] => (item=ip_vs) 2025-03-22 23:29:14.166386 | orchestrator | 2025-03-22 23:29:14.166400 | orchestrator | TASK [module-load : Drop module persistence] *********************************** 2025-03-22 23:29:14.166414 | orchestrator | Saturday 22 March 2025 23:20:20 +0000 (0:00:02.777) 0:00:15.543 ******** 2025-03-22 23:29:14.166428 | orchestrator | skipping: [testbed-node-0] => (item=ip_vs)  2025-03-22 23:29:14.166443 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.166466 | orchestrator | skipping: [testbed-node-1] => (item=ip_vs)  2025-03-22 23:29:14.166480 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.166494 | 
orchestrator | skipping: [testbed-node-2] => (item=ip_vs)  2025-03-22 23:29:14.166508 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.166522 | orchestrator | 2025-03-22 23:29:14.166569 | orchestrator | TASK [loadbalancer : Ensuring config directories exist] ************************ 2025-03-22 23:29:14.166584 | orchestrator | Saturday 22 March 2025 23:20:21 +0000 (0:00:01.206) 0:00:16.750 ******** 2025-03-22 23:29:14.166601 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2025-03-22 23:29:14.166642 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2025-03-22 23:29:14.166658 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2025-03-22 23:29:14.166673 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-22 23:29:14.166696 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-22 23:29:14.166903 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-22 23:29:14.166922 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-03-22 23:29:14.166938 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-22 23:29:14.166954 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-03-22 23:29:14.166968 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 
2985'], 'timeout': '30'}}})  2025-03-22 23:29:14.166991 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-03-22 23:29:14.167007 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-22 23:29:14.167021 | orchestrator | 2025-03-22 23:29:14.167035 | orchestrator | TASK [loadbalancer : Ensuring haproxy service config subdir exists] ************ 2025-03-22 23:29:14.167049 | orchestrator | Saturday 22 March 2025 23:20:25 +0000 (0:00:03.443) 0:00:20.194 ******** 2025-03-22 23:29:14.167063 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.167078 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.167092 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.167106 | orchestrator | 2025-03-22 23:29:14.167126 | orchestrator | TASK [loadbalancer : Ensuring proxysql service config subdirectories exist] **** 2025-03-22 23:29:14.167141 | orchestrator | Saturday 22 March 2025 23:20:30 +0000 (0:00:05.727) 0:00:25.921 ******** 2025-03-22 23:29:14.167155 | orchestrator | changed: [testbed-node-0] => (item=users) 2025-03-22 23:29:14.167169 | orchestrator | changed: [testbed-node-2] => (item=users) 2025-03-22 23:29:14.167183 | orchestrator | changed: [testbed-node-1] => (item=users) 2025-03-22 23:29:14.167197 | orchestrator | changed: [testbed-node-0] => (item=rules) 2025-03-22 23:29:14.167211 | orchestrator | changed: [testbed-node-2] => (item=rules) 2025-03-22 23:29:14.167225 | orchestrator | changed: [testbed-node-1] => (item=rules) 2025-03-22 23:29:14.167239 | orchestrator | 2025-03-22 23:29:14.167252 | orchestrator | TASK [loadbalancer : Ensuring keepalived checks subdir exists] ***************** 2025-03-22 23:29:14.167266 | orchestrator | Saturday 22 March 2025 23:20:37 +0000 (0:00:06.433) 0:00:32.355 ******** 2025-03-22 23:29:14.167280 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.167294 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.167308 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.167322 | orchestrator | 2025-03-22 23:29:14.167336 | orchestrator | TASK [loadbalancer : Remove mariadb.cfg if proxysql enabled] ******************* 2025-03-22 23:29:14.167350 | orchestrator | Saturday 22 March 2025 23:20:39 +0000 (0:00:02.403) 0:00:34.759 ******** 2025-03-22 23:29:14.167364 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:29:14.167378 | orchestrator | ok: [testbed-node-2] 2025-03-22 
23:29:14.167392 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:29:14.167405 | orchestrator | 2025-03-22 23:29:14.167420 | orchestrator | TASK [loadbalancer : Removing checks for services which are disabled] ********** 2025-03-22 23:29:14.167434 | orchestrator | Saturday 22 March 2025 23:20:42 +0000 (0:00:02.435) 0:00:37.194 ******** 2025-03-22 23:29:14.167448 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2025-03-22 23:29:14.167471 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-03-22 23:29:14.167486 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2025-03-22 23:29:14.167501 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2025-03-22 23:29:14.167589 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 
'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-03-22 23:29:14.167666 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-03-22 23:29:14.167683 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-03-22 23:29:14.167707 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-03-22 23:29:14.167722 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-22 23:29:14.167736 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.167751 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-03-22 23:29:14.167766 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 
'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-22 23:29:14.167780 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.167802 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-22 23:29:14.167817 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.167831 | orchestrator | 2025-03-22 23:29:14.167845 | orchestrator | TASK [loadbalancer : Copying checks for services which are enabled] ************ 2025-03-22 23:29:14.167866 | orchestrator | Saturday 22 March 2025 23:20:46 +0000 (0:00:04.278) 0:00:41.473 ******** 2025-03-22 23:29:14.167881 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2025-03-22 23:29:14.167895 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2025-03-22 23:29:14.167909 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 
'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2025-03-22 23:29:14.167924 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-22 23:29:14.167945 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-22 23:29:14.167961 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-03-22 23:29:14.167976 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-22 23:29:14.167997 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 
2985'], 'timeout': '30'}}})  2025-03-22 23:29:14.168012 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-03-22 23:29:14.168026 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-03-22 23:29:14.168041 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-22 23:29:14.168065 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-22 23:29:14.168080 | orchestrator | 2025-03-22 23:29:14.168094 | orchestrator | TASK [loadbalancer : Copying over config.json files for services] ************** 2025-03-22 23:29:14.168114 | orchestrator | Saturday 22 March 2025 23:20:53 +0000 (0:00:07.046) 0:00:48.519 ******** 2025-03-22 23:29:14.168154 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl 
http://192.168.16.10:61313'], 'timeout': '30'}}}) 2025-03-22 23:29:14.168172 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2025-03-22 23:29:14.168187 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2025-03-22 23:29:14.168202 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-22 23:29:14.168216 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-22 23:29:14 | INFO  | Task 4d6b7c8d-bb6f-4d2c-a72c-c1ba96763ff2 is in state SUCCESS 2025-03-22 23:29:14.168239 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-22 23:29:14.169141 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-03-22 23:29:14.169158 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-22 23:29:14.169174 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-03-22 23:29:14.169188 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-22 23:29:14.169203 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-03-22 23:29:14.169218 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-22 23:29:14.169247 | orchestrator | 2025-03-22 23:29:14.169262 | orchestrator | TASK [loadbalancer : Copying over haproxy.cfg] ********************************* 2025-03-22 23:29:14.169276 | orchestrator | Saturday 22 March 2025 23:20:58 +0000 (0:00:04.637) 0:00:53.157 ******** 2025-03-22 23:29:14.169297 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_main.cfg.j2) 2025-03-22 23:29:14.169312 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_main.cfg.j2) 2025-03-22 23:29:14.169326 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_main.cfg.j2) 2025-03-22 23:29:14.169339 | orchestrator | 2025-03-22 23:29:14.169353 | orchestrator | TASK [loadbalancer : Copying over proxysql config] ***************************** 2025-03-22 23:29:14.169366 | orchestrator | Saturday 22 March 2025 23:21:01 +0000 (0:00:03.402) 0:00:56.559 ******** 2025-03-22 23:29:14.169380 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql.yaml.j2) 2025-03-22 23:29:14.169393 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql.yaml.j2) 2025-03-22 23:29:14.169407 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql.yaml.j2) 2025-03-22 23:29:14.169420 | orchestrator | 2025-03-22 23:29:14.169434 | orchestrator | TASK [loadbalancer : Copying over haproxy single external frontend config] ***** 2025-03-22 23:29:14.169482 | orchestrator | Saturday 22 March 2025 23:21:07 +0000 (0:00:06.054) 0:01:02.614 ******** 2025-03-22 23:29:14.169498 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.169596 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.169616 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.169629 | orchestrator | 2025-03-22 23:29:14.169641 | orchestrator | TASK [loadbalancer : Copying over custom haproxy services configuration] ******* 2025-03-22 23:29:14.169654 | orchestrator | Saturday 22 March 2025 23:21:11 +0000 (0:00:04.012) 0:01:06.626 ******** 2025-03-22 23:29:14.169667 | orchestrator | changed: [testbed-node-1] => (item=/opt/configuration/environments/kolla/files/overlays/haproxy/services.d/haproxy.cfg) 2025-03-22 23:29:14.169683 | orchestrator | changed: [testbed-node-0] => (item=/opt/configuration/environments/kolla/files/overlays/haproxy/services.d/haproxy.cfg) 2025-03-22 23:29:14.169696 | orchestrator | changed: [testbed-node-2] => (item=/opt/configuration/environments/kolla/files/overlays/haproxy/services.d/haproxy.cfg) 2025-03-22 23:29:14.169710 | orchestrator | 2025-03-22 23:29:14.169794 | orchestrator | TASK [loadbalancer : Copying over keepalived.conf] ***************************** 2025-03-22 23:29:14.169809 | orchestrator | Saturday 22 March 2025 23:21:14 +0000 (0:00:02.926) 0:01:09.553 ******** 2025-03-22 23:29:14.169823 | orchestrator | changed: [testbed-node-1] => 
(item=/ansible/roles/loadbalancer/templates/keepalived/keepalived.conf.j2) 2025-03-22 23:29:14.169837 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/keepalived/keepalived.conf.j2) 2025-03-22 23:29:14.169851 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/keepalived/keepalived.conf.j2) 2025-03-22 23:29:14.169864 | orchestrator | 2025-03-22 23:29:14.169878 | orchestrator | TASK [loadbalancer : Copying over haproxy.pem] ********************************* 2025-03-22 23:29:14.169891 | orchestrator | Saturday 22 March 2025 23:21:17 +0000 (0:00:03.200) 0:01:12.753 ******** 2025-03-22 23:29:14.169905 | orchestrator | changed: [testbed-node-1] => (item=haproxy.pem) 2025-03-22 23:29:14.169926 | orchestrator | changed: [testbed-node-0] => (item=haproxy.pem) 2025-03-22 23:29:14.169940 | orchestrator | changed: [testbed-node-2] => (item=haproxy.pem) 2025-03-22 23:29:14.169954 | orchestrator | 2025-03-22 23:29:14.169967 | orchestrator | TASK [loadbalancer : Copying over haproxy-internal.pem] ************************ 2025-03-22 23:29:14.169981 | orchestrator | Saturday 22 March 2025 23:21:21 +0000 (0:00:03.493) 0:01:16.247 ******** 2025-03-22 23:29:14.169995 | orchestrator | changed: [testbed-node-1] => (item=haproxy-internal.pem) 2025-03-22 23:29:14.170009 | orchestrator | changed: [testbed-node-0] => (item=haproxy-internal.pem) 2025-03-22 23:29:14.170064 | orchestrator | changed: [testbed-node-2] => (item=haproxy-internal.pem) 2025-03-22 23:29:14.170077 | orchestrator | 2025-03-22 23:29:14.170095 | orchestrator | TASK [loadbalancer : include_tasks] ******************************************** 2025-03-22 23:29:14.170108 | orchestrator | Saturday 22 March 2025 23:21:23 +0000 (0:00:02.736) 0:01:18.984 ******** 2025-03-22 23:29:14.170120 | orchestrator | included: /ansible/roles/loadbalancer/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.170131 | orchestrator | 2025-03-22 23:29:14.170152 | orchestrator | TASK [service-cert-copy : loadbalancer | Copying over extra CA certificates] *** 2025-03-22 23:29:14.170163 | orchestrator | Saturday 22 March 2025 23:21:25 +0000 (0:00:01.183) 0:01:20.167 ******** 2025-03-22 23:29:14.170177 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2025-03-22 23:29:14.170197 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2025-03-22 23:29:14.170230 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2025-03-22 23:29:14.170242 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-22 23:29:14.170261 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-22 23:29:14.170279 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-22 23:29:14.170341 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-03-22 23:29:14.170363 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-03-22 23:29:14.170375 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-03-22 23:29:14.170385 | orchestrator | 2025-03-22 23:29:14.170396 | orchestrator | TASK [service-cert-copy : loadbalancer | Copying over backend internal TLS certificate] *** 2025-03-22 23:29:14.170406 | orchestrator | Saturday 22 March 2025 23:21:29 +0000 (0:00:04.177) 0:01:24.345 ******** 2025-03-22 23:29:14.170416 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2025-03-22 23:29:14.170468 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-03-22 23:29:14.170489 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2025-03-22 23:29:14.170504 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 
'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-03-22 23:29:14.170515 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-03-22 23:29:14.170526 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.170559 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-03-22 23:29:14.170570 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.170581 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2025-03-22 23:29:14.170592 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-03-22 23:29:14.170602 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 
'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-03-22 23:29:14.170619 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.170629 | orchestrator | 2025-03-22 23:29:14.170640 | orchestrator | TASK [service-cert-copy : loadbalancer | Copying over backend internal TLS key] *** 2025-03-22 23:29:14.170650 | orchestrator | Saturday 22 March 2025 23:21:30 +0000 (0:00:01.312) 0:01:25.657 ******** 2025-03-22 23:29:14.170665 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2025-03-22 23:29:14.170676 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-03-22 23:29:14.170691 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-03-22 23:29:14.170702 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.170712 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2025-03-22 23:29:14.170723 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 
'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-03-22 23:29:14.170733 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-03-22 23:29:14.170748 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.170759 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2025-03-22 23:29:14.170774 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-03-22 23:29:14.170794 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-03-22 23:29:14.170806 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.170837 | orchestrator | 2025-03-22 23:29:14.170848 | orchestrator | TASK [loadbalancer : Copying over haproxy start script] ************************ 2025-03-22 23:29:14.170863 | orchestrator | Saturday 22 March 2025 23:21:31 +0000 (0:00:01.325) 0:01:26.983 ******** 2025-03-22 23:29:14.170873 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_run.sh.j2) 2025-03-22 23:29:14.170884 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_run.sh.j2) 2025-03-22 23:29:14.170894 | orchestrator | changed: [testbed-node-0] => 
(item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_run.sh.j2) 2025-03-22 23:29:14.170904 | orchestrator | 2025-03-22 23:29:14.170914 | orchestrator | TASK [loadbalancer : Copying over proxysql start script] *********************** 2025-03-22 23:29:14.170925 | orchestrator | Saturday 22 March 2025 23:21:34 +0000 (0:00:02.326) 0:01:29.309 ******** 2025-03-22 23:29:14.170965 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql_run.sh.j2) 2025-03-22 23:29:14.170976 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql_run.sh.j2) 2025-03-22 23:29:14.170986 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql_run.sh.j2) 2025-03-22 23:29:14.170996 | orchestrator | 2025-03-22 23:29:14.171007 | orchestrator | TASK [loadbalancer : Copying files for haproxy-ssh] **************************** 2025-03-22 23:29:14.171017 | orchestrator | Saturday 22 March 2025 23:21:38 +0000 (0:00:03.847) 0:01:33.157 ******** 2025-03-22 23:29:14.171099 | orchestrator | skipping: [testbed-node-0] => (item={'src': 'haproxy-ssh/sshd_config.j2', 'dest': 'sshd_config'})  2025-03-22 23:29:14.171111 | orchestrator | skipping: [testbed-node-1] => (item={'src': 'haproxy-ssh/sshd_config.j2', 'dest': 'sshd_config'})  2025-03-22 23:29:14.171122 | orchestrator | skipping: [testbed-node-2] => (item={'src': 'haproxy-ssh/sshd_config.j2', 'dest': 'sshd_config'})  2025-03-22 23:29:14.171132 | orchestrator | skipping: [testbed-node-0] => (item={'src': 'haproxy-ssh/id_rsa.pub', 'dest': 'id_rsa.pub'})  2025-03-22 23:29:14.171143 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.171153 | orchestrator | skipping: [testbed-node-1] => (item={'src': 'haproxy-ssh/id_rsa.pub', 'dest': 'id_rsa.pub'})  2025-03-22 23:29:14.171163 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.171174 | orchestrator | skipping: [testbed-node-2] => (item={'src': 'haproxy-ssh/id_rsa.pub', 'dest': 'id_rsa.pub'})  2025-03-22 23:29:14.171184 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.171194 | orchestrator | 2025-03-22 23:29:14.171205 | orchestrator | TASK [loadbalancer : Check loadbalancer containers] **************************** 2025-03-22 23:29:14.171215 | orchestrator | Saturday 22 March 2025 23:21:40 +0000 (0:00:02.265) 0:01:35.423 ******** 2025-03-22 23:29:14.171226 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2025-03-22 23:29:14.171237 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 
'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2025-03-22 23:29:14.171248 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/haproxy:2.4.24.20241206', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2025-03-22 23:29:14.171265 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-22 23:29:14.171280 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-22 23:29:14.171298 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/proxysql:2.6.6.20241206', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-03-22 23:29:14.171308 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-03-22 23:29:14.171319 | 
orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-22 23:29:14.171330 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-03-22 23:29:14.171346 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-22 23:29:14.171357 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keepalived:2.2.4.20241206', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-03-22 23:29:14.171374 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/haproxy-ssh:8.9.20241206', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7', '__omit_place_holder__c75089e97c19782832f07568cea4aef62f4e75f7'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-03-22 23:29:14.171384 | orchestrator | 2025-03-22 23:29:14.171395 | orchestrator | TASK [include_role : aodh] ***************************************************** 2025-03-22 23:29:14.171405 | orchestrator | Saturday 22 March 2025 23:21:43 +0000 
(0:00:03.328) 0:01:38.752 ******** 2025-03-22 23:29:14.171416 | orchestrator | included: aodh for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.171426 | orchestrator | 2025-03-22 23:29:14.171436 | orchestrator | TASK [haproxy-config : Copying over aodh haproxy config] *********************** 2025-03-22 23:29:14.171447 | orchestrator | Saturday 22 March 2025 23:21:45 +0000 (0:00:01.258) 0:01:40.010 ******** 2025-03-22 23:29:14.171457 | orchestrator | changed: [testbed-node-0] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-api:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}}}}) 2025-03-22 23:29:14.171489 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-evaluator:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2025-03-22 23:29:14.171501 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-listener:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.171593 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-notifier:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.171616 | orchestrator | changed: [testbed-node-1] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-api:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}}}}) 2025-03-22 23:29:14.171628 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-evaluator:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2025-03-22 23:29:14.171638 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-listener:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.171649 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-notifier:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.171660 | orchestrator | changed: [testbed-node-2] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-api:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}}}}) 2025-03-22 23:29:14.171687 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/aodh-evaluator:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2025-03-22 23:29:14.171705 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-listener:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.171716 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-notifier:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.171726 | orchestrator | 2025-03-22 23:29:14.171737 | orchestrator | TASK [haproxy-config : Add configuration for aodh when using single external frontend] *** 2025-03-22 23:29:14.171787 | orchestrator | Saturday 22 March 2025 23:21:52 +0000 (0:00:07.608) 0:01:47.619 ******** 2025-03-22 23:29:14.171799 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-api:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}}}})  2025-03-22 23:29:14.171810 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-evaluator:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2025-03-22 23:29:14.171820 | orchestrator | skipping: [testbed-node-0] => 
(item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-listener:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.171851 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-notifier:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.171863 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.171874 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-api:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}}}})  2025-03-22 23:29:14.171885 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-evaluator:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2025-03-22 23:29:14.171895 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-listener:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.171906 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 
'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-notifier:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.171916 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.171948 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-api:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}}}})  2025-03-22 23:29:14.171973 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-evaluator:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2025-03-22 23:29:14.171985 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-listener:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.171995 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/aodh-notifier:18.0.1.20241206', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.172006 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.172016 | orchestrator | 2025-03-22 23:29:14.172027 | orchestrator | TASK [haproxy-config : Configuring firewall for aodh] ************************** 2025-03-22 23:29:14.172037 | 
orchestrator | Saturday 22 March 2025 23:21:53 +0000 (0:00:01.191) 0:01:48.810 ******** 2025-03-22 23:29:14.172048 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}})  2025-03-22 23:29:14.172060 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}})  2025-03-22 23:29:14.172070 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.172081 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}})  2025-03-22 23:29:14.172091 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}})  2025-03-22 23:29:14.172102 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}})  2025-03-22 23:29:14.172112 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.172127 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}})  2025-03-22 23:29:14.172137 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.172147 | orchestrator | 2025-03-22 23:29:14.172163 | orchestrator | TASK [proxysql-config : Copying over aodh ProxySQL users config] *************** 2025-03-22 23:29:14.172173 | orchestrator | Saturday 22 March 2025 23:21:55 +0000 (0:00:02.055) 0:01:50.865 ******** 2025-03-22 23:29:14.172183 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.172193 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.172203 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.172213 | orchestrator | 2025-03-22 23:29:14.172223 | orchestrator | TASK [proxysql-config : Copying over aodh ProxySQL rules config] *************** 2025-03-22 23:29:14.172234 | orchestrator | Saturday 22 March 2025 23:21:57 +0000 (0:00:01.699) 0:01:52.565 ******** 2025-03-22 23:29:14.172244 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.172254 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.172264 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.172274 | orchestrator | 2025-03-22 23:29:14.172284 | orchestrator | TASK [include_role : barbican] ************************************************* 2025-03-22 23:29:14.172294 | orchestrator | Saturday 22 March 2025 23:22:01 +0000 (0:00:03.568) 0:01:56.134 ******** 2025-03-22 23:29:14.172304 | orchestrator | included: barbican for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.172314 | orchestrator | 2025-03-22 23:29:14.172324 | orchestrator | TASK [haproxy-config : Copying over barbican haproxy config] ******************* 2025-03-22 23:29:14.172335 | orchestrator | Saturday 22 March 2025 23:22:02 +0000 (0:00:00.904) 0:01:57.038 ******** 2025-03-22 23:29:14.172351 | orchestrator | changed: [testbed-node-0] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 
'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-api:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.172371 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-keystone-listener:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.172438 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-worker:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.172450 | orchestrator | changed: [testbed-node-1] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-api:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.172471 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 
'registry.osism.tech/kolla/release/barbican-keystone-listener:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.172488 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-worker:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.172499 | orchestrator | changed: [testbed-node-2] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-api:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.172596 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-keystone-listener:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.172610 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-worker:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2025-03-22 
23:29:14.172630 | orchestrator | 2025-03-22 23:29:14.172640 | orchestrator | TASK [haproxy-config : Add configuration for barbican when using single external frontend] *** 2025-03-22 23:29:14.172651 | orchestrator | Saturday 22 March 2025 23:22:08 +0000 (0:00:06.360) 0:02:03.399 ******** 2025-03-22 23:29:14.172661 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-api:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.172679 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-keystone-listener:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.174160 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-worker:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.174257 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.174318 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-api:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': 
'9311', 'listen_port': '9311', 'tls_backend': 'no'}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.174399 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-keystone-listener:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.174417 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-worker:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.174432 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.174464 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-api:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.174480 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-keystone-listener:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.174495 | orchestrator | skipping: [testbed-node-2] => (item={'key': 
'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'environment': {'CS_AUTH_KEYS': ''}, 'image': 'registry.osism.tech/kolla/release/barbican-worker:18.0.1.20241206', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.174510 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.174525 | orchestrator | 2025-03-22 23:29:14.174577 | orchestrator | TASK [haproxy-config : Configuring firewall for barbican] ********************** 2025-03-22 23:29:14.174593 | orchestrator | Saturday 22 March 2025 23:22:10 +0000 (0:00:02.073) 0:02:05.473 ******** 2025-03-22 23:29:14.174607 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}})  2025-03-22 23:29:14.174622 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}})  2025-03-22 23:29:14.174637 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.174652 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}})  2025-03-22 23:29:14.174673 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}})  2025-03-22 23:29:14.174688 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.174702 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}})  2025-03-22 23:29:14.174716 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}})  2025-03-22 23:29:14.174730 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.174744 | orchestrator | 2025-03-22 23:29:14.174758 | orchestrator | TASK [proxysql-config : Copying over barbican ProxySQL users config] *********** 2025-03-22 23:29:14.174773 | orchestrator | Saturday 22 March 2025 23:22:11 +0000 (0:00:01.297) 0:02:06.770 ******** 2025-03-22 23:29:14.174786 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.174800 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.174815 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.174829 | orchestrator | 2025-03-22 23:29:14.174842 | orchestrator | TASK [proxysql-config : Copying over barbican ProxySQL rules config] *********** 2025-03-22 23:29:14.174856 | orchestrator | Saturday 22 March 2025 23:22:13 +0000 (0:00:01.755) 0:02:08.526 ******** 2025-03-22 23:29:14.174883 | orchestrator | changed: [testbed-node-0] 2025-03-22 
23:29:14.174898 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.174912 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.174926 | orchestrator | 2025-03-22 23:29:14.174941 | orchestrator | TASK [include_role : blazar] *************************************************** 2025-03-22 23:29:14.174955 | orchestrator | Saturday 22 March 2025 23:22:16 +0000 (0:00:03.419) 0:02:11.946 ******** 2025-03-22 23:29:14.174969 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.174982 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.174997 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.175011 | orchestrator | 2025-03-22 23:29:14.175032 | orchestrator | TASK [include_role : ceph-rgw] ************************************************* 2025-03-22 23:29:14.175046 | orchestrator | Saturday 22 March 2025 23:22:17 +0000 (0:00:00.357) 0:02:12.303 ******** 2025-03-22 23:29:14.175060 | orchestrator | included: ceph-rgw for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.175074 | orchestrator | 2025-03-22 23:29:14.175087 | orchestrator | TASK [haproxy-config : Copying over ceph-rgw haproxy config] ******************* 2025-03-22 23:29:14.175101 | orchestrator | Saturday 22 March 2025 23:22:18 +0000 (0:00:01.064) 0:02:13.368 ******** 2025-03-22 23:29:14.175130 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}}}}) 2025-03-22 23:29:14.175154 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}}}}) 2025-03-22 23:29:14.175169 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}, 'radosgw_external': 
{'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}}}}) 2025-03-22 23:29:14.175184 | orchestrator | 2025-03-22 23:29:14.175198 | orchestrator | TASK [haproxy-config : Add configuration for ceph-rgw when using single external frontend] *** 2025-03-22 23:29:14.175212 | orchestrator | Saturday 22 March 2025 23:22:22 +0000 (0:00:03.910) 0:02:17.278 ******** 2025-03-22 23:29:14.175235 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}}}})  2025-03-22 23:29:14.175250 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.175284 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}}}})  2025-03-22 23:29:14.175307 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.175321 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}}}})  2025-03-22 23:29:14.175336 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.175350 | orchestrator | 2025-03-22 23:29:14.175364 
| orchestrator | TASK [haproxy-config : Configuring firewall for ceph-rgw] ********************** 2025-03-22 23:29:14.175379 | orchestrator | Saturday 22 March 2025 23:22:24 +0000 (0:00:02.322) 0:02:19.601 ******** 2025-03-22 23:29:14.175394 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'radosgw', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}})  2025-03-22 23:29:14.175410 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'radosgw_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}})  2025-03-22 23:29:14.175425 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.175440 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'radosgw', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}})  2025-03-22 23:29:14.175454 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'radosgw_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}})  2025-03-22 23:29:14.175469 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.175483 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'radosgw', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}})  2025-03-22 23:29:14.175512 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'radosgw_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}})  2025-03-22 23:29:14.175554 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.175570 | orchestrator | 2025-03-22 23:29:14.175584 | orchestrator | TASK [proxysql-config : Copying over ceph-rgw ProxySQL users config] *********** 2025-03-22 23:29:14.175598 | orchestrator | Saturday 22 March 2025 23:22:27 +0000 (0:00:02.964) 0:02:22.566 ******** 2025-03-22 23:29:14.175612 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.175626 | orchestrator | 
skipping: [testbed-node-1] 2025-03-22 23:29:14.175640 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.175654 | orchestrator | 2025-03-22 23:29:14.175668 | orchestrator | TASK [proxysql-config : Copying over ceph-rgw ProxySQL rules config] *********** 2025-03-22 23:29:14.175683 | orchestrator | Saturday 22 March 2025 23:22:28 +0000 (0:00:00.954) 0:02:23.521 ******** 2025-03-22 23:29:14.175697 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.175710 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.175724 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.175739 | orchestrator | 2025-03-22 23:29:14.175753 | orchestrator | TASK [include_role : cinder] *************************************************** 2025-03-22 23:29:14.175767 | orchestrator | Saturday 22 March 2025 23:22:30 +0000 (0:00:01.874) 0:02:25.395 ******** 2025-03-22 23:29:14.175781 | orchestrator | included: cinder for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.175795 | orchestrator | 2025-03-22 23:29:14.175809 | orchestrator | TASK [haproxy-config : Copying over cinder haproxy config] ********************* 2025-03-22 23:29:14.175823 | orchestrator | Saturday 22 March 2025 23:22:31 +0000 (0:00:00.902) 0:02:26.298 ******** 2025-03-22 23:29:14.175838 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-api:24.2.1.20241206', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8776'], 'timeout': '30'}, 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.175853 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-scheduler:24.2.1.20241206', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.175880 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-volume:24.2.1.20241206', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', '', 'kolla_logs:/var/log/kolla/', '', 
'/opt/cinder-driver-dm-clone:/var/lib/kolla/venv/lib/python3.10/site-packages/cinder-driver-dm-clone'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.175910 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-backup:24.2.1.20241206', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.175926 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-api:24.2.1.20241206', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8776'], 'timeout': '30'}, 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.175941 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-scheduler:24.2.1.20241206', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.175956 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-volume:24.2.1.20241206', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', '', 'kolla_logs:/var/log/kolla/', '', '/opt/cinder-driver-dm-clone:/var/lib/kolla/venv/lib/python3.10/site-packages/cinder-driver-dm-clone'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.175980 | orchestrator | skipping: 
[testbed-node-1] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-backup:24.2.1.20241206', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.176008 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-api:24.2.1.20241206', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8776'], 'timeout': '30'}, 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.176024 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-scheduler:24.2.1.20241206', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.176039 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-volume:24.2.1.20241206', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', '', 'kolla_logs:/var/log/kolla/', '', '/opt/cinder-driver-dm-clone:/var/lib/kolla/venv/lib/python3.10/site-packages/cinder-driver-dm-clone'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.176067 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-backup:24.2.1.20241206', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.176082 | orchestrator | 2025-03-22 23:29:14.176097 | orchestrator | TASK [haproxy-config : Add configuration for cinder when using single external frontend] *** 2025-03-22 23:29:14.176111 | orchestrator | Saturday 22 March 2025 23:22:36 +0000 (0:00:05.669) 0:02:31.967 ******** 2025-03-22 23:29:14.176125 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-api:24.2.1.20241206', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8776'], 'timeout': '30'}, 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.176170 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-scheduler:24.2.1.20241206', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.176193 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-volume:24.2.1.20241206', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', '', 'kolla_logs:/var/log/kolla/', '', '/opt/cinder-driver-dm-clone:/var/lib/kolla/venv/lib/python3.10/site-packages/cinder-driver-dm-clone'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.176219 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-backup:24.2.1.20241206', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.176234 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.176249 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-api:24.2.1.20241206', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8776'], 'timeout': '30'}, 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.176264 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-scheduler:24.2.1.20241206', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.176300 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-volume:24.2.1.20241206', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', '', 'kolla_logs:/var/log/kolla/', '', '/opt/cinder-driver-dm-clone:/var/lib/kolla/venv/lib/python3.10/site-packages/cinder-driver-dm-clone'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.176331 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-backup:24.2.1.20241206', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': 
{'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.176347 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.176362 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-api:24.2.1.20241206', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8776'], 'timeout': '30'}, 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.176376 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-scheduler:24.2.1.20241206', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.176391 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-volume:24.2.1.20241206', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', '', 'kolla_logs:/var/log/kolla/', '', '/opt/cinder-driver-dm-clone:/var/lib/kolla/venv/lib/python3.10/site-packages/cinder-driver-dm-clone'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.176421 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/cinder-backup:24.2.1.20241206', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.176436 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.176450 | orchestrator | 
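[Editor's note] The cinder service definitions logged above (cinder-api, cinder-scheduler, cinder-volume, cinder-backup) each carry a 'healthcheck' dictionary with interval, retries, start_period, timeout and a CMD-SHELL test. As a minimal illustration only, and assuming the numeric values are meant as seconds (neither the mapping nor that assumption is confirmed by this job output), a dictionary of that shape could be translated into Docker-style health-check flags roughly like this; the helper name and flag mapping are hypothetical, not taken from kolla-ansible:

    # Hypothetical helper: turn a kolla-style healthcheck dict, as printed in the
    # log above, into Docker CLI health-check flags. The field-to-flag mapping and
    # the "values are seconds" assumption are illustrative only.
    def healthcheck_to_docker_flags(healthcheck):
        test = healthcheck.get("test", [])
        # e.g. ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'] -> shell command
        cmd = test[1] if len(test) > 1 and test[0] == "CMD-SHELL" else " ".join(test)
        return [
            "--health-cmd", cmd,
            "--health-interval", healthcheck.get("interval", "30") + "s",
            "--health-retries", str(healthcheck.get("retries", "3")),
            "--health-start-period", healthcheck.get("start_period", "5") + "s",
            "--health-timeout", healthcheck.get("timeout", "30") + "s",
        ]

    # Example using the cinder-backup values from the log:
    print(" ".join(healthcheck_to_docker_flags({
        "interval": "30", "retries": "3", "start_period": "5",
        "test": ["CMD-SHELL", "healthcheck_port cinder-backup 5672"],
        "timeout": "30",
    })))

Running the snippet prints the flag list a container engine could consume for the cinder_backup container; it is a sketch of how such a dict might be consumed, not the mechanism kolla-ansible actually uses.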
2025-03-22 23:29:14.176465 | orchestrator | TASK [haproxy-config : Configuring firewall for cinder] ************************ 2025-03-22 23:29:14.176484 | orchestrator | Saturday 22 March 2025 23:22:38 +0000 (0:00:01.284) 0:02:33.252 ******** 2025-03-22 23:29:14.176498 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}})  2025-03-22 23:29:14.176519 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}})  2025-03-22 23:29:14.176551 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.176567 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}})  2025-03-22 23:29:14.176581 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}})  2025-03-22 23:29:14.176595 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.176610 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}})  2025-03-22 23:29:14.176624 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}})  2025-03-22 23:29:14.176638 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.176652 | orchestrator | 2025-03-22 23:29:14.176666 | orchestrator | TASK [proxysql-config : Copying over cinder ProxySQL users config] ************* 2025-03-22 23:29:14.176680 | orchestrator | Saturday 22 March 2025 23:22:40 +0000 (0:00:02.408) 0:02:35.660 ******** 2025-03-22 23:29:14.176695 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.176709 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.176723 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.176736 | orchestrator | 2025-03-22 23:29:14.176750 | orchestrator | TASK [proxysql-config : Copying over cinder ProxySQL rules config] ************* 2025-03-22 23:29:14.176765 | orchestrator | Saturday 22 March 2025 23:22:42 +0000 (0:00:01.682) 0:02:37.342 ******** 2025-03-22 23:29:14.176779 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.176792 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.176806 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.176820 | orchestrator | 2025-03-22 23:29:14.176834 | orchestrator | TASK [include_role : cloudkitty] *********************************************** 2025-03-22 23:29:14.176856 | orchestrator | Saturday 22 March 2025 23:22:45 +0000 (0:00:02.917) 0:02:40.259 ******** 2025-03-22 23:29:14.176870 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.176884 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.176898 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.176912 | orchestrator | 2025-03-22 23:29:14.176925 | orchestrator | 
TASK [include_role : cyborg] *************************************************** 2025-03-22 23:29:14.176939 | orchestrator | Saturday 22 March 2025 23:22:45 +0000 (0:00:00.357) 0:02:40.617 ******** 2025-03-22 23:29:14.176953 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.176967 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.176987 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.177001 | orchestrator | 2025-03-22 23:29:14.177015 | orchestrator | TASK [include_role : designate] ************************************************ 2025-03-22 23:29:14.177029 | orchestrator | Saturday 22 March 2025 23:22:46 +0000 (0:00:00.580) 0:02:41.198 ******** 2025-03-22 23:29:14.177043 | orchestrator | included: designate for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.177057 | orchestrator | 2025-03-22 23:29:14.177071 | orchestrator | TASK [haproxy-config : Copying over designate haproxy config] ****************** 2025-03-22 23:29:14.177085 | orchestrator | Saturday 22 March 2025 23:22:47 +0000 (0:00:01.200) 0:02:42.398 ******** 2025-03-22 23:29:14.177100 | orchestrator | changed: [testbed-node-0] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-api:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}}}}) 2025-03-22 23:29:14.177121 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-backend-bind9:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2025-03-22 23:29:14.177136 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-central:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177151 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/designate-mdns:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177173 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-producer:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177188 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-worker:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177212 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/designate-sink:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177228 | orchestrator | changed: [testbed-node-1] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-api:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}}}}) 2025-03-22 23:29:14.177249 | orchestrator | changed: [testbed-node-2] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-api:18.0.1.20241206', 'volumes': 
['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}}}}) 2025-03-22 23:29:14.177273 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-backend-bind9:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2025-03-22 23:29:14.177295 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-central:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177310 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-backend-bind9:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2025-03-22 23:29:14.177325 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-central:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177339 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/designate-mdns:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177365 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-mdns:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177380 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-producer:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177429 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-producer:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177447 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-worker:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177462 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/designate-sink:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177477 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-worker:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177492 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/designate-sink:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177506 | orchestrator | 2025-03-22 23:29:14.177526 | orchestrator | TASK [haproxy-config : Add configuration for designate when using single external frontend] *** 2025-03-22 23:29:14.177574 | orchestrator | Saturday 22 March 2025 23:22:53 +0000 (0:00:06.216) 0:02:48.614 ******** 2025-03-22 23:29:14.177600 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-api:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}}}})  2025-03-22 23:29:14.177623 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-api:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}}}})  2025-03-22 23:29:14.177638 | orchestrator | skipping: [testbed-node-0] => (item={'key': 
'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-backend-bind9:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2025-03-22 23:29:14.177653 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-central:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177667 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-backend-bind9:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2025-03-22 23:29:14.177689 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-mdns:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177713 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-central:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177735 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-producer:18.0.1.20241206', 'volumes': 
['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177750 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-mdns:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177764 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-worker:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177779 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-producer:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177793 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/designate-sink:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177808 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.177838 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-worker:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port 
designate-worker 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177877 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/designate-sink:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177892 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.177907 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-api:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}}}})  2025-03-22 23:29:14.177921 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-backend-bind9:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2025-03-22 23:29:14.177935 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-central:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177959 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-mdns:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 
['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.177981 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-producer:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.178003 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/designate-worker:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.178051 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/release/designate-sink:18.0.1.20241206', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.178069 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.178084 | orchestrator | 2025-03-22 23:29:14.178098 | orchestrator | TASK [haproxy-config : Configuring firewall for designate] ********************* 2025-03-22 23:29:14.178113 | orchestrator | Saturday 22 March 2025 23:22:55 +0000 (0:00:01.848) 0:02:50.462 ******** 2025-03-22 23:29:14.178127 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}})  2025-03-22 23:29:14.178141 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}})  2025-03-22 23:29:14.178156 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.178170 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}})  2025-03-22 23:29:14.178184 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}})  2025-03-22 23:29:14.178198 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.178212 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate_api', 'value': {'enabled': 
'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}})  2025-03-22 23:29:14.178232 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}})  2025-03-22 23:29:14.178246 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.178260 | orchestrator | 2025-03-22 23:29:14.178274 | orchestrator | TASK [proxysql-config : Copying over designate ProxySQL users config] ********** 2025-03-22 23:29:14.178288 | orchestrator | Saturday 22 March 2025 23:22:57 +0000 (0:00:01.918) 0:02:52.381 ******** 2025-03-22 23:29:14.178302 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.178315 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.178329 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.178343 | orchestrator | 2025-03-22 23:29:14.178357 | orchestrator | TASK [proxysql-config : Copying over designate ProxySQL rules config] ********** 2025-03-22 23:29:14.178377 | orchestrator | Saturday 22 March 2025 23:22:59 +0000 (0:00:01.787) 0:02:54.169 ******** 2025-03-22 23:29:14.178392 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.178405 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.178419 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.178433 | orchestrator | 2025-03-22 23:29:14.178446 | orchestrator | TASK [include_role : etcd] ***************************************************** 2025-03-22 23:29:14.178460 | orchestrator | Saturday 22 March 2025 23:23:01 +0000 (0:00:02.612) 0:02:56.781 ******** 2025-03-22 23:29:14.178474 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.178487 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.178501 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.178515 | orchestrator | 2025-03-22 23:29:14.178549 | orchestrator | TASK [include_role : glance] *************************************************** 2025-03-22 23:29:14.178571 | orchestrator | Saturday 22 March 2025 23:23:02 +0000 (0:00:00.595) 0:02:57.377 ******** 2025-03-22 23:29:14.178586 | orchestrator | included: glance for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.178600 | orchestrator | 2025-03-22 23:29:14.178613 | orchestrator | TASK [haproxy-config : Copying over glance haproxy config] ********************* 2025-03-22 23:29:14.178627 | orchestrator | Saturday 22 March 2025 23:23:03 +0000 (0:00:01.405) 0:02:58.783 ******** 2025-03-22 23:29:14.178653 | orchestrator | changed: [testbed-node-1] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/glance-api:28.1.1.20241206', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', 'iscsi_info:/etc/iscsi', '/dev:/dev'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout 
client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}}) 2025-03-22 23:29:14.178669 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/glance-tls-proxy:28.1.1.20241206', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2025-03-22 23:29:14.178708 | orchestrator | changed: [testbed-node-0] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/glance-api:28.1.1.20241206', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', 'iscsi_info:/etc/iscsi', '/dev:/dev'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9292'], 'timeout': 
'30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}}) 2025-03-22 23:29:14.178726 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/glance-tls-proxy:28.1.1.20241206', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2025-03-22 23:29:14.178788 | orchestrator | changed: [testbed-node-2] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/glance-api:28.1.1.20241206', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', 'iscsi_info:/etc/iscsi', '/dev:/dev'], 'dimensions': {}, 'healthcheck': 
{'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}}) 2025-03-22 23:29:14.178805 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/glance-tls-proxy:28.1.1.20241206', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2025-03-22 23:29:14.178837 | orchestrator | 2025-03-22 23:29:14.178852 | orchestrator | TASK [haproxy-config : Add configuration for glance when using single external frontend] *** 2025-03-22 23:29:14.178866 | orchestrator | Saturday 22 March 2025 23:23:13 +0000 (0:00:09.771) 0:03:08.554 ******** 2025-03-22 23:29:14.178888 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/glance-api:28.1.1.20241206', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', 'iscsi_info:/etc/iscsi', '/dev:/dev'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}})  2025-03-22 23:29:14.178904 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/glance-tls-proxy:28.1.1.20241206', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2025-03-22 23:29:14.178935 | orchestrator | skipping: [testbed-node-2] 2025-03-22 
23:29:14.178958 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/glance-api:28.1.1.20241206', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', 'iscsi_info:/etc/iscsi', '/dev:/dev'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}})  2025-03-22 23:29:14.178973 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/glance-tls-proxy:28.1.1.20241206', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 
192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2025-03-22 23:29:14.178997 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.179012 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/glance-api:28.1.1.20241206', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', 'iscsi_info:/etc/iscsi', '/dev:/dev'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}})  2025-03-22 23:29:14.179051 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/glance-tls-proxy:28.1.1.20241206', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 
rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2025-03-22 23:29:14.179068 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.179082 | orchestrator | 2025-03-22 23:29:14.179096 | orchestrator | TASK [haproxy-config : Configuring firewall for glance] ************************ 2025-03-22 23:29:14.179209 | orchestrator | Saturday 22 March 2025 23:23:20 +0000 (0:00:06.911) 0:03:15.466 ******** 2025-03-22 23:29:14.179225 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2025-03-22 23:29:14.179263 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2025-03-22 23:29:14.179280 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.179295 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2025-03-22 23:29:14.179318 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2025-03-22 23:29:14.179333 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.179348 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 
rise 2 fall 5', '']}})  2025-03-22 23:29:14.179363 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2025-03-22 23:29:14.179378 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.179392 | orchestrator | 2025-03-22 23:29:14.179406 | orchestrator | TASK [proxysql-config : Copying over glance ProxySQL users config] ************* 2025-03-22 23:29:14.179425 | orchestrator | Saturday 22 March 2025 23:23:28 +0000 (0:00:08.512) 0:03:23.979 ******** 2025-03-22 23:29:14.179439 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.179453 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.179468 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.179482 | orchestrator | 2025-03-22 23:29:14.179496 | orchestrator | TASK [proxysql-config : Copying over glance ProxySQL rules config] ************* 2025-03-22 23:29:14.179510 | orchestrator | Saturday 22 March 2025 23:23:30 +0000 (0:00:01.750) 0:03:25.729 ******** 2025-03-22 23:29:14.179524 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.179557 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.179572 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.179586 | orchestrator | 2025-03-22 23:29:14.179607 | orchestrator | TASK [include_role : gnocchi] ************************************************** 2025-03-22 23:29:14.179621 | orchestrator | Saturday 22 March 2025 23:23:33 +0000 (0:00:02.585) 0:03:28.314 ******** 2025-03-22 23:29:14.179635 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.179649 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.179663 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.179677 | orchestrator | 2025-03-22 23:29:14.179691 | orchestrator | TASK [include_role : grafana] ************************************************** 2025-03-22 23:29:14.179705 | orchestrator | Saturday 22 March 2025 23:23:33 +0000 (0:00:00.539) 0:03:28.854 ******** 2025-03-22 23:29:14.179719 | orchestrator | included: grafana for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.179733 | orchestrator | 2025-03-22 23:29:14.179746 | orchestrator | TASK [haproxy-config : Copying over grafana haproxy config] ******************** 2025-03-22 23:29:14.179760 | orchestrator | Saturday 22 March 2025 23:23:35 +0000 (0:00:01.289) 0:03:30.143 ******** 2025-03-22 23:29:14.179774 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/grafana:11.4.0.20241206', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': 
'3000'}}}}) 2025-03-22 23:29:14.179790 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/grafana:11.4.0.20241206', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-03-22 23:29:14.179813 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/grafana:11.4.0.20241206', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-03-22 23:29:14.179828 | orchestrator | 2025-03-22 23:29:14.179842 | orchestrator | TASK [haproxy-config : Add configuration for grafana when using single external frontend] *** 2025-03-22 23:29:14.179856 | orchestrator | Saturday 22 March 2025 23:23:39 +0000 (0:00:04.371) 0:03:34.514 ******** 2025-03-22 23:29:14.179871 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/grafana:11.4.0.20241206', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})  2025-03-22 23:29:14.179893 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/grafana:11.4.0.20241206', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})  2025-03-22 23:29:14.179908 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.179922 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.179936 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/grafana:11.4.0.20241206', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})  2025-03-22 23:29:14.179950 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.179964 | orchestrator | 2025-03-22 23:29:14.179978 | orchestrator | TASK [haproxy-config : Configuring firewall for grafana] *********************** 2025-03-22 23:29:14.179992 | orchestrator | Saturday 22 March 2025 23:23:39 +0000 (0:00:00.478) 0:03:34.993 ******** 2025-03-22 23:29:14.180006 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana_server', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}})  2025-03-22 23:29:14.180020 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}})  2025-03-22 23:29:14.180034 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.180048 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana_server', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}})  2025-03-22 23:29:14.180062 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}})  2025-03-22 23:29:14.180076 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.180090 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana_server', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}})  2025-03-22 23:29:14.180118 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}})  2025-03-22 23:29:14.180133 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.180147 | orchestrator | 2025-03-22 23:29:14.180161 | orchestrator | TASK [proxysql-config : Copying over grafana ProxySQL users config] ************ 2025-03-22 23:29:14.180175 | orchestrator | Saturday 22 March 2025 23:23:41 +0000 (0:00:01.073) 0:03:36.066 ******** 2025-03-22 23:29:14.180189 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.180203 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.180217 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.180231 | orchestrator | 2025-03-22 23:29:14.180245 | orchestrator | TASK [proxysql-config : Copying over grafana ProxySQL rules config] ************ 2025-03-22 23:29:14.180270 | orchestrator | Saturday 22 March 2025 23:23:42 +0000 (0:00:01.338) 0:03:37.405 ******** 2025-03-22 23:29:14.180284 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.180298 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.180313 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.180327 | orchestrator | 
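Note on the haproxy-config loop items dumped above: each item carries the complete per-service load-balancer definition that the role consumes, including mode, internal/external port, extra frontend/backend options (such as the 6h client/server timeouts on the glance entries), and a custom_member_list naming the three testbed nodes. The sketch below is a minimal, hedged illustration of how one such item could map onto an HAProxy backend stanza; the dictionary literal is copied from the glance_api data in this log, while render_backend() and its output layout are hypothetical and are not the actual kolla-ansible Jinja2 template.

# Illustrative sketch only: reconstructs roughly what one loop item from the
# "Copying over glance haproxy config" task above could translate to.
# render_backend() is a hypothetical helper, not part of kolla-ansible.
glance_api = {
    "enabled": True,
    "mode": "http",
    "external": False,
    "port": "9292",
    "frontend_http_extra": ["timeout client 6h"],
    "backend_http_extra": ["timeout server 6h"],
    "custom_member_list": [
        "server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5",
        "server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5",
        "server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5",
        "",  # the loop data above carries a trailing empty element; skipped when rendering
    ],
}


def render_backend(name: str, svc: dict) -> str:
    """Render an HAProxy-style backend stanza from one haproxy-config item."""
    lines = [f"backend {name}_back", f"    mode {svc['mode']}"]
    lines += [f"    {opt}" for opt in svc.get("backend_http_extra", [])]
    lines += [f"    {member}" for member in svc["custom_member_list"] if member]
    return "\n".join(lines)


if __name__ == "__main__":
    print(render_backend("glance_api", glance_api))

The same structure repeats in the heat, horizon and keystone items logged below, differing only in ports (8004/8000, 80/443, 5000) and per-service extras such as the acme_client_back routing rules attached to the horizon frontends.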
2025-03-22 23:29:14.180341 | orchestrator | TASK [include_role : heat] ***************************************************** 2025-03-22 23:29:14.180355 | orchestrator | Saturday 22 March 2025 23:23:44 +0000 (0:00:02.520) 0:03:39.925 ******** 2025-03-22 23:29:14.180369 | orchestrator | included: heat for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.180383 | orchestrator | 2025-03-22 23:29:14.180397 | orchestrator | TASK [haproxy-config : Copying over heat haproxy config] *********************** 2025-03-22 23:29:14.180410 | orchestrator | Saturday 22 March 2025 23:23:46 +0000 (0:00:01.339) 0:03:41.265 ******** 2025-03-22 23:29:14.180425 | orchestrator | changed: [testbed-node-0] => (item={'key': 'heat-api', 'value': {'container_name': 'heat_api', 'group': 'heat-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-api:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8004'], 'timeout': '30'}, 'haproxy': {'heat_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}, 'heat_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.180440 | orchestrator | changed: [testbed-node-1] => (item={'key': 'heat-api', 'value': {'container_name': 'heat_api', 'group': 'heat-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-api:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8004'], 'timeout': '30'}, 'haproxy': {'heat_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}, 'heat_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.180455 | orchestrator | changed: [testbed-node-2] => (item={'key': 'heat-api', 'value': {'container_name': 'heat_api', 'group': 'heat-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-api:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8004'], 'timeout': '30'}, 'haproxy': {'heat_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}, 'heat_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.180478 | orchestrator | changed: [testbed-node-0] => (item={'key': 'heat-api-cfn', 'value': {'container_name': 'heat_api_cfn', 'group': 'heat-api-cfn', 
'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-api-cfn:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-api-cfn/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8000'], 'timeout': '30'}, 'haproxy': {'heat_api_cfn': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}, 'heat_api_cfn_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.180500 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'heat-engine', 'value': {'container_name': 'heat_engine', 'group': 'heat-engine', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-engine:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-engine/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port heat-engine 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.180515 | orchestrator | changed: [testbed-node-1] => (item={'key': 'heat-api-cfn', 'value': {'container_name': 'heat_api_cfn', 'group': 'heat-api-cfn', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-api-cfn:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-api-cfn/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8000'], 'timeout': '30'}, 'haproxy': {'heat_api_cfn': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}, 'heat_api_cfn_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.180546 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'heat-engine', 'value': {'container_name': 'heat_engine', 'group': 'heat-engine', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-engine:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-engine/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port heat-engine 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.180562 | orchestrator | changed: [testbed-node-2] => (item={'key': 'heat-api-cfn', 'value': {'container_name': 'heat_api_cfn', 'group': 'heat-api-cfn', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-api-cfn:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-api-cfn/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8000'], 'timeout': '30'}, 'haproxy': 
{'heat_api_cfn': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}, 'heat_api_cfn_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.180577 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'heat-engine', 'value': {'container_name': 'heat_engine', 'group': 'heat-engine', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-engine:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-engine/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port heat-engine 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.180598 | orchestrator | 2025-03-22 23:29:14.180618 | orchestrator | TASK [haproxy-config : Add configuration for heat when using single external frontend] *** 2025-03-22 23:29:14.180633 | orchestrator | Saturday 22 March 2025 23:23:54 +0000 (0:00:08.636) 0:03:49.901 ******** 2025-03-22 23:29:14.180647 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'heat-api', 'value': {'container_name': 'heat_api', 'group': 'heat-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-api:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8004'], 'timeout': '30'}, 'haproxy': {'heat_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}, 'heat_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.180662 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'heat-api-cfn', 'value': {'container_name': 'heat_api_cfn', 'group': 'heat-api-cfn', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-api-cfn:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-api-cfn/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8000'], 'timeout': '30'}, 'haproxy': {'heat_api_cfn': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}, 'heat_api_cfn_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.180677 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'heat-engine', 'value': {'container_name': 'heat_engine', 'group': 'heat-engine', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-engine:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-engine/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': 
{'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port heat-engine 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.180691 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.180705 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'heat-api', 'value': {'container_name': 'heat_api', 'group': 'heat-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-api:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8004'], 'timeout': '30'}, 'haproxy': {'heat_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}, 'heat_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.180726 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'heat-api-cfn', 'value': {'container_name': 'heat_api_cfn', 'group': 'heat-api-cfn', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-api-cfn:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-api-cfn/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8000'], 'timeout': '30'}, 'haproxy': {'heat_api_cfn': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}, 'heat_api_cfn_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.180768 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'heat-engine', 'value': {'container_name': 'heat_engine', 'group': 'heat-engine', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-engine:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-engine/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port heat-engine 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.180784 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.180798 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'heat-api', 'value': {'container_name': 'heat_api', 'group': 'heat-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-api:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8004'], 'timeout': '30'}, 'haproxy': {'heat_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}, 'heat_api_external': {'enabled': True, 'mode': 'http', 'external': True, 
'external_fqdn': 'api.testbed.osism.xyz', 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.180814 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'heat-api-cfn', 'value': {'container_name': 'heat_api_cfn', 'group': 'heat-api-cfn', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-api-cfn:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-api-cfn/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8000'], 'timeout': '30'}, 'haproxy': {'heat_api_cfn': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}, 'heat_api_cfn_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.180828 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'heat-engine', 'value': {'container_name': 'heat_engine', 'group': 'heat-engine', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/heat-engine:22.0.2.20241206', 'volumes': ['/etc/kolla/heat-engine/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port heat-engine 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.180843 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.180857 | orchestrator | 2025-03-22 23:29:14.180871 | orchestrator | TASK [haproxy-config : Configuring firewall for heat] ************************** 2025-03-22 23:29:14.180885 | orchestrator | Saturday 22 March 2025 23:23:56 +0000 (0:00:01.428) 0:03:51.330 ******** 2025-03-22 23:29:14.180899 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'heat_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}})  2025-03-22 23:29:14.180921 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'heat_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}})  2025-03-22 23:29:14.180936 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'heat_api_cfn', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}})  2025-03-22 23:29:14.180956 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'heat_api_cfn_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}})  2025-03-22 23:29:14.180971 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.180986 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'heat_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}})  2025-03-22 23:29:14.181000 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'heat_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8004', 
'listen_port': '8004', 'tls_backend': 'no'}})  2025-03-22 23:29:14.181019 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'heat_api_cfn', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}})  2025-03-22 23:29:14.181033 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'heat_api_cfn_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}})  2025-03-22 23:29:14.181048 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.181062 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'heat_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}})  2025-03-22 23:29:14.181076 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'heat_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8004', 'listen_port': '8004', 'tls_backend': 'no'}})  2025-03-22 23:29:14.181090 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'heat_api_cfn', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}})  2025-03-22 23:29:14.181104 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'heat_api_cfn_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8000', 'listen_port': '8000', 'tls_backend': 'no'}})  2025-03-22 23:29:14.181118 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.181132 | orchestrator | 2025-03-22 23:29:14.181146 | orchestrator | TASK [proxysql-config : Copying over heat ProxySQL users config] *************** 2025-03-22 23:29:14.181160 | orchestrator | Saturday 22 March 2025 23:23:57 +0000 (0:00:01.603) 0:03:52.934 ******** 2025-03-22 23:29:14.181174 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.181187 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.181206 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.181220 | orchestrator | 2025-03-22 23:29:14.181234 | orchestrator | TASK [proxysql-config : Copying over heat ProxySQL rules config] *************** 2025-03-22 23:29:14.181248 | orchestrator | Saturday 22 March 2025 23:23:59 +0000 (0:00:01.956) 0:03:54.891 ******** 2025-03-22 23:29:14.181262 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.181276 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.181289 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.181303 | orchestrator | 2025-03-22 23:29:14.181317 | orchestrator | TASK [include_role : horizon] ************************************************** 2025-03-22 23:29:14.181337 | orchestrator | Saturday 22 March 2025 23:24:02 +0000 (0:00:02.156) 0:03:57.047 ******** 2025-03-22 23:29:14.181355 | orchestrator | included: horizon for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.181370 | orchestrator | 2025-03-22 23:29:14.181384 | orchestrator | TASK [haproxy-config : Copying over horizon haproxy config] ******************** 2025-03-22 23:29:14.181397 | orchestrator | Saturday 22 March 2025 23:24:03 +0000 (0:00:01.053) 0:03:58.101 ******** 2025-03-22 23:29:14.181421 | orchestrator | changed: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': 
True, 'image': 'registry.osism.tech/kolla/release/horizon:24.0.1.20241206', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'yes', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2025-03-22 23:29:14.181438 | orchestrator | changed: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/horizon:24.0.1.20241206', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'yes', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg 
^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2025-03-22 23:29:14.181469 | orchestrator | changed: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/horizon:24.0.1.20241206', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'yes', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2025-03-22 23:29:14.181485 | orchestrator | 2025-03-22 23:29:14.181499 | orchestrator | TASK [haproxy-config : Add configuration for horizon when using single external frontend] *** 2025-03-22 23:29:14.181513 | orchestrator | Saturday 22 March 2025 23:24:08 +0000 (0:00:05.464) 0:04:03.565 ******** 2025-03-22 23:29:14.181528 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/horizon:24.0.1.20241206', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 
'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'yes', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2025-03-22 23:29:14.181608 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.181634 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/horizon:24.0.1.20241206', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'yes', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 
'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2025-03-22 23:29:14.181649 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.181664 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/horizon:24.0.1.20241206', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'yes', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2025-03-22 23:29:14.181687 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.181702 | orchestrator | 2025-03-22 23:29:14.181721 | orchestrator | TASK [haproxy-config : Configuring firewall for horizon] *********************** 2025-03-22 23:29:14.181736 | orchestrator | Saturday 22 March 2025 23:24:09 +0000 (0:00:01.305) 0:04:04.871 ******** 2025-03-22 23:29:14.181751 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg 
^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}})  2025-03-22 23:29:14.181766 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2025-03-22 23:29:14.181782 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}})  2025-03-22 23:29:14.181797 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon_external_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2025-03-22 23:29:14.181812 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'acme_client', 'value': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}})  2025-03-22 23:29:14.181826 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.181846 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}})  2025-03-22 23:29:14.181885 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2025-03-22 23:29:14.181901 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}})  2025-03-22 23:29:14.181915 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon_external_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2025-03-22 23:29:14.181929 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'acme_client', 'value': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}})  2025-03-22 23:29:14.181944 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.181958 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance 
roundrobin'], 'tls_backend': 'no'}})  2025-03-22 23:29:14.181973 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2025-03-22 23:29:14.181993 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}})  2025-03-22 23:29:14.182007 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon_external_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2025-03-22 23:29:14.182062 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'acme_client', 'value': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}})  2025-03-22 23:29:14.182080 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.182094 | orchestrator | 2025-03-22 23:29:14.182108 | orchestrator | TASK [proxysql-config : Copying over horizon ProxySQL users config] ************ 2025-03-22 23:29:14.182123 | orchestrator | Saturday 22 March 2025 23:24:11 +0000 (0:00:01.454) 0:04:06.326 ******** 2025-03-22 23:29:14.182137 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.182150 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.182163 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.182175 | orchestrator | 2025-03-22 23:29:14.182187 | orchestrator | TASK [proxysql-config : Copying over horizon ProxySQL rules config] ************ 2025-03-22 23:29:14.182200 | orchestrator | Saturday 22 March 2025 23:24:12 +0000 (0:00:01.617) 0:04:07.943 ******** 2025-03-22 23:29:14.182212 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.182225 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.182243 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.182256 | orchestrator | 2025-03-22 23:29:14.182269 | orchestrator | TASK [include_role : influxdb] ************************************************* 2025-03-22 23:29:14.182281 | orchestrator | Saturday 22 March 2025 23:24:15 +0000 (0:00:02.723) 0:04:10.666 ******** 2025-03-22 23:29:14.182294 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.182306 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.182318 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.182331 | orchestrator | 2025-03-22 23:29:14.182343 | orchestrator | TASK [include_role : ironic] *************************************************** 2025-03-22 23:29:14.182355 | orchestrator | Saturday 22 March 2025 23:24:16 +0000 (0:00:00.534) 0:04:11.200 ******** 2025-03-22 23:29:14.182368 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.182380 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.182392 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.182405 | orchestrator | 2025-03-22 23:29:14.182417 | orchestrator | TASK [include_role : keystone] 
************************************************* 2025-03-22 23:29:14.182430 | orchestrator | Saturday 22 March 2025 23:24:16 +0000 (0:00:00.349) 0:04:11.549 ******** 2025-03-22 23:29:14.182442 | orchestrator | included: keystone for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.182454 | orchestrator | 2025-03-22 23:29:14.182466 | orchestrator | TASK [haproxy-config : Copying over keystone haproxy config] ******************* 2025-03-22 23:29:14.182479 | orchestrator | Saturday 22 March 2025 23:24:18 +0000 (0:00:01.473) 0:04:13.023 ******** 2025-03-22 23:29:14.182492 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}}}}) 2025-03-22 23:29:14.182506 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone-ssh:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-03-22 23:29:14.182552 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone-fernet:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-03-22 23:29:14.182568 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}}}}) 2025-03-22 23:29:14.182588 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone-ssh:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-03-22 23:29:14.182602 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone-fernet:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-03-22 23:29:14.182616 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}}}}) 2025-03-22 23:29:14.182635 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone-ssh:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-03-22 23:29:14.182649 | 
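The keystone loop items above each carry a 'haproxy' mapping (keystone_internal and keystone_external) that the haproxy-config role renders into HAProxy frontend/backend sections. As a minimal sketch of how those fields could translate into an HAProxy "listen" block (not kolla-ansible's actual Jinja2 template), the following Python reuses the values printed in the log; the render function and the 192.168.16.9 VIP, inferred from the no_proxy lists elsewhere in this output, are assumptions for illustration only.

```python
# Illustrative sketch only: approximates how a "haproxy" service entry, as
# printed in the loop items above, could expand into an HAProxy "listen"
# section. The real haproxy-config role uses Jinja2 templates; the function
# below and the VIP address are assumptions made for this example.

def render_listen_block(name, svc, vip, members):
    """Return an approximate HAProxy 'listen' section for one service entry."""
    lines = [f"listen {name}"]
    lines.append(f"    mode {svc.get('mode', 'http')}")
    lines.append(f"    bind {vip}:{svc['listen_port']}")
    for extra in svc.get('backend_http_extra', []):
        lines.append(f"    {extra}")
    for host, addr in members:
        lines.append(f"    server {host} {addr}:{svc['port']} check inter 2000 rise 2 fall 5")
    return "\n".join(lines)


keystone_internal = {  # values copied from the task output above
    'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no',
    'port': '5000', 'listen_port': '5000',
    'backend_http_extra': ['balance "roundrobin"'],
}

members = [('testbed-node-0', '192.168.16.10'),
           ('testbed-node-1', '192.168.16.11'),
           ('testbed-node-2', '192.168.16.12')]

# 192.168.16.9 is assumed to be the internal VIP of this testbed (it shows
# up in the no_proxy lists later in the log); adjust for a real environment.
print(render_listen_block('keystone_internal', keystone_internal, '192.168.16.9', members))
```

The check parameters (inter 2000 rise 2 fall 5) mirror the ones visible in the mariadb custom_member_list further down and are likewise only an assumption here.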
orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone-fernet:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-03-22 23:29:14.182668 | orchestrator | 2025-03-22 23:29:14.182681 | orchestrator | TASK [haproxy-config : Add configuration for keystone when using single external frontend] *** 2025-03-22 23:29:14.182693 | orchestrator | Saturday 22 March 2025 23:24:23 +0000 (0:00:05.933) 0:04:18.957 ******** 2025-03-22 23:29:14.182706 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}}}})  2025-03-22 23:29:14.182720 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone-ssh:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-03-22 23:29:14.182733 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone-fernet:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-03-22 23:29:14.182745 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.182764 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 
'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}}}})  2025-03-22 23:29:14.182783 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone-ssh:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-03-22 23:29:14.182797 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone-fernet:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-03-22 23:29:14.182809 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.182822 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}}}})  2025-03-22 23:29:14.182836 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 
'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone-ssh:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-03-22 23:29:14.182849 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/keystone-fernet:25.0.1.20241206', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-03-22 23:29:14.182861 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.182874 | orchestrator | 2025-03-22 23:29:14.182886 | orchestrator | TASK [haproxy-config : Configuring firewall for keystone] ********************** 2025-03-22 23:29:14.182919 | orchestrator | Saturday 22 March 2025 23:24:25 +0000 (0:00:01.507) 0:04:20.465 ******** 2025-03-22 23:29:14.182937 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone_internal', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}})  2025-03-22 23:29:14.182954 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}})  2025-03-22 23:29:14.182967 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.182980 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone_internal', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}})  2025-03-22 23:29:14.182993 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}})  2025-03-22 23:29:14.183006 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.183018 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone_internal', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}})  2025-03-22 23:29:14.183031 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance "roundrobin"']}})  2025-03-22 23:29:14.183044 | orchestrator | skipping: [testbed-node-2] 2025-03-22 
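Nearly every item in the two tasks above is reported as "skipping": the role loops over each haproxy entry of each service on every host and only acts where its conditions hold, which suggests that the single-external-frontend option and firewall management by this role are disabled in this deployment. The sketch below mirrors that select-or-skip pattern; the function and flag names are made up for the example and do not reproduce kolla-ansible's actual `when:` clauses.

```python
# Rough illustration of the select-or-skip pattern behind the long
# "skipping:" runs above. Names here are invented for the example; the real
# conditionals live in kolla-ansible's haproxy-config role.

def select_entries(haproxy_map, want_external, feature_enabled=True):
    """Yield (name, entry) pairs a task would act on; everything else is skipped."""
    if not feature_enabled:  # e.g. the single-external-frontend variant is not in use
        return
    for name, entry in haproxy_map.items():
        enabled = str(entry.get('enabled', False)).lower() in ('true', 'yes')
        if enabled and bool(entry.get('external', False)) == want_external:
            yield name, entry


keystone_haproxy = {  # trimmed copy of the entries shown in the log
    'keystone_internal': {'enabled': True, 'external': False, 'port': '5000'},
    'keystone_external': {'enabled': True, 'external': True,
                          'external_fqdn': 'api.testbed.osism.xyz', 'port': '5000'},
}

# With the feature flag off, every item is skipped, matching the output above.
print(list(select_entries(keystone_haproxy, want_external=True, feature_enabled=False)))
# Enabled case: only the entry matching the requested frontend side is returned.
print([name for name, _ in select_entries(keystone_haproxy, want_external=False)])
```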
23:29:14.183056 | orchestrator | 2025-03-22 23:29:14.183068 | orchestrator | TASK [proxysql-config : Copying over keystone ProxySQL users config] *********** 2025-03-22 23:29:14.183081 | orchestrator | Saturday 22 March 2025 23:24:26 +0000 (0:00:01.200) 0:04:21.665 ******** 2025-03-22 23:29:14.183093 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.183105 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.183117 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.183129 | orchestrator | 2025-03-22 23:29:14.183142 | orchestrator | TASK [proxysql-config : Copying over keystone ProxySQL rules config] *********** 2025-03-22 23:29:14.183154 | orchestrator | Saturday 22 March 2025 23:24:28 +0000 (0:00:01.641) 0:04:23.307 ******** 2025-03-22 23:29:14.183166 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.183178 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.183191 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.183203 | orchestrator | 2025-03-22 23:29:14.183215 | orchestrator | TASK [include_role : letsencrypt] ********************************************** 2025-03-22 23:29:14.183227 | orchestrator | Saturday 22 March 2025 23:24:31 +0000 (0:00:02.883) 0:04:26.190 ******** 2025-03-22 23:29:14.183240 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.183252 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.183264 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.183276 | orchestrator | 2025-03-22 23:29:14.183288 | orchestrator | TASK [include_role : magnum] *************************************************** 2025-03-22 23:29:14.183300 | orchestrator | Saturday 22 March 2025 23:24:31 +0000 (0:00:00.427) 0:04:26.617 ******** 2025-03-22 23:29:14.183313 | orchestrator | included: magnum for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.183325 | orchestrator | 2025-03-22 23:29:14.183338 | orchestrator | TASK [haproxy-config : Copying over magnum haproxy config] ********************* 2025-03-22 23:29:14.183350 | orchestrator | Saturday 22 March 2025 23:24:33 +0000 (0:00:01.798) 0:04:28.415 ******** 2025-03-22 23:29:14.183373 | orchestrator | changed: [testbed-node-0] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/magnum-api:18.0.1.20241206', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}}}}) 2025-03-22 23:29:14.183392 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/magnum-conductor:18.0.1.20241206', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'volumes': 
['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.183406 | orchestrator | changed: [testbed-node-1] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/magnum-api:18.0.1.20241206', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}}}}) 2025-03-22 23:29:14.183420 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/magnum-conductor:18.0.1.20241206', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.183433 | orchestrator | changed: [testbed-node-2] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/magnum-api:18.0.1.20241206', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}}}}) 2025-03-22 23:29:14.183464 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/magnum-conductor:18.0.1.20241206', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.183478 | orchestrator | 2025-03-22 23:29:14.183491 | orchestrator | TASK [haproxy-config : Add configuration for magnum when using single external frontend] *** 2025-03-22 23:29:14.183503 | orchestrator | Saturday 22 March 2025 23:24:39 +0000 (0:00:05.926) 0:04:34.342 ******** 2025-03-22 23:29:14.183522 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/magnum-api:18.0.1.20241206', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}}}})  2025-03-22 23:29:14.183550 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/magnum-conductor:18.0.1.20241206', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.183564 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.183577 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/magnum-api:18.0.1.20241206', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}}}})  2025-03-22 23:29:14.183609 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/magnum-conductor:18.0.1.20241206', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.183623 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.183642 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/magnum-api:18.0.1.20241206', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}}}})  2025-03-22 23:29:14.183656 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/magnum-conductor:18.0.1.20241206', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.183669 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.183681 | orchestrator | 2025-03-22 23:29:14.183694 | orchestrator | TASK [haproxy-config : Configuring firewall for magnum] ************************ 2025-03-22 23:29:14.183706 | orchestrator | Saturday 22 March 2025 23:24:40 +0000 (0:00:01.350) 0:04:35.692 ******** 2025-03-22 23:29:14.183719 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}})  2025-03-22 23:29:14.183731 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}})  2025-03-22 23:29:14.183744 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}})  2025-03-22 23:29:14.183761 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.183774 | orchestrator | skipping: [testbed-node-0] => 
(item={'key': 'magnum_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}})  2025-03-22 23:29:14.183786 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.183798 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}})  2025-03-22 23:29:14.183817 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}})  2025-03-22 23:29:14.183830 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.183842 | orchestrator | 2025-03-22 23:29:14.183855 | orchestrator | TASK [proxysql-config : Copying over magnum ProxySQL users config] ************* 2025-03-22 23:29:14.183867 | orchestrator | Saturday 22 March 2025 23:24:42 +0000 (0:00:01.499) 0:04:37.192 ******** 2025-03-22 23:29:14.183879 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.183892 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.183904 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.183916 | orchestrator | 2025-03-22 23:29:14.183929 | orchestrator | TASK [proxysql-config : Copying over magnum ProxySQL rules config] ************* 2025-03-22 23:29:14.183941 | orchestrator | Saturday 22 March 2025 23:24:43 +0000 (0:00:01.545) 0:04:38.737 ******** 2025-03-22 23:29:14.183953 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.183965 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.183978 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.183990 | orchestrator | 2025-03-22 23:29:14.184002 | orchestrator | TASK [include_role : manila] *************************************************** 2025-03-22 23:29:14.184014 | orchestrator | Saturday 22 March 2025 23:24:46 +0000 (0:00:02.773) 0:04:41.511 ******** 2025-03-22 23:29:14.184027 | orchestrator | included: manila for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.184039 | orchestrator | 2025-03-22 23:29:14.184051 | orchestrator | TASK [haproxy-config : Copying over manila haproxy config] ********************* 2025-03-22 23:29:14.184064 | orchestrator | Saturday 22 March 2025 23:24:47 +0000 (0:00:01.321) 0:04:42.833 ******** 2025-03-22 23:29:14.184082 | orchestrator | changed: [testbed-node-2] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/release/manila-api:18.2.2.20241206', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}}}}) 2025-03-22 23:29:14.184104 | orchestrator | changed: [testbed-node-0] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 
'registry.osism.tech/kolla/release/manila-api:18.2.2.20241206', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}}}}) 2025-03-22 23:29:14.184117 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/release/manila-scheduler:18.2.2.20241206', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.184137 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/release/manila-scheduler:18.2.2.20241206', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.184151 | orchestrator | changed: [testbed-node-1] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/release/manila-api:18.2.2.20241206', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}}}}) 2025-03-22 23:29:14.184164 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/release/manila-share:18.2.2.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': 
'30'}}})  2025-03-22 23:29:14.184183 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/release/manila-data:18.2.2.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.184205 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/release/manila-share:18.2.2.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.184219 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/release/manila-scheduler:18.2.2.20241206', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.184238 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/release/manila-data:18.2.2.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.184251 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/release/manila-share:18.2.2.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.184264 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 
'registry.osism.tech/kolla/release/manila-data:18.2.2.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.184277 | orchestrator | 2025-03-22 23:29:14.184289 | orchestrator | TASK [haproxy-config : Add configuration for manila when using single external frontend] *** 2025-03-22 23:29:14.184302 | orchestrator | Saturday 22 March 2025 23:24:53 +0000 (0:00:05.345) 0:04:48.179 ******** 2025-03-22 23:29:14.184328 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/release/manila-api:18.2.2.20241206', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}}}})  2025-03-22 23:29:14.184341 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/release/manila-scheduler:18.2.2.20241206', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.184355 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/release/manila-share:18.2.2.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.184377 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/release/manila-data:18.2.2.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 
'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.184390 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.184411 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/release/manila-api:18.2.2.20241206', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}}}})  2025-03-22 23:29:14.184425 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/release/manila-scheduler:18.2.2.20241206', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.184443 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/release/manila-share:18.2.2.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.184457 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/release/manila-api:18.2.2.20241206', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}}}})  2025-03-22 23:29:14.184490 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/release/manila-data:18.2.2.20241206', 'enabled': True, 'privileged': 
True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.184504 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.184517 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/release/manila-scheduler:18.2.2.20241206', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.184546 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/release/manila-share:18.2.2.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.184569 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/release/manila-data:18.2.2.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.184583 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.184605 | orchestrator | 2025-03-22 23:29:14.184619 | orchestrator | TASK [haproxy-config : Configuring firewall for manila] ************************ 2025-03-22 23:29:14.184632 | orchestrator | Saturday 22 March 2025 23:24:54 +0000 (0:00:01.018) 0:04:49.197 ******** 2025-03-22 23:29:14.184645 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}})  2025-03-22 23:29:14.184664 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}})  2025-03-22 23:29:14.184677 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.184691 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}})  2025-03-22 
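The manila items above show the two container healthcheck styles used throughout this playbook: "healthcheck_curl http://192.168.16.10:8786" for the API container and "healthcheck_port manila-share 5672" for the worker containers, each with a 30-second interval and 3 retries. Those helpers are scripts shipped inside the kolla images; as a rough, non-authoritative stand-in for the two ideas they represent (an HTTP endpoint probe and a TCP reachability probe), here is a small Python sketch. The real scripts differ in detail, and the addresses below are simply the ones printed in the log, so the calls only succeed from inside the testbed network.

```python
# Rough Python stand-in for the two healthcheck styles listed above
# (healthcheck_curl and healthcheck_port). The actual kolla helper scripts
# behave differently in detail; this only shows what kind of probe each one
# represents.

import socket
import urllib.error
import urllib.request


def http_check(url, timeout=30):
    """HTTP probe: succeed if the endpoint answers at all."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True  # the API answered, even if with an error status
    except OSError:
        return False


def tcp_check(host, port, timeout=30):
    """TCP probe: succeed if a connection to host:port can be opened."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# Addresses taken from the manila items in the log; they are only reachable
# from within the testbed network.
print(http_check('http://192.168.16.10:8786'))
print(tcp_check('192.168.16.10', 5672))
```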
23:29:14.184703 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}})  2025-03-22 23:29:14.184723 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.184736 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}})  2025-03-22 23:29:14.184748 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}})  2025-03-22 23:29:14.184761 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.184773 | orchestrator | 2025-03-22 23:29:14.184786 | orchestrator | TASK [proxysql-config : Copying over manila ProxySQL users config] ************* 2025-03-22 23:29:14.184798 | orchestrator | Saturday 22 March 2025 23:24:55 +0000 (0:00:01.546) 0:04:50.744 ******** 2025-03-22 23:29:14.184810 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.184822 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.184835 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.184847 | orchestrator | 2025-03-22 23:29:14.184859 | orchestrator | TASK [proxysql-config : Copying over manila ProxySQL rules config] ************* 2025-03-22 23:29:14.184872 | orchestrator | Saturday 22 March 2025 23:24:57 +0000 (0:00:01.569) 0:04:52.314 ******** 2025-03-22 23:29:14.184884 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.184896 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.184908 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.184920 | orchestrator | 2025-03-22 23:29:14.184933 | orchestrator | TASK [include_role : mariadb] ************************************************** 2025-03-22 23:29:14.184945 | orchestrator | Saturday 22 March 2025 23:25:00 +0000 (0:00:03.068) 0:04:55.382 ******** 2025-03-22 23:29:14.184957 | orchestrator | included: mariadb for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.184969 | orchestrator | 2025-03-22 23:29:14.184982 | orchestrator | TASK [mariadb : Ensure mysql monitor user exist] ******************************* 2025-03-22 23:29:14.184994 | orchestrator | Saturday 22 March 2025 23:25:02 +0000 (0:00:01.696) 0:04:57.078 ******** 2025-03-22 23:29:14.185006 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0) 2025-03-22 23:29:14.185018 | orchestrator | 2025-03-22 23:29:14.185031 | orchestrator | TASK [haproxy-config : Copying over mariadb haproxy config] ******************** 2025-03-22 23:29:14.185043 | orchestrator | Saturday 22 March 2025 23:25:06 +0000 (0:00:04.040) 0:05:01.119 ******** 2025-03-22 23:29:14.185056 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/mariadb-server:10.11.10.20241206', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 
'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', 'option httpchk'], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-03-22 23:29:14.185090 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/mariadb-clustercheck:10.11.10.20241206', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}}})  2025-03-22 23:29:14.185104 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.185117 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/mariadb-server:10.11.10.20241206', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', 'option httpchk'], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 
'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-03-22 23:29:14.185139 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/mariadb-clustercheck:10.11.10.20241206', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}}})  2025-03-22 23:29:14.185152 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.185172 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/mariadb-server:10.11.10.20241206', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', 'option httpchk'], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-03-22 23:29:14.185192 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/mariadb-clustercheck:10.11.10.20241206', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 
'AVAILABLE_WHEN_DONOR': '1'}}})  2025-03-22 23:29:14.185205 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.185217 | orchestrator | 2025-03-22 23:29:14.185230 | orchestrator | TASK [haproxy-config : Add configuration for mariadb when using single external frontend] *** 2025-03-22 23:29:14.185242 | orchestrator | Saturday 22 March 2025 23:25:11 +0000 (0:00:04.970) 0:05:06.090 ******** 2025-03-22 23:29:14.185255 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/mariadb-server:10.11.10.20241206', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', 'option httpchk'], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-03-22 23:29:14.185307 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/mariadb-clustercheck:10.11.10.20241206', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}}})  2025-03-22 23:29:14.185322 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.185335 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/mariadb-server:10.11.10.20241206', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 
'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', 'option httpchk'], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-03-22 23:29:14.185348 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/mariadb-clustercheck:10.11.10.20241206', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}}})  2025-03-22 23:29:14.185361 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.185388 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/mariadb-server:10.11.10.20241206', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', 'option httpchk'], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option 
clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-03-22 23:29:14.185638 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/mariadb-clustercheck:10.11.10.20241206', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}}})  2025-03-22 23:29:14.185661 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.185672 | orchestrator | 2025-03-22 23:29:14.185682 | orchestrator | TASK [haproxy-config : Configuring firewall for mariadb] *********************** 2025-03-22 23:29:14.185692 | orchestrator | Saturday 22 March 2025 23:25:14 +0000 (0:00:03.864) 0:05:09.954 ******** 2025-03-22 23:29:14.185702 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', 'option httpchk'], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}})  2025-03-22 23:29:14.185714 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb_external_lb', 'value': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}})  2025-03-22 23:29:14.185725 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.185735 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', 'option httpchk'], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}})  2025-03-22 23:29:14.185753 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb_external_lb', 'value': {'enabled': False, 'mode': 'tcp', 
'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}})  2025-03-22 23:29:14.185764 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.185796 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', 'option httpchk'], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}})  2025-03-22 23:29:14.185809 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb_external_lb', 'value': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 4569 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 4569 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 4569 inter 2000 rise 2 fall 5 backup', '']}})  2025-03-22 23:29:14.185819 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.185829 | orchestrator | 2025-03-22 23:29:14.185840 | orchestrator | TASK [proxysql-config : Copying over mariadb ProxySQL users config] ************ 2025-03-22 23:29:14.185850 | orchestrator | Saturday 22 March 2025 23:25:18 +0000 (0:00:03.795) 0:05:13.749 ******** 2025-03-22 23:29:14.185860 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.185870 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.185880 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.185890 | orchestrator | 2025-03-22 23:29:14.185900 | orchestrator | TASK [proxysql-config : Copying over mariadb ProxySQL rules config] ************ 2025-03-22 23:29:14.185910 | orchestrator | Saturday 22 March 2025 23:25:21 +0000 (0:00:02.696) 0:05:16.446 ******** 2025-03-22 23:29:14.185920 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.185930 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.185940 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.185950 | orchestrator | 2025-03-22 23:29:14.185960 | orchestrator | TASK [include_role : masakari] ************************************************* 2025-03-22 23:29:14.185970 | orchestrator | Saturday 22 March 2025 23:25:23 +0000 (0:00:02.162) 0:05:18.609 ******** 2025-03-22 23:29:14.185980 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.185990 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.186000 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.186010 | orchestrator | 2025-03-22 23:29:14.186043 | orchestrator | TASK [include_role : memcached] ************************************************ 
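The mariadb items dumped above carry everything HAProxy needs for the Galera frontend: TCP mode on port 3306, keepalive plus 3600s timeouts on both sides, an httpchk probe against the clustercheck service on port 4569, and a custom_member_list in which testbed-node-1 and testbed-node-2 are marked as backup, so only one Galera node receives traffic at a time. As a rough illustration of how such a listen section could be assembled from exactly those values (the real file is rendered by the kolla-ansible haproxy-config templates; render_listen_block and the VIP placeholder below are invented for this sketch):

# Illustrative sketch only: build an HAProxy "listen" section for MariaDB from
# the values visible in the task items above. Not kolla's actual template.
def render_listen_block(name, vip, port, frontend_extra, backend_extra, members):
    lines = [f"listen {name}", f"    bind {vip}:{port}", "    mode tcp"]
    for opt in frontend_extra + backend_extra:
        lines.append(f"    {opt}")
    for member in members:
        if member.strip():  # the custom_member_list ends with an empty entry
            lines.append(f"    {member.strip()}")
    return "\n".join(lines)

if __name__ == "__main__":
    members = [
        "server testbed-node-0 192.168.16.10:3306 check port 4569 inter 2000 rise 2 fall 5",
        "server testbed-node-1 192.168.16.11:3306 check port 4569 inter 2000 rise 2 fall 5 backup",
        "server testbed-node-2 192.168.16.12:3306 check port 4569 inter 2000 rise 2 fall 5 backup",
        "",
    ]
    print(render_listen_block(
        name="mariadb",
        vip="<kolla_internal_vip_address>",  # placeholder; the VIP is not shown in this excerpt
        port="3306",
        frontend_extra=["option clitcpka", "timeout client 3600s"],
        backend_extra=["option srvtcpka", "timeout server 3600s", "option httpchk"],
        members=members,
    ))

The two proxysql-config tasks that follow in the log ("Copying over mariadb ProxySQL users config" and "... rules config") stage the per-service database credentials and query-routing rules that ProxySQL uses in front of the same Galera cluster.
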
2025-03-22 23:29:14.186055 | orchestrator | Saturday 22 March 2025 23:25:23 +0000 (0:00:00.350) 0:05:18.960 ******** 2025-03-22 23:29:14.186065 | orchestrator | included: memcached for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.186075 | orchestrator | 2025-03-22 23:29:14.186085 | orchestrator | TASK [haproxy-config : Copying over memcached haproxy config] ****************** 2025-03-22 23:29:14.186095 | orchestrator | Saturday 22 March 2025 23:25:25 +0000 (0:00:01.733) 0:05:20.693 ******** 2025-03-22 23:29:14.186112 | orchestrator | changed: [testbed-node-0] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release/memcached:1.6.14.20241206', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}}) 2025-03-22 23:29:14.186123 | orchestrator | changed: [testbed-node-1] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release/memcached:1.6.14.20241206', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}}) 2025-03-22 23:29:14.186140 | orchestrator | changed: [testbed-node-2] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release/memcached:1.6.14.20241206', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}}) 2025-03-22 23:29:14.186151 | orchestrator | 2025-03-22 23:29:14.186162 | orchestrator | TASK [haproxy-config : Add configuration for memcached when using single external frontend] *** 2025-03-22 23:29:14.186174 | orchestrator | Saturday 22 March 2025 23:25:27 +0000 (0:00:01.922) 0:05:22.616 ******** 2025-03-22 23:29:14.186185 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release/memcached:1.6.14.20241206', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})  2025-03-22 23:29:14.186196 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.186216 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release/memcached:1.6.14.20241206', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})  2025-03-22 23:29:14.186236 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.186249 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/release/memcached:1.6.14.20241206', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})  2025-03-22 23:29:14.186261 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.186272 | orchestrator | 2025-03-22 23:29:14.186282 | orchestrator | TASK [haproxy-config : Configuring firewall for memcached] ********************* 2025-03-22 23:29:14.186294 | orchestrator | Saturday 22 March 2025 23:25:28 +0000 (0:00:00.689) 0:05:23.305 ******** 2025-03-22 23:29:14.186305 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'memcached', 'value': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}})  2025-03-22 23:29:14.186317 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'memcached', 'value': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}})  2025-03-22 23:29:14.186329 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.186339 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.186350 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'memcached', 'value': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option 
clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}})  2025-03-22 23:29:14.186362 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.186373 | orchestrator | 2025-03-22 23:29:14.186388 | orchestrator | TASK [proxysql-config : Copying over memcached ProxySQL users config] ********** 2025-03-22 23:29:14.186400 | orchestrator | Saturday 22 March 2025 23:25:29 +0000 (0:00:00.955) 0:05:24.260 ******** 2025-03-22 23:29:14.186411 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.186422 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.186432 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.186443 | orchestrator | 2025-03-22 23:29:14.186454 | orchestrator | TASK [proxysql-config : Copying over memcached ProxySQL rules config] ********** 2025-03-22 23:29:14.186465 | orchestrator | Saturday 22 March 2025 23:25:30 +0000 (0:00:00.812) 0:05:25.072 ******** 2025-03-22 23:29:14.186476 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.186487 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.186498 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.186510 | orchestrator | 2025-03-22 23:29:14.186521 | orchestrator | TASK [include_role : mistral] ************************************************** 2025-03-22 23:29:14.186547 | orchestrator | Saturday 22 March 2025 23:25:31 +0000 (0:00:01.901) 0:05:26.974 ******** 2025-03-22 23:29:14.186558 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.186568 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.186578 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.186588 | orchestrator | 2025-03-22 23:29:14.186598 | orchestrator | TASK [include_role : neutron] ************************************************** 2025-03-22 23:29:14.186630 | orchestrator | Saturday 22 March 2025 23:25:32 +0000 (0:00:00.332) 0:05:27.306 ******** 2025-03-22 23:29:14.186641 | orchestrator | included: neutron for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.186652 | orchestrator | 2025-03-22 23:29:14.186662 | orchestrator | TASK [haproxy-config : Copying over neutron haproxy config] ******************** 2025-03-22 23:29:14.186672 | orchestrator | Saturday 22 March 2025 23:25:34 +0000 (0:00:01.813) 0:05:29.120 ******** 2025-03-22 23:29:14.186682 | orchestrator | changed: [testbed-node-0] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/release/neutron-server:24.0.2.20241206', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}}}}) 2025-03-22 23:29:14.186693 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 
'registry.osism.tech/kolla/release/neutron-openvswitch-agent:24.0.2.20241206', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.186705 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-linuxbridge-agent', 'value': {'container_name': 'neutron_linuxbridge_agent', 'image': 'registry.osism.tech/kolla/release/neutron-linuxbridge-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-linuxbridge-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-linuxbridge-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.186721 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-dhcp-agent', 'value': {'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/release/neutron-dhcp-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.186752 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-l3-agent', 'value': {'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/release/neutron-l3-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}}})  2025-03-22 23:29:14.186768 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/release/neutron-sriov-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': 
{'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.186783 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/release/neutron-mlnx-agent:24.0.2.20241206', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-22 23:29:14.186795 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/release/neutron-eswitchd:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-22 23:29:14.186806 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metadata-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-metadata-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.186821 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metadata-agent:24.0.2.20241206', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2025-03-22 23:29:14.186832 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/release/neutron-bgp-dragent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.186848 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 
'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/release/neutron-infoblox-ipam-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.186859 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metering-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-22 23:29:14.186870 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/release/ironic-neutron-agent:24.0.2.20241206', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.186888 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/neutron-tls-proxy:24.0.2.20241206', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2025-03-22 23:29:14.186899 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'index.docker.io/kolla/release/neutron-ovn-agent:24.0.2.20241206', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2025-03-22 23:29:14.186914 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-vpn-agent', 'value': {'container_name': 'neutron_ovn_vpn_agent', 'image': 
'index.docker.io/kolla/release/neutron-ovn-vpn-agent:24.0.2.20241206', 'enabled': False, 'privileged': True, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-ovn-vpn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port python 6642', '&&', 'healthcheck_port neutron-ovn-vpn-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.186930 | orchestrator | changed: [testbed-node-1] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/release/neutron-server:24.0.2.20241206', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}}}}) 2025-03-22 23:29:14.186948 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/release/neutron-openvswitch-agent:24.0.2.20241206', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.186959 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-linuxbridge-agent', 'value': {'container_name': 'neutron_linuxbridge_agent', 'image': 'registry.osism.tech/kolla/release/neutron-linuxbridge-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-linuxbridge-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-linuxbridge-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.186970 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-dhcp-agent', 'value': {'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/release/neutron-dhcp-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.186985 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-l3-agent', 'value': {'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/release/neutron-l3-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}}})  2025-03-22 23:29:14.187003 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/release/neutron-sriov-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187014 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/release/neutron-mlnx-agent:24.0.2.20241206', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-22 23:29:14.187024 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/release/neutron-eswitchd:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-22 23:29:14.187042 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metadata-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'healthcheck_port neutron-metadata-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187053 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metadata-agent:24.0.2.20241206', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2025-03-22 23:29:14.187067 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/release/neutron-bgp-dragent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187084 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/release/neutron-infoblox-ipam-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.187095 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metering-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-22 23:29:14.187105 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/release/ironic-neutron-agent:24.0.2.20241206', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187122 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 
'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/neutron-tls-proxy:24.0.2.20241206', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2025-03-22 23:29:14.187134 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'index.docker.io/kolla/release/neutron-ovn-agent:24.0.2.20241206', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2025-03-22 23:29:14.187148 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-vpn-agent', 'value': {'container_name': 'neutron_ovn_vpn_agent', 'image': 'index.docker.io/kolla/release/neutron-ovn-vpn-agent:24.0.2.20241206', 'enabled': False, 'privileged': True, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-ovn-vpn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port python 6642', '&&', 'healthcheck_port neutron-ovn-vpn-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187166 | orchestrator | changed: [testbed-node-2] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/release/neutron-server:24.0.2.20241206', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}}}}) 2025-03-22 23:29:14.187177 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/release/neutron-openvswitch-agent:24.0.2.20241206', 'enabled': False, 'privileged': True, 'environment': 
{'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187194 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-linuxbridge-agent', 'value': {'container_name': 'neutron_linuxbridge_agent', 'image': 'registry.osism.tech/kolla/release/neutron-linuxbridge-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-linuxbridge-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-linuxbridge-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187205 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-dhcp-agent', 'value': {'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/release/neutron-dhcp-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187216 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-l3-agent', 'value': {'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/release/neutron-l3-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}}})  2025-03-22 23:29:14.187239 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/release/neutron-sriov-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': 
'30'}}})  2025-03-22 23:29:14.187251 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/release/neutron-mlnx-agent:24.0.2.20241206', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-22 23:29:14.187261 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/release/neutron-eswitchd:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-22 23:29:14.187279 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metadata-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-metadata-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187290 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metadata-agent:24.0.2.20241206', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2025-03-22 23:29:14.187300 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/release/neutron-bgp-dragent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187321 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/release/neutron-infoblox-ipam-agent:24.0.2.20241206', 'privileged': True, 'enabled': 
False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.187332 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metering-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-22 23:29:14.187343 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/release/ironic-neutron-agent:24.0.2.20241206', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187360 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/neutron-tls-proxy:24.0.2.20241206', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2025-03-22 23:29:14.187371 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'index.docker.io/kolla/release/neutron-ovn-agent:24.0.2.20241206', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2025-03-22 23:29:14.187382 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-vpn-agent', 'value': {'container_name': 'neutron_ovn_vpn_agent', 'image': 'index.docker.io/kolla/release/neutron-ovn-vpn-agent:24.0.2.20241206', 'enabled': False, 'privileged': True, 'host_in_groups': True, 'volumes': 
['/etc/kolla/neutron-ovn-vpn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port python 6642', '&&', 'healthcheck_port neutron-ovn-vpn-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187399 | orchestrator | 2025-03-22 23:29:14.187409 | orchestrator | TASK [haproxy-config : Add configuration for neutron when using single external frontend] *** 2025-03-22 23:29:14.187419 | orchestrator | Saturday 22 March 2025 23:25:40 +0000 (0:00:06.434) 0:05:35.554 ******** 2025-03-22 23:29:14.187434 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/release/neutron-server:24.0.2.20241206', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}}}})  2025-03-22 23:29:14.187452 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/release/neutron-openvswitch-agent:24.0.2.20241206', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187463 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-linuxbridge-agent', 'value': {'container_name': 'neutron_linuxbridge_agent', 'image': 'registry.osism.tech/kolla/release/neutron-linuxbridge-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-linuxbridge-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-linuxbridge-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187474 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-dhcp-agent', 'value': {'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/release/neutron-dhcp-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 
'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187484 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-l3-agent', 'value': {'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/release/neutron-l3-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}}})  2025-03-22 23:29:14.187504 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/release/neutron-sriov-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187515 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/release/neutron-mlnx-agent:24.0.2.20241206', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-22 23:29:14.187525 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/release/neutron-eswitchd:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-22 23:29:14.187559 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metadata-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', 
''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-metadata-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187570 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/release/neutron-server:24.0.2.20241206', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}}}})  2025-03-22 23:29:14.187581 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metadata-agent:24.0.2.20241206', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2025-03-22 23:29:14.187601 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/release/neutron-openvswitch-agent:24.0.2.20241206', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187612 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/release/neutron-bgp-dragent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187629 | orchestrator | skipping: [testbed-node-1] => (item={'key': 
'neutron-linuxbridge-agent', 'value': {'container_name': 'neutron_linuxbridge_agent', 'image': 'registry.osism.tech/kolla/release/neutron-linuxbridge-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-linuxbridge-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-linuxbridge-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187641 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-dhcp-agent', 'value': {'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/release/neutron-dhcp-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187651 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/release/neutron-infoblox-ipam-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.187666 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metering-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-22 23:29:14.187681 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-l3-agent', 'value': {'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/release/neutron-l3-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}}})  2025-03-22 23:29:14.187692 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ironic-neutron-agent', 'value': 
{'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/release/ironic-neutron-agent:24.0.2.20241206', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187703 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/release/neutron-sriov-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187721 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/neutron-tls-proxy:24.0.2.20241206', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2025-03-22 23:29:14.187732 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/release/neutron-mlnx-agent:24.0.2.20241206', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-22 23:29:14.187747 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'index.docker.io/kolla/release/neutron-ovn-agent:24.0.2.20241206', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2025-03-22 23:29:14.187763 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-vpn-agent', 'value': 
{'container_name': 'neutron_ovn_vpn_agent', 'image': 'index.docker.io/kolla/release/neutron-ovn-vpn-agent:24.0.2.20241206', 'enabled': False, 'privileged': True, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-ovn-vpn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port python 6642', '&&', 'healthcheck_port neutron-ovn-vpn-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187869 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/release/neutron-eswitchd:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-22 23:29:14.187884 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.187905 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metadata-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-metadata-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187916 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metadata-agent:24.0.2.20241206', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2025-03-22 23:29:14.187927 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/release/neutron-bgp-dragent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.187944 | orchestrator | skipping: [testbed-node-1] => (item={'key': 
'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/release/neutron-infoblox-ipam-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.187955 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metering-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-22 23:29:14.188024 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/release/ironic-neutron-agent:24.0.2.20241206', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.188052 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/neutron-tls-proxy:24.0.2.20241206', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2025-03-22 23:29:14.188064 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'index.docker.io/kolla/release/neutron-ovn-agent:24.0.2.20241206', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2025-03-22 23:29:14.188075 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-vpn-agent', 'value': 
{'container_name': 'neutron_ovn_vpn_agent', 'image': 'index.docker.io/kolla/release/neutron-ovn-vpn-agent:24.0.2.20241206', 'enabled': False, 'privileged': True, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-ovn-vpn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port python 6642', '&&', 'healthcheck_port neutron-ovn-vpn-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.188091 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.188102 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/release/neutron-server:24.0.2.20241206', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}}}})  2025-03-22 23:29:14.188190 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/release/neutron-openvswitch-agent:24.0.2.20241206', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.188208 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-linuxbridge-agent', 'value': {'container_name': 'neutron_linuxbridge_agent', 'image': 'registry.osism.tech/kolla/release/neutron-linuxbridge-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-linuxbridge-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-linuxbridge-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.188219 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-dhcp-agent', 'value': {'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/release/neutron-dhcp-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 
'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.188230 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-l3-agent', 'value': {'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/release/neutron-l3-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}}})  2025-03-22 23:29:14.188247 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/release/neutron-sriov-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.188258 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/release/neutron-mlnx-agent:24.0.2.20241206', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-22 23:29:14.188320 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/release/neutron-eswitchd:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-22 23:29:14.188344 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metadata-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 
'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-metadata-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.188356 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metadata-agent:24.0.2.20241206', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2025-03-22 23:29:14.188366 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/release/neutron-bgp-dragent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.188385 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/release/neutron-infoblox-ipam-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.188396 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/release/neutron-metering-agent:24.0.2.20241206', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-03-22 23:29:14.188407 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/release/ironic-neutron-agent:24.0.2.20241206', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.188489 | orchestrator | skipping: 
[testbed-node-2] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/neutron-tls-proxy:24.0.2.20241206', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2025-03-22 23:29:14.188507 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'index.docker.io/kolla/release/neutron-ovn-agent:24.0.2.20241206', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2025-03-22 23:29:14.188518 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-vpn-agent', 'value': {'container_name': 'neutron_ovn_vpn_agent', 'image': 'index.docker.io/kolla/release/neutron-ovn-vpn-agent:24.0.2.20241206', 'enabled': False, 'privileged': True, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-ovn-vpn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port python 6642', '&&', 'healthcheck_port neutron-ovn-vpn-agent 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.188585 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.188598 | orchestrator | 2025-03-22 23:29:14.188608 | orchestrator | TASK [haproxy-config : Configuring firewall for neutron] *********************** 2025-03-22 23:29:14.188619 | orchestrator | Saturday 22 March 2025 23:25:42 +0000 (0:00:02.230) 0:05:37.784 ******** 2025-03-22 23:29:14.188629 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}})  2025-03-22 23:29:14.188639 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}})  2025-03-22 23:29:14.188650 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.188664 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}})  2025-03-22 23:29:14.188675 | orchestrator | skipping: [testbed-node-1] => 
(item={'key': 'neutron_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}})  2025-03-22 23:29:14.188685 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.188696 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}})  2025-03-22 23:29:14.188706 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}})  2025-03-22 23:29:14.188716 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.188726 | orchestrator | 2025-03-22 23:29:14.188737 | orchestrator | TASK [proxysql-config : Copying over neutron ProxySQL users config] ************ 2025-03-22 23:29:14.188766 | orchestrator | Saturday 22 March 2025 23:25:45 +0000 (0:00:02.646) 0:05:40.431 ******** 2025-03-22 23:29:14.188777 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.188788 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.188830 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.188843 | orchestrator | 2025-03-22 23:29:14.188853 | orchestrator | TASK [proxysql-config : Copying over neutron ProxySQL rules config] ************ 2025-03-22 23:29:14.188863 | orchestrator | Saturday 22 March 2025 23:25:47 +0000 (0:00:01.596) 0:05:42.027 ******** 2025-03-22 23:29:14.188873 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.188884 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.188894 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.188904 | orchestrator | 2025-03-22 23:29:14.188914 | orchestrator | TASK [include_role : placement] ************************************************ 2025-03-22 23:29:14.188924 | orchestrator | Saturday 22 March 2025 23:25:49 +0000 (0:00:02.890) 0:05:44.918 ******** 2025-03-22 23:29:14.188934 | orchestrator | included: placement for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.188944 | orchestrator | 2025-03-22 23:29:14.188954 | orchestrator | TASK [haproxy-config : Copying over placement haproxy config] ****************** 2025-03-22 23:29:14.188965 | orchestrator | Saturday 22 March 2025 23:25:51 +0000 (0:00:01.828) 0:05:46.746 ******** 2025-03-22 23:29:14.188975 | orchestrator | changed: [testbed-node-0] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/release/placement-api:11.0.0.20241206', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8780'], 'timeout': '30'}, 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.189013 | orchestrator | changed: [testbed-node-2] => (item={'key': 'placement-api', 'value': 
{'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/release/placement-api:11.0.0.20241206', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8780'], 'timeout': '30'}, 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.189025 | orchestrator | changed: [testbed-node-1] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/release/placement-api:11.0.0.20241206', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8780'], 'timeout': '30'}, 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.189036 | orchestrator | 2025-03-22 23:29:14.189046 | orchestrator | TASK [haproxy-config : Add configuration for placement when using single external frontend] *** 2025-03-22 23:29:14.189057 | orchestrator | Saturday 22 March 2025 23:25:57 +0000 (0:00:05.395) 0:05:52.141 ******** 2025-03-22 23:29:14.189092 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/release/placement-api:11.0.0.20241206', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8780'], 'timeout': '30'}, 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.189105 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.189115 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/release/placement-api:11.0.0.20241206', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 
'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8780'], 'timeout': '30'}, 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.189129 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.189146 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/release/placement-api:11.0.0.20241206', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8780'], 'timeout': '30'}, 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.189156 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.189165 | orchestrator | 2025-03-22 23:29:14.189175 | orchestrator | TASK [haproxy-config : Configuring firewall for placement] ********************* 2025-03-22 23:29:14.189184 | orchestrator | Saturday 22 March 2025 23:25:57 +0000 (0:00:00.534) 0:05:52.676 ******** 2025-03-22 23:29:14.189193 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'placement_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}})  2025-03-22 23:29:14.189203 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'placement_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}})  2025-03-22 23:29:14.189213 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.189223 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'placement_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}})  2025-03-22 23:29:14.189232 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'placement_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}})  2025-03-22 23:29:14.189242 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.189252 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'placement_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}})  2025-03-22 23:29:14.189261 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'placement_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}})  
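For reference, the item dumps printed by these haproxy-config tasks all share the same kolla-ansible service-definition shape: a service key mapped to a dict with container, volume, and healthcheck settings plus an optional 'haproxy' section of frontends, and an item is skipped whenever the service is disabled or declares no frontends on that host. The following is a minimal Python sketch of that filtering pattern only, under assumed, trimmed-down data; the function name wants_haproxy_config and the shortened dicts are illustrative, not the actual role logic or variables.

# Hypothetical sketch of the skip/configure decision seen in the loop output above.
services = {
    "placement-api": {
        "container_name": "placement_api",
        "enabled": True,
        "haproxy": {
            "placement_api": {"enabled": True, "mode": "http", "port": "8780", "tls_backend": "no"},
            "placement_api_external": {"enabled": True, "mode": "http", "external": True, "port": "8780"},
        },
    },
    "nova-scheduler": {
        "container_name": "nova_scheduler",
        "enabled": True,
        # no 'haproxy' section -> nothing to template, so the loop item is skipped
    },
}

def wants_haproxy_config(service):
    """True when the service is enabled and defines at least one haproxy frontend."""
    return service.get("enabled") in (True, "yes") and bool(service.get("haproxy"))

for name, svc in services.items():
    if wants_haproxy_config(svc):
        print(f"configuring frontends for {name}: {sorted(svc['haproxy'])}")
    else:
        print(f"skipping: {name}")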
2025-03-22 23:29:14.189271 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.189280 | orchestrator | 2025-03-22 23:29:14.189289 | orchestrator | TASK [proxysql-config : Copying over placement ProxySQL users config] ********** 2025-03-22 23:29:14.189321 | orchestrator | Saturday 22 March 2025 23:25:59 +0000 (0:00:01.424) 0:05:54.101 ******** 2025-03-22 23:29:14.189333 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.189342 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.189351 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.189361 | orchestrator | 2025-03-22 23:29:14.189371 | orchestrator | TASK [proxysql-config : Copying over placement ProxySQL rules config] ********** 2025-03-22 23:29:14.189380 | orchestrator | Saturday 22 March 2025 23:26:00 +0000 (0:00:01.550) 0:05:55.652 ******** 2025-03-22 23:29:14.189390 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.189399 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.189409 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.189418 | orchestrator | 2025-03-22 23:29:14.189427 | orchestrator | TASK [include_role : nova] ***************************************************** 2025-03-22 23:29:14.189437 | orchestrator | Saturday 22 March 2025 23:26:03 +0000 (0:00:02.783) 0:05:58.436 ******** 2025-03-22 23:29:14.189446 | orchestrator | included: nova for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.189456 | orchestrator | 2025-03-22 23:29:14.189464 | orchestrator | TASK [haproxy-config : Copying over nova haproxy config] *********************** 2025-03-22 23:29:14.189473 | orchestrator | Saturday 22 March 2025 23:26:05 +0000 (0:00:01.942) 0:06:00.379 ******** 2025-03-22 23:29:14.189482 | orchestrator | changed: [testbed-node-1] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/release/nova-api:29.2.1.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8774 '], 'timeout': '30'}, 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.189491 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/release/nova-scheduler:29.2.1.20241206', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 
'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.189500 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/nova-super-conductor:29.2.1.20241206', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.189527 | orchestrator | changed: [testbed-node-0] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/release/nova-api:29.2.1.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8774 '], 'timeout': '30'}, 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.189582 | orchestrator | changed: [testbed-node-2] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/release/nova-api:29.2.1.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8774 '], 'timeout': '30'}, 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.189593 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 
'registry.osism.tech/kolla/release/nova-scheduler:29.2.1.20241206', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.189602 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/release/nova-scheduler:29.2.1.20241206', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.189611 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/nova-super-conductor:29.2.1.20241206', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.189645 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/nova-super-conductor:29.2.1.20241206', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.189656 | orchestrator | 2025-03-22 23:29:14.189665 | orchestrator | TASK [haproxy-config : Add configuration for nova when using single external frontend] *** 2025-03-22 23:29:14.189674 | orchestrator | Saturday 22 March 2025 23:26:12 +0000 (0:00:06.775) 0:06:07.154 ******** 2025-03-22 23:29:14.189690 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/release/nova-api:29.2.1.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8774 '], 'timeout': '30'}, 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 
'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.189699 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/release/nova-scheduler:29.2.1.20241206', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.189709 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/nova-super-conductor:29.2.1.20241206', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.189717 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.189727 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/release/nova-api:29.2.1.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8774 '], 'timeout': '30'}, 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.189765 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/release/nova-scheduler:29.2.1.20241206', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 
'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.189777 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/nova-super-conductor:29.2.1.20241206', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.189786 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.189795 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/release/nova-api:29.2.1.20241206', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8774 '], 'timeout': '30'}, 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.189804 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/release/nova-scheduler:29.2.1.20241206', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.189818 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/release/nova-super-conductor:29.2.1.20241206', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.189828 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.189836 | orchestrator | 2025-03-22 23:29:14.189845 | orchestrator | 
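The item dictionaries printed by the haproxy-config tasks above are the kolla-ansible service definitions for nova; only entries whose 'haproxy' sub-mapping is enabled end up as load-balancer listeners, which is why nova_metadata_external (enabled: 'no') produces no frontend while nova_api and nova_api_external do. The short Python sketch below is an illustration only, not kolla-ansible source: it walks the nova-api 'haproxy' mapping exactly as it appears in the task output and summarizes the internal and external frontends it implies. The internal VIP address does not appear in this part of the log, so a placeholder is used for it.

# Illustration only: summarize the HAProxy frontends implied by the
# 'haproxy' mapping from the nova-api item in the task output above.
# The values are copied verbatim from the log. kolla-ansible mixes
# booleans and 'yes'/'no' strings for these flags, so both are handled.

haproxy = {
    "nova_api": {"enabled": True, "mode": "http", "external": False,
                 "port": "8774", "listen_port": "8774", "tls_backend": "no"},
    "nova_api_external": {"enabled": True, "mode": "http", "external": True,
                          "external_fqdn": "api.testbed.osism.xyz",
                          "port": "8774", "listen_port": "8774", "tls_backend": "no"},
    "nova_metadata": {"enabled": True, "mode": "http", "external": False,
                      "port": "8775", "listen_port": "8775", "tls_backend": "no"},
    "nova_metadata_external": {"enabled": "no", "mode": "http", "external": True,
                               "external_fqdn": "api.testbed.osism.xyz",
                               "port": "8775", "listen_port": "8775", "tls_backend": "no"},
}

def is_enabled(value):
    # Treat True, 'yes', and 'true' as enabled; everything else as disabled.
    return value is True or str(value).lower() in ("yes", "true")

for name, svc in haproxy.items():
    if not is_enabled(svc["enabled"]):
        print(f"{name}: no listener (disabled)")
        continue
    scope = "external" if svc["external"] else "internal"
    # The internal VIP is not shown in this section of the log; placeholder only.
    target = svc.get("external_fqdn", "<internal VIP>")
    print(f"{name}: {scope} {svc['mode']} listener on {target}:{svc['listen_port']}")

Run against the mapping above, this prints three listeners (nova_api, nova_api_external, nova_metadata) and reports nova_metadata_external as disabled, matching the changed/skipping pattern recorded in the task output.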
TASK [haproxy-config : Configuring firewall for nova] ************************** 2025-03-22 23:29:14.189854 | orchestrator | Saturday 22 March 2025 23:26:13 +0000 (0:00:01.489) 0:06:08.644 ******** 2025-03-22 23:29:14.189863 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}})  2025-03-22 23:29:14.189889 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}})  2025-03-22 23:29:14.189900 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_metadata', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}})  2025-03-22 23:29:14.189909 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_metadata_external', 'value': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}})  2025-03-22 23:29:14.189918 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.189927 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}})  2025-03-22 23:29:14.189936 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}})  2025-03-22 23:29:14.189945 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_metadata', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}})  2025-03-22 23:29:14.189953 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_metadata_external', 'value': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}})  2025-03-22 23:29:14.189962 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.189971 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}})  2025-03-22 23:29:14.189980 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}})  2025-03-22 23:29:14.189989 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_metadata', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}})  2025-03-22 23:29:14.189998 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_metadata_external', 'value': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}})  2025-03-22 23:29:14.190012 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.190043 | orchestrator | 2025-03-22 23:29:14.190052 | orchestrator | TASK [proxysql-config : 
Copying over nova ProxySQL users config] *************** 2025-03-22 23:29:14.190061 | orchestrator | Saturday 22 March 2025 23:26:15 +0000 (0:00:01.744) 0:06:10.388 ******** 2025-03-22 23:29:14.190070 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.190078 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.190087 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.190095 | orchestrator | 2025-03-22 23:29:14.190104 | orchestrator | TASK [proxysql-config : Copying over nova ProxySQL rules config] *************** 2025-03-22 23:29:14.190112 | orchestrator | Saturday 22 March 2025 23:26:17 +0000 (0:00:01.651) 0:06:12.040 ******** 2025-03-22 23:29:14.190121 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.190129 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.190138 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.190146 | orchestrator | 2025-03-22 23:29:14.190154 | orchestrator | TASK [include_role : nova-cell] ************************************************ 2025-03-22 23:29:14.190163 | orchestrator | Saturday 22 March 2025 23:26:19 +0000 (0:00:02.851) 0:06:14.891 ******** 2025-03-22 23:29:14.190171 | orchestrator | included: nova-cell for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.190180 | orchestrator | 2025-03-22 23:29:14.190188 | orchestrator | TASK [nova-cell : Configure loadbalancer for nova-novncproxy] ****************** 2025-03-22 23:29:14.190197 | orchestrator | Saturday 22 March 2025 23:26:21 +0000 (0:00:01.978) 0:06:16.869 ******** 2025-03-22 23:29:14.190205 | orchestrator | included: /ansible/roles/nova-cell/tasks/cell_proxy_loadbalancer.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item=nova-novncproxy) 2025-03-22 23:29:14.190214 | orchestrator | 2025-03-22 23:29:14.190226 | orchestrator | TASK [haproxy-config : Copying over nova-cell:nova-novncproxy haproxy config] *** 2025-03-22 23:29:14.190235 | orchestrator | Saturday 22 March 2025 23:26:23 +0000 (0:00:01.546) 0:06:18.416 ******** 2025-03-22 23:29:14.190264 | orchestrator | changed: [testbed-node-0] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}}) 2025-03-22 23:29:14.190279 | orchestrator | changed: [testbed-node-1] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}}) 2025-03-22 23:29:14.190292 | orchestrator | changed: [testbed-node-2] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 
'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}}) 2025-03-22 23:29:14.190301 | orchestrator | 2025-03-22 23:29:14.190310 | orchestrator | TASK [haproxy-config : Add configuration for nova-cell:nova-novncproxy when using single external frontend] *** 2025-03-22 23:29:14.190319 | orchestrator | Saturday 22 March 2025 23:26:30 +0000 (0:00:06.806) 0:06:25.223 ******** 2025-03-22 23:29:14.190328 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-03-22 23:29:14.190342 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.190351 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-03-22 23:29:14.190360 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.190369 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-03-22 23:29:14.190378 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.190387 | orchestrator | 2025-03-22 23:29:14.190395 | orchestrator | TASK [haproxy-config : Configuring firewall for nova-cell:nova-novncproxy] ***** 2025-03-22 23:29:14.190404 | orchestrator | Saturday 22 March 2025 23:26:32 +0000 (0:00:02.297) 0:06:27.520 ******** 2025-03-22 23:29:14.190413 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_novncproxy', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2025-03-22 23:29:14.190422 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_novncproxy_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2025-03-22 23:29:14.190431 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.190439 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_novncproxy', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2025-03-22 23:29:14.190467 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_novncproxy', 
'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2025-03-22 23:29:14.190480 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_novncproxy_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2025-03-22 23:29:14.190490 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.190499 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_novncproxy_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2025-03-22 23:29:14.190507 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.190516 | orchestrator | 2025-03-22 23:29:14.190524 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL users config] ********** 2025-03-22 23:29:14.190549 | orchestrator | Saturday 22 March 2025 23:26:34 +0000 (0:00:02.179) 0:06:29.700 ******** 2025-03-22 23:29:14.190558 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.190567 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.190580 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.190589 | orchestrator | 2025-03-22 23:29:14.190598 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL rules config] ********** 2025-03-22 23:29:14.190606 | orchestrator | Saturday 22 March 2025 23:26:37 +0000 (0:00:03.259) 0:06:32.959 ******** 2025-03-22 23:29:14.190615 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.190623 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.190632 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.190641 | orchestrator | 2025-03-22 23:29:14.190649 | orchestrator | TASK [nova-cell : Configure loadbalancer for nova-spicehtml5proxy] ************* 2025-03-22 23:29:14.190658 | orchestrator | Saturday 22 March 2025 23:26:42 +0000 (0:00:04.224) 0:06:37.184 ******** 2025-03-22 23:29:14.190682 | orchestrator | included: /ansible/roles/nova-cell/tasks/cell_proxy_loadbalancer.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item=nova-spicehtml5proxy) 2025-03-22 23:29:14.190692 | orchestrator | 2025-03-22 23:29:14.190701 | orchestrator | TASK [haproxy-config : Copying over nova-cell:nova-spicehtml5proxy haproxy config] *** 2025-03-22 23:29:14.190709 | orchestrator | Saturday 22 March 2025 23:26:43 +0000 (0:00:01.476) 0:06:38.660 ******** 2025-03-22 23:29:14.190718 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-03-22 23:29:14.190727 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.190736 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 
'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-03-22 23:29:14.190745 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.190754 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-03-22 23:29:14.190763 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.190771 | orchestrator | 2025-03-22 23:29:14.190780 | orchestrator | TASK [haproxy-config : Add configuration for nova-cell:nova-spicehtml5proxy when using single external frontend] *** 2025-03-22 23:29:14.190788 | orchestrator | Saturday 22 March 2025 23:26:45 +0000 (0:00:02.246) 0:06:40.907 ******** 2025-03-22 23:29:14.190825 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-03-22 23:29:14.190836 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.190845 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-03-22 23:29:14.190859 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.190868 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-03-22 23:29:14.190877 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.190886 | orchestrator | 2025-03-22 23:29:14.190895 | orchestrator | TASK [haproxy-config : Configuring firewall for nova-cell:nova-spicehtml5proxy] *** 2025-03-22 23:29:14.190903 | orchestrator | Saturday 22 March 2025 23:26:47 +0000 (0:00:01.960) 0:06:42.867 ******** 
2025-03-22 23:29:14.190912 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.190921 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.190929 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.190938 | orchestrator | 2025-03-22 23:29:14.190946 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL users config] ********** 2025-03-22 23:29:14.190955 | orchestrator | Saturday 22 March 2025 23:26:50 +0000 (0:00:02.747) 0:06:45.615 ******** 2025-03-22 23:29:14.190964 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:29:14.190973 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:29:14.190988 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:29:14.190997 | orchestrator | 2025-03-22 23:29:14.191006 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL rules config] ********** 2025-03-22 23:29:14.191014 | orchestrator | Saturday 22 March 2025 23:26:53 +0000 (0:00:03.150) 0:06:48.765 ******** 2025-03-22 23:29:14.191023 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:29:14.191031 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:29:14.191040 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:29:14.191048 | orchestrator | 2025-03-22 23:29:14.191057 | orchestrator | TASK [nova-cell : Configure loadbalancer for nova-serialproxy] ***************** 2025-03-22 23:29:14.191066 | orchestrator | Saturday 22 March 2025 23:26:57 +0000 (0:00:03.900) 0:06:52.666 ******** 2025-03-22 23:29:14.191074 | orchestrator | included: /ansible/roles/nova-cell/tasks/cell_proxy_loadbalancer.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item=nova-serialproxy) 2025-03-22 23:29:14.191083 | orchestrator | 2025-03-22 23:29:14.191092 | orchestrator | TASK [haproxy-config : Copying over nova-cell:nova-serialproxy haproxy config] *** 2025-03-22 23:29:14.191100 | orchestrator | Saturday 22 March 2025 23:26:59 +0000 (0:00:01.777) 0:06:54.443 ******** 2025-03-22 23:29:14.191109 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2025-03-22 23:29:14.191118 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.191127 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2025-03-22 23:29:14.191150 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.191180 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 
10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2025-03-22 23:29:14.191191 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.191199 | orchestrator | 2025-03-22 23:29:14.191208 | orchestrator | TASK [haproxy-config : Add configuration for nova-cell:nova-serialproxy when using single external frontend] *** 2025-03-22 23:29:14.191217 | orchestrator | Saturday 22 March 2025 23:27:01 +0000 (0:00:02.061) 0:06:56.505 ******** 2025-03-22 23:29:14.191232 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2025-03-22 23:29:14.191241 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.191250 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2025-03-22 23:29:14.191259 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.191268 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2025-03-22 23:29:14.191277 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.191286 | orchestrator | 2025-03-22 23:29:14.191294 | orchestrator | TASK [haproxy-config : Configuring firewall for nova-cell:nova-serialproxy] **** 2025-03-22 23:29:14.191303 | orchestrator | Saturday 22 March 2025 23:27:03 +0000 (0:00:01.604) 0:06:58.110 ******** 2025-03-22 23:29:14.191311 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.191320 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.191431 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.191442 | orchestrator | 2025-03-22 23:29:14.191451 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL users config] ********** 2025-03-22 23:29:14.191460 | orchestrator | Saturday 22 March 2025 23:27:05 +0000 (0:00:02.703) 0:07:00.813 ******** 2025-03-22 23:29:14.191469 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:29:14.191477 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:29:14.191486 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:29:14.191494 | orchestrator 
| 2025-03-22 23:29:14.191509 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL rules config] ********** 2025-03-22 23:29:14.191518 | orchestrator | Saturday 22 March 2025 23:27:08 +0000 (0:00:03.163) 0:07:03.977 ******** 2025-03-22 23:29:14.191526 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:29:14.191549 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:29:14.191558 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:29:14.191567 | orchestrator | 2025-03-22 23:29:14.191576 | orchestrator | TASK [include_role : octavia] ************************************************** 2025-03-22 23:29:14.191587 | orchestrator | Saturday 22 March 2025 23:27:13 +0000 (0:00:04.245) 0:07:08.223 ******** 2025-03-22 23:29:14.191596 | orchestrator | included: octavia for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.191605 | orchestrator | 2025-03-22 23:29:14.191613 | orchestrator | TASK [haproxy-config : Copying over octavia haproxy config] ******************** 2025-03-22 23:29:14.191622 | orchestrator | Saturday 22 March 2025 23:27:15 +0000 (0:00:01.980) 0:07:10.203 ******** 2025-03-22 23:29:14.191652 | orchestrator | changed: [testbed-node-0] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-api:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.191664 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-driver-agent:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2025-03-22 23:29:14.191673 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-health-manager:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2025-03-22 23:29:14.191682 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/octavia-housekeeping:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2025-03-22 23:29:14.191691 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-worker:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.191706 | orchestrator | changed: [testbed-node-1] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-api:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.191715 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-driver-agent:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2025-03-22 23:29:14.191742 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-health-manager:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2025-03-22 23:29:14.191753 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/octavia-housekeeping:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2025-03-22 23:29:14.191762 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-worker:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.191771 | orchestrator | changed: [testbed-node-2] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-api:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.191784 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-driver-agent:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2025-03-22 23:29:14.191797 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-health-manager:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2025-03-22 23:29:14.191824 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 
'registry.osism.tech/kolla/release/octavia-housekeeping:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2025-03-22 23:29:14.191835 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-worker:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.191844 | orchestrator | 2025-03-22 23:29:14.191853 | orchestrator | TASK [haproxy-config : Add configuration for octavia when using single external frontend] *** 2025-03-22 23:29:14.191862 | orchestrator | Saturday 22 March 2025 23:27:20 +0000 (0:00:05.115) 0:07:15.318 ******** 2025-03-22 23:29:14.191871 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-api:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.191884 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-driver-agent:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2025-03-22 23:29:14.191893 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-health-manager:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': 
'30'}}})  2025-03-22 23:29:14.191902 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-housekeeping:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2025-03-22 23:29:14.191928 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-worker:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.191940 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.191948 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-api:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.191957 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-driver-agent:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2025-03-22 23:29:14.191971 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-health-manager:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 
'timeout': '30'}}})  2025-03-22 23:29:14.191980 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-housekeeping:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2025-03-22 23:29:14.191989 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-worker:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.191998 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.192026 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-api:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.192037 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-driver-agent:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2025-03-22 23:29:14.192046 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-health-manager:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 
3306'], 'timeout': '30'}}})  2025-03-22 23:29:14.192059 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-housekeeping:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2025-03-22 23:29:14.192068 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/octavia-worker:14.0.1.20241206', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2025-03-22 23:29:14.192077 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.192086 | orchestrator | 2025-03-22 23:29:14.192094 | orchestrator | TASK [haproxy-config : Configuring firewall for octavia] *********************** 2025-03-22 23:29:14.192103 | orchestrator | Saturday 22 March 2025 23:27:21 +0000 (0:00:01.130) 0:07:16.449 ******** 2025-03-22 23:29:14.192111 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})  2025-03-22 23:29:14.192120 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})  2025-03-22 23:29:14.192129 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.192138 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})  2025-03-22 23:29:14.192147 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})  2025-03-22 23:29:14.192155 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.192182 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})  2025-03-22 23:29:14.192193 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})  2025-03-22 23:29:14.192202 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.192210 | orchestrator | 2025-03-22 23:29:14.192219 | orchestrator | TASK [proxysql-config : Copying over octavia ProxySQL 
users config] ************ 2025-03-22 23:29:14.192228 | orchestrator | Saturday 22 March 2025 23:27:22 +0000 (0:00:01.488) 0:07:17.938 ******** 2025-03-22 23:29:14.192236 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.192244 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.192259 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.192268 | orchestrator | 2025-03-22 23:29:14.192276 | orchestrator | TASK [proxysql-config : Copying over octavia ProxySQL rules config] ************ 2025-03-22 23:29:14.192285 | orchestrator | Saturday 22 March 2025 23:27:24 +0000 (0:00:01.609) 0:07:19.547 ******** 2025-03-22 23:29:14.192293 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.192302 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.192311 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.192320 | orchestrator | 2025-03-22 23:29:14.192328 | orchestrator | TASK [include_role : opensearch] *********************************************** 2025-03-22 23:29:14.192336 | orchestrator | Saturday 22 March 2025 23:27:27 +0000 (0:00:02.743) 0:07:22.291 ******** 2025-03-22 23:29:14.192345 | orchestrator | included: opensearch for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.192353 | orchestrator | 2025-03-22 23:29:14.192362 | orchestrator | TASK [haproxy-config : Copying over opensearch haproxy config] ***************** 2025-03-22 23:29:14.192370 | orchestrator | Saturday 22 March 2025 23:27:29 +0000 (0:00:02.006) 0:07:24.298 ******** 2025-03-22 23:29:14.192379 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/opensearch:2.18.0.20241206', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-03-22 23:29:14.192388 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/opensearch:2.18.0.20241206', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-03-22 23:29:14.192398 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/opensearch:2.18.0.20241206', 
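Each item={'key': ..., 'value': ...} pair printed by the haproxy-config tasks above is one entry of a kolla-ansible service map pushed through a dict2items-style loop: the key is the service name and the value is its full definition (container_name, image, volumes, healthcheck, and the haproxy sub-dict that the role actually renders). A minimal Python sketch of that shape, with values trimmed from the logged octavia entries (an illustration, not the role's actual defaults file):

```python
# Trimmed reconstruction of two of the logged service definitions
# (values shortened for readability; taken from the items above).
octavia_services = {
    "octavia-api": {
        "container_name": "octavia_api",
        "enabled": True,
        "image": "registry.osism.tech/kolla/release/octavia-api:14.0.1.20241206",
        "haproxy": {
            "octavia_api": {
                "enabled": "yes", "mode": "http", "external": False,
                "port": "9876", "listen_port": "9876",
            },
            "octavia_api_external": {
                "enabled": "yes", "mode": "http", "external": True,
                "external_fqdn": "api.testbed.osism.xyz",
                "port": "9876", "listen_port": "9876",
            },
        },
    },
    "octavia-worker": {
        "container_name": "octavia_worker",
        "enabled": True,
        "image": "registry.osism.tech/kolla/release/octavia-worker:14.0.1.20241206",
        # no 'haproxy' sub-dict: nothing for the haproxy templates to render
    },
}

# Equivalent of Ansible's dict2items filter: every loop item carries a 'key'
# and a 'value', which is exactly the shape printed in the task output.
items = [{"key": name, "value": svc} for name, svc in octavia_services.items()]
for item in items:
    print(item["key"], "->", sorted(item["value"]))
```

Entries without a haproxy sub-dict (workers, agents, most exporters) give the role nothing to render for that service.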
'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-03-22 23:29:14.192426 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/opensearch-dashboards:2.18.0.20241206', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-03-22 23:29:14.192443 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/opensearch-dashboards:2.18.0.20241206', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-03-22 23:29:14.192453 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/opensearch-dashboards:2.18.0.20241206', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': 
'30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-03-22 23:29:14.192462 | orchestrator | 2025-03-22 23:29:14.192471 | orchestrator | TASK [haproxy-config : Add configuration for opensearch when using single external frontend] *** 2025-03-22 23:29:14.192480 | orchestrator | Saturday 22 March 2025 23:27:37 +0000 (0:00:08.357) 0:07:32.655 ******** 2025-03-22 23:29:14.192506 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/opensearch:2.18.0.20241206', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}})  2025-03-22 23:29:14.192545 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/opensearch-dashboards:2.18.0.20241206', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}})  2025-03-22 23:29:14.192557 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.192566 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/opensearch:2.18.0.20241206', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': 
False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}})  2025-03-22 23:29:14.192575 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/opensearch-dashboards:2.18.0.20241206', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}})  2025-03-22 23:29:14.192585 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.192594 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/opensearch:2.18.0.20241206', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}})  2025-03-22 23:29:14.192638 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/release/opensearch-dashboards:2.18.0.20241206', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}})  2025-03-22 23:29:14.192650 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.192658 | orchestrator | 2025-03-22 23:29:14.192667 | orchestrator | TASK [haproxy-config : Configuring firewall for opensearch] ******************** 2025-03-22 23:29:14.192676 | orchestrator | Saturday 22 March 2025 
23:27:38 +0000 (0:00:01.067) 0:07:33.722 ******** 2025-03-22 23:29:14.192685 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}})  2025-03-22 23:29:14.192694 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}})  2025-03-22 23:29:14.192703 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch_dashboards_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}})  2025-03-22 23:29:14.192712 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.192721 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}})  2025-03-22 23:29:14.192730 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}})  2025-03-22 23:29:14.192738 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch_dashboards_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}})  2025-03-22 23:29:14.192747 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.192762 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}})  2025-03-22 23:29:14.192770 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}})  2025-03-22 23:29:14.192779 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch_dashboards_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}})  2025-03-22 23:29:14.192788 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.192797 | orchestrator | 2025-03-22 23:29:14.192806 | orchestrator | TASK [proxysql-config : Copying over opensearch ProxySQL users config] ********* 2025-03-22 23:29:14.192819 | orchestrator | Saturday 22 March 2025 23:27:40 +0000 (0:00:01.893) 0:07:35.616 ******** 2025-03-22 23:29:14.192828 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.192836 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.192845 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.192853 | orchestrator | 2025-03-22 23:29:14.192862 | orchestrator | TASK [proxysql-config : Copying over opensearch ProxySQL rules config] ********* 2025-03-22 23:29:14.192870 | orchestrator | Saturday 22 March 2025 23:27:41 +0000 (0:00:00.510) 0:07:36.126 ******** 2025-03-22 23:29:14.192879 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.192887 | orchestrator 
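Several of the definitions above carry a healthcheck block (interval, retries, start_period, timeout, plus a CMD-SHELL test such as healthcheck_curl http://192.168.16.12:9200); healthcheck_curl and healthcheck_port are helper scripts shipped inside the kolla images. Assuming the plain numeric fields are seconds, a rough sketch of how such a block maps onto ordinary docker run health options (an illustration only, not what the kolla_docker module actually executes):

```python
# Sketch: turn a kolla-style healthcheck dict (copied from the opensearch
# definition above) into docker-run style health options.
# Assumption: the plain numeric strings are seconds / retry counts.
healthcheck = {
    "interval": "30",
    "retries": "3",
    "start_period": "5",
    "test": ["CMD-SHELL", "healthcheck_curl http://192.168.16.12:9200"],
    "timeout": "30",
}

def to_docker_flags(hc: dict) -> list[str]:
    """Format the dict as --health-* flags for `docker run` (illustration)."""
    return [
        f"--health-cmd={hc['test'][1]!r}",       # the CMD-SHELL payload
        f"--health-interval={hc['interval']}s",
        f"--health-retries={hc['retries']}",
        f"--health-start-period={hc['start_period']}s",
        f"--health-timeout={hc['timeout']}s",
    ]

print(" ".join(to_docker_flags(healthcheck)))
```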
| skipping: [testbed-node-1] 2025-03-22 23:29:14.192896 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.192904 | orchestrator | 2025-03-22 23:29:14.192913 | orchestrator | TASK [include_role : prometheus] *********************************************** 2025-03-22 23:29:14.192921 | orchestrator | Saturday 22 March 2025 23:27:43 +0000 (0:00:01.907) 0:07:38.033 ******** 2025-03-22 23:29:14.192948 | orchestrator | included: prometheus for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.192959 | orchestrator | 2025-03-22 23:29:14.192968 | orchestrator | TASK [haproxy-config : Copying over prometheus haproxy config] ***************** 2025-03-22 23:29:14.192976 | orchestrator | Saturday 22 March 2025 23:27:45 +0000 (0:00:02.302) 0:07:40.336 ******** 2025-03-22 23:29:14.192985 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-v2-server:2.50.1.20241206', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}}) 2025-03-22 23:29:14.192994 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-node-exporter:1.7.0.20241206', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-03-22 23:29:14.193003 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-mysqld-exporter:0.15.1.20241206', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193012 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-memcached-exporter:0.14.2.20241206', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193021 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-cadvisor:0.49.1.20241206', 
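Note the asymmetry in the proxysql-config results: the octavia users and rules files came back changed on all three nodes, while the opensearch (and, below, prometheus) equivalents are skipped. A plausible reading, hedged because the task's conditional is not visible in this log, is that only services with their own MariaDB schema need ProxySQL user and routing-rule entries; a toy filter expressing that idea (the has_mariadb_schema key is hypothetical, not a kolla-ansible variable):

```python
# Toy illustration of a hedged assumption: only services that own a MariaDB
# schema get ProxySQL users/rules rendered, matching changed vs. skipping above.
# The 'has_mariadb_schema' key is hypothetical, not a kolla-ansible variable.
services = {
    "octavia": {"has_mariadb_schema": True},
    "opensearch": {"has_mariadb_schema": False},  # keeps data in OpenSearch itself
    "prometheus": {"has_mariadb_schema": False},  # keeps data in its own TSDB
}

needs_proxysql = [name for name, svc in services.items() if svc["has_mariadb_schema"]]
print("proxysql users/rules rendered for:", needs_proxysql)  # -> ['octavia']
```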
'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-03-22 23:29:14.193046 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-v2-server:2.50.1.20241206', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}}) 2025-03-22 23:29:14.193075 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-node-exporter:1.7.0.20241206', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-03-22 23:29:14.193085 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-mysqld-exporter:0.15.1.20241206', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193095 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-memcached-exporter:0.14.2.20241206', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193104 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-cadvisor:0.49.1.20241206', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-03-22 23:29:14.193113 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-server', 'value': {'container_name': 
'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-v2-server:2.50.1.20241206', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}}) 2025-03-22 23:29:14.193126 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-node-exporter:1.7.0.20241206', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-03-22 23:29:14.193135 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-mysqld-exporter:0.15.1.20241206', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193162 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-memcached-exporter:0.14.2.20241206', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193173 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-cadvisor:0.49.1.20241206', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-03-22 23:29:14.193183 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-alertmanager:0.27.0.20241206', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 
'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}}) 2025-03-22 23:29:14.193192 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/release/prometheus-openstack-exporter:8.1.0.20241206', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}}}})  2025-03-22 23:29:14.193206 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-elasticsearch-exporter:1.7.0.20241206', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193215 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-blackbox-exporter:0.24.0.20241206', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193244 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-libvirt-exporter:8.1.0.20241206', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-03-22 23:29:14.193255 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-msteams', 'value': {'container_name': 'prometheus_msteams', 'group': 'prometheus-msteams', 'enabled': False, 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'image': 'index.docker.io/kolla/release/prometheus-msteams:2.50.1.20241206', 'volumes': ['/etc/kolla/prometheus-msteams/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193264 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-alertmanager:0.27.0.20241206', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}}) 2025-03-22 23:29:14.193274 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/release/prometheus-openstack-exporter:8.1.0.20241206', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}}}})  2025-03-22 23:29:14.193297 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-elasticsearch-exporter:1.7.0.20241206', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193306 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-blackbox-exporter:0.24.0.20241206', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193315 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-libvirt-exporter:8.1.0.20241206', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
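The haproxy sub-dicts in the prometheus definitions above encode what ends up exposed: prometheus_server gets an internal-only frontend on 9091 with active_passive set (so only one backend serves at a time), while prometheus_alertmanager additionally gets an external frontend on 9093 behind HTTP basic auth. A small sketch that pulls those frontend facts out of the logged structure (values copied from the output, with the password replaced by a placeholder):

```python
# HAProxy frontend entries as carried by the prometheus definitions above
# (values copied from the log; auth_pass deliberately replaced by a placeholder).
haproxy_entries = {
    "prometheus_server": {
        "enabled": True, "mode": "http", "external": False,
        "port": "9091", "active_passive": True,
    },
    "prometheus_server_external": {
        "enabled": False, "mode": "http", "external": True,
        "external_fqdn": "api.testbed.osism.xyz",
        "port": "9091", "listen_port": "9091", "active_passive": True,
    },
    "prometheus_alertmanager": {
        "enabled": True, "mode": "http", "external": False,
        "port": "9093", "auth_user": "admin", "auth_pass": "<redacted>",
        "active_passive": True,
    },
    "prometheus_alertmanager_external": {
        "enabled": True, "mode": "http", "external": True,
        "external_fqdn": "api.testbed.osism.xyz",
        "port": "9093", "listen_port": "9093",
        "auth_user": "admin", "auth_pass": "<redacted>", "active_passive": True,
    },
}

for name, fe in haproxy_entries.items():
    if not fe["enabled"]:
        continue  # e.g. prometheus_server_external is defined but not exposed
    scope = "external" if fe["external"] else "internal"
    auth = " with basic auth" if "auth_user" in fe else ""
    ap = " (active/passive backends)" if fe.get("active_passive") else ""
    print(f"{name}: {scope} frontend on port {fe['port']}{auth}{ap}")
```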
'/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-03-22 23:29:14.193328 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-msteams', 'value': {'container_name': 'prometheus_msteams', 'group': 'prometheus-msteams', 'enabled': False, 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'image': 'index.docker.io/kolla/release/prometheus-msteams:2.50.1.20241206', 'volumes': ['/etc/kolla/prometheus-msteams/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193337 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-alertmanager:0.27.0.20241206', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}}) 2025-03-22 23:29:14.193347 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/release/prometheus-openstack-exporter:8.1.0.20241206', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}}}})  2025-03-22 23:29:14.193361 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-elasticsearch-exporter:1.7.0.20241206', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193370 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-blackbox-exporter:0.24.0.20241206', 'volumes': 
['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193379 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-libvirt-exporter:8.1.0.20241206', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-03-22 23:29:14.193392 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-msteams', 'value': {'container_name': 'prometheus_msteams', 'group': 'prometheus-msteams', 'enabled': False, 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'image': 'index.docker.io/kolla/release/prometheus-msteams:2.50.1.20241206', 'volumes': ['/etc/kolla/prometheus-msteams/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193401 | orchestrator | 2025-03-22 23:29:14.193410 | orchestrator | TASK [haproxy-config : Add configuration for prometheus when using single external frontend] *** 2025-03-22 23:29:14.193419 | orchestrator | Saturday 22 March 2025 23:27:51 +0000 (0:00:05.890) 0:07:46.227 ******** 2025-03-22 23:29:14.193428 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-v2-server:2.50.1.20241206', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}})  2025-03-22 23:29:14.193437 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-node-exporter:1.7.0.20241206', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-03-22 23:29:14.193450 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-mysqld-exporter:0.15.1.20241206', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': 
{}}})  2025-03-22 23:29:14.193467 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-memcached-exporter:0.14.2.20241206', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193477 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-cadvisor:0.49.1.20241206', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-03-22 23:29:14.193490 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-alertmanager:0.27.0.20241206', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}})  2025-03-22 23:29:14.193499 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/release/prometheus-openstack-exporter:8.1.0.20241206', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}}}})  2025-03-22 23:29:14.193509 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-elasticsearch-exporter:1.7.0.20241206', 'volumes': 
['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193522 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-blackbox-exporter:0.24.0.20241206', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193581 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-libvirt-exporter:8.1.0.20241206', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-03-22 23:29:14.193592 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-msteams', 'value': {'container_name': 'prometheus_msteams', 'group': 'prometheus-msteams', 'enabled': False, 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'image': 'index.docker.io/kolla/release/prometheus-msteams:2.50.1.20241206', 'volumes': ['/etc/kolla/prometheus-msteams/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193601 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.193615 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-v2-server:2.50.1.20241206', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}})  2025-03-22 23:29:14.193624 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-node-exporter:1.7.0.20241206', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-03-22 23:29:14.193633 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 
'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-mysqld-exporter:0.15.1.20241206', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193649 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-memcached-exporter:0.14.2.20241206', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193665 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-cadvisor:0.49.1.20241206', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-03-22 23:29:14.193674 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-alertmanager:0.27.0.20241206', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}})  2025-03-22 23:29:14.193687 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/release/prometheus-openstack-exporter:8.1.0.20241206', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}}}})  2025-03-22 23:29:14.193697 | orchestrator | skipping: [testbed-node-1] => (item={'key': 
'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-elasticsearch-exporter:1.7.0.20241206', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193706 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-blackbox-exporter:0.24.0.20241206', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193721 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-libvirt-exporter:8.1.0.20241206', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-03-22 23:29:14.193735 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-msteams', 'value': {'container_name': 'prometheus_msteams', 'group': 'prometheus-msteams', 'enabled': False, 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'image': 'index.docker.io/kolla/release/prometheus-msteams:2.50.1.20241206', 'volumes': ['/etc/kolla/prometheus-msteams/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193744 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.193753 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-v2-server:2.50.1.20241206', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}})  2025-03-22 23:29:14.193762 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-node-exporter:1.7.0.20241206', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-03-22 23:29:14.193771 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-mysqld-exporter:0.15.1.20241206', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193784 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-memcached-exporter:0.14.2.20241206', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193796 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-cadvisor:0.49.1.20241206', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-03-22 23:29:14.193815 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-alertmanager:0.27.0.20241206', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}})  2025-03-22 23:29:14.193825 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/release/prometheus-openstack-exporter:8.1.0.20241206', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}, 
'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}}}})  2025-03-22 23:29:14.193833 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-elasticsearch-exporter:1.7.0.20241206', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193842 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-blackbox-exporter:0.24.0.20241206', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193856 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/prometheus-libvirt-exporter:8.1.0.20241206', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-03-22 23:29:14.193871 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-msteams', 'value': {'container_name': 'prometheus_msteams', 'group': 'prometheus-msteams', 'enabled': False, 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'image': 'index.docker.io/kolla/release/prometheus-msteams:2.50.1.20241206', 'volumes': ['/etc/kolla/prometheus-msteams/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-03-22 23:29:14.193884 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.193893 | orchestrator | 2025-03-22 23:29:14.193902 | orchestrator | TASK [haproxy-config : Configuring firewall for prometheus] ******************** 2025-03-22 23:29:14.193924 | orchestrator | Saturday 22 March 2025 23:27:53 +0000 (0:00:01.774) 0:07:48.001 ******** 2025-03-22 23:29:14.193934 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}})  2025-03-22 23:29:14.193943 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_server_external', 'value': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}})  2025-03-22 23:29:14.193953 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}})  2025-03-22 23:29:14.193962 | orchestrator | skipping: [testbed-node-0] => 
(item={'key': 'prometheus_alertmanager', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}})  2025-03-22 23:29:14.193971 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_server_external', 'value': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}})  2025-03-22 23:29:14.193979 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_alertmanager_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}})  2025-03-22 23:29:14.193988 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.193997 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_alertmanager', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}})  2025-03-22 23:29:14.194006 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_alertmanager_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}})  2025-03-22 23:29:14.194035 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.194044 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}})  2025-03-22 23:29:14.194057 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_server_external', 'value': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}})  2025-03-22 23:29:14.194067 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_alertmanager', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}})  2025-03-22 23:29:14.194080 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_alertmanager_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}})  2025-03-22 23:29:14.194089 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.194101 | orchestrator | 2025-03-22 23:29:14.194109 | orchestrator | TASK [proxysql-config : Copying over prometheus ProxySQL users config] ********* 2025-03-22 23:29:14.194117 | orchestrator | Saturday 22 March 2025 23:27:54 +0000 (0:00:01.995) 0:07:49.996 ******** 2025-03-22 23:29:14.194125 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.194133 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.194145 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.194153 | orchestrator | 2025-03-22 23:29:14.194161 | orchestrator | TASK [proxysql-config : 
Copying over prometheus ProxySQL rules config] ********* 2025-03-22 23:29:14.194168 | orchestrator | Saturday 22 March 2025 23:27:55 +0000 (0:00:00.924) 0:07:50.921 ******** 2025-03-22 23:29:14.194176 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.194184 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.194192 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.194200 | orchestrator | 2025-03-22 23:29:14.194208 | orchestrator | TASK [include_role : rabbitmq] ************************************************* 2025-03-22 23:29:14.194216 | orchestrator | Saturday 22 March 2025 23:27:58 +0000 (0:00:02.293) 0:07:53.214 ******** 2025-03-22 23:29:14.194224 | orchestrator | included: rabbitmq for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.194232 | orchestrator | 2025-03-22 23:29:14.194240 | orchestrator | TASK [haproxy-config : Copying over rabbitmq haproxy config] ******************* 2025-03-22 23:29:14.194248 | orchestrator | Saturday 22 March 2025 23:28:00 +0000 (0:00:02.182) 0:07:55.397 ******** 2025-03-22 23:29:14.194256 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-03-22 23:29:14.194265 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-03-22 23:29:14.194283 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-03-22 23:29:14.194299 | orchestrator | 2025-03-22 23:29:14.194307 | orchestrator | TASK [haproxy-config : Add configuration for rabbitmq when using single external frontend] *** 2025-03-22 23:29:14.194315 | orchestrator | Saturday 22 March 2025 23:28:03 +0000 (0:00:03.308) 0:07:58.705 ******** 2025-03-22 23:29:14.194323 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2025-03-22 23:29:14.194332 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.194340 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2025-03-22 23:29:14.194348 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.194356 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/release/rabbitmq:3.13.7.20241206', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2025-03-22 23:29:14.194370 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.194378 | orchestrator | 2025-03-22 23:29:14.194386 | orchestrator | TASK [haproxy-config : Configuring firewall for rabbitmq] ********************** 2025-03-22 23:29:14.194399 | orchestrator | Saturday 22 March 2025 23:28:04 +0000 (0:00:00.997) 0:07:59.702 ******** 2025-03-22 23:29:14.194407 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'rabbitmq_management', 'value': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}})  2025-03-22 23:29:14.194415 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'rabbitmq_management', 'value': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}})  2025-03-22 23:29:14.194423 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.194431 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.194442 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'rabbitmq_management', 'value': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}})  2025-03-22 23:29:14.194450 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.194458 | orchestrator | 2025-03-22 23:29:14.194466 | orchestrator | TASK [proxysql-config : Copying over rabbitmq ProxySQL users config] *********** 2025-03-22 23:29:14.194474 | orchestrator | Saturday 22 March 2025 23:28:06 +0000 (0:00:01.329) 0:08:01.032 ******** 2025-03-22 23:29:14.194482 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.194490 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.194498 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.194506 | orchestrator | 2025-03-22 23:29:14.194514 | orchestrator | TASK [proxysql-config : Copying over rabbitmq ProxySQL rules config] *********** 2025-03-22 23:29:14.194522 | orchestrator | Saturday 22 March 2025 23:28:06 +0000 (0:00:00.542) 0:08:01.574 ******** 2025-03-22 23:29:14.194546 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.194555 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.194563 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.194571 | orchestrator | 2025-03-22 23:29:14.194579 | orchestrator | TASK [include_role : skyline] ************************************************** 2025-03-22 23:29:14.194587 | orchestrator | Saturday 22 March 2025 23:28:08 +0000 (0:00:02.050) 0:08:03.625 ******** 2025-03-22 23:29:14.194595 | orchestrator | included: skyline for testbed-node-0, testbed-node-1, testbed-node-2 2025-03-22 23:29:14.194603 | orchestrator | 2025-03-22 23:29:14.194611 | orchestrator | TASK [haproxy-config : Copying over skyline haproxy config] ******************** 2025-03-22 23:29:14.194619 | orchestrator | Saturday 22 March 2025 23:28:11 +0000 (0:00:02.383) 0:08:06.008 ******** 2025-03-22 23:29:14.194627 | orchestrator | changed: [testbed-node-0] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 
'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/skyline-apiserver:4.0.2.20241206', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.194636 | orchestrator | changed: [testbed-node-1] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/skyline-apiserver:4.0.2.20241206', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.194649 | orchestrator | changed: [testbed-node-2] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/skyline-apiserver:4.0.2.20241206', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.194661 | orchestrator | changed: [testbed-node-0] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/skyline-console:4.0.2.20241206', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': 
'9999', 'tls_backend': 'no'}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.194676 | orchestrator | changed: [testbed-node-2] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/skyline-console:4.0.2.20241206', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.194685 | orchestrator | changed: [testbed-node-1] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/skyline-console:4.0.2.20241206', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}}}}) 2025-03-22 23:29:14.194697 | orchestrator | 2025-03-22 23:29:14.194706 | orchestrator | TASK [haproxy-config : Add configuration for skyline when using single external frontend] *** 2025-03-22 23:29:14.194714 | orchestrator | Saturday 22 March 2025 23:28:21 +0000 (0:00:10.467) 0:08:16.476 ******** 2025-03-22 23:29:14.194722 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/skyline-apiserver:4.0.2.20241206', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.194734 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline-console', 'value': 
{'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/skyline-console:4.0.2.20241206', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.194742 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.194758 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/skyline-apiserver:4.0.2.20241206', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.194766 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/skyline-console:4.0.2.20241206', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.194779 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/skyline-apiserver:4.0.2.20241206', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': 
{'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.194787 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.194801 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/release/skyline-console:4.0.2.20241206', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}}}})  2025-03-22 23:29:14.194810 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.194818 | orchestrator | 2025-03-22 23:29:14.194826 | orchestrator | TASK [haproxy-config : Configuring firewall for skyline] *********************** 2025-03-22 23:29:14.194834 | orchestrator | Saturday 22 March 2025 23:28:22 +0000 (0:00:01.306) 0:08:17.783 ******** 2025-03-22 23:29:14.194842 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_apiserver', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}})  2025-03-22 23:29:14.194850 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_apiserver', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}})  2025-03-22 23:29:14.194858 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_apiserver_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}})  2025-03-22 23:29:14.194866 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_apiserver_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}})  2025-03-22 23:29:14.194875 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_console', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}})  2025-03-22 23:29:14.194882 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_console', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}})  2025-03-22 23:29:14.194891 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_console_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}})  2025-03-22 23:29:14.194903 | orchestrator | skipping: 
[testbed-node-0] 2025-03-22 23:29:14.194911 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_console_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}})  2025-03-22 23:29:14.194919 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.194927 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_apiserver', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}})  2025-03-22 23:29:14.194935 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_apiserver_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}})  2025-03-22 23:29:14.194944 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_console', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}})  2025-03-22 23:29:14.194952 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_console_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}})  2025-03-22 23:29:14.194960 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.194968 | orchestrator | 2025-03-22 23:29:14.194976 | orchestrator | TASK [proxysql-config : Copying over skyline ProxySQL users config] ************ 2025-03-22 23:29:14.194984 | orchestrator | Saturday 22 March 2025 23:28:24 +0000 (0:00:01.807) 0:08:19.590 ******** 2025-03-22 23:29:14.194992 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.195000 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.195008 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.195016 | orchestrator | 2025-03-22 23:29:14.195024 | orchestrator | TASK [proxysql-config : Copying over skyline ProxySQL rules config] ************ 2025-03-22 23:29:14.195032 | orchestrator | Saturday 22 March 2025 23:28:26 +0000 (0:00:01.706) 0:08:21.297 ******** 2025-03-22 23:29:14.195039 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.195047 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.195055 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.195063 | orchestrator | 2025-03-22 23:29:14.195071 | orchestrator | TASK [include_role : swift] **************************************************** 2025-03-22 23:29:14.195082 | orchestrator | Saturday 22 March 2025 23:28:29 +0000 (0:00:02.908) 0:08:24.206 ******** 2025-03-22 23:29:14.195090 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.195098 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.195109 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.195117 | orchestrator | 2025-03-22 23:29:14.195126 | orchestrator | TASK [include_role : tacker] *************************************************** 2025-03-22 23:29:14.195134 | orchestrator | Saturday 22 March 2025 23:28:29 +0000 (0:00:00.480) 0:08:24.686 ******** 2025-03-22 23:29:14.195142 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.195150 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.195158 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.195166 | orchestrator | 2025-03-22 23:29:14.195174 | orchestrator 
| TASK [include_role : trove] **************************************************** 2025-03-22 23:29:14.195182 | orchestrator | Saturday 22 March 2025 23:28:30 +0000 (0:00:00.719) 0:08:25.406 ******** 2025-03-22 23:29:14.195190 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.195198 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.195205 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.195213 | orchestrator | 2025-03-22 23:29:14.195221 | orchestrator | TASK [include_role : venus] **************************************************** 2025-03-22 23:29:14.195229 | orchestrator | Saturday 22 March 2025 23:28:31 +0000 (0:00:00.674) 0:08:26.080 ******** 2025-03-22 23:29:14.195252 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.195260 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.195268 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.195276 | orchestrator | 2025-03-22 23:29:14.195285 | orchestrator | TASK [include_role : watcher] ************************************************** 2025-03-22 23:29:14.195293 | orchestrator | Saturday 22 March 2025 23:28:31 +0000 (0:00:00.669) 0:08:26.750 ******** 2025-03-22 23:29:14.195301 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.195309 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.195317 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.195325 | orchestrator | 2025-03-22 23:29:14.195333 | orchestrator | TASK [include_role : zun] ****************************************************** 2025-03-22 23:29:14.195341 | orchestrator | Saturday 22 March 2025 23:28:32 +0000 (0:00:00.369) 0:08:27.120 ******** 2025-03-22 23:29:14.195349 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.195357 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.195365 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.195373 | orchestrator | 2025-03-22 23:29:14.195381 | orchestrator | RUNNING HANDLER [loadbalancer : Check IP addresses on the API interface] ******* 2025-03-22 23:29:14.195389 | orchestrator | Saturday 22 March 2025 23:28:33 +0000 (0:00:01.195) 0:08:28.315 ******** 2025-03-22 23:29:14.195397 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:29:14.195405 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:29:14.195413 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:29:14.195421 | orchestrator | 2025-03-22 23:29:14.195429 | orchestrator | RUNNING HANDLER [loadbalancer : Group HA nodes by status] ********************** 2025-03-22 23:29:14.195437 | orchestrator | Saturday 22 March 2025 23:28:34 +0000 (0:00:01.138) 0:08:29.454 ******** 2025-03-22 23:29:14.195445 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:29:14.195453 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:29:14.195461 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:29:14.195482 | orchestrator | 2025-03-22 23:29:14.195490 | orchestrator | RUNNING HANDLER [loadbalancer : Stop backup keepalived container] ************** 2025-03-22 23:29:14.195499 | orchestrator | Saturday 22 March 2025 23:28:34 +0000 (0:00:00.446) 0:08:29.900 ******** 2025-03-22 23:29:14.195507 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:29:14.195515 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:29:14.195524 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:29:14.195546 | orchestrator | 2025-03-22 23:29:14.195555 | orchestrator | RUNNING HANDLER [loadbalancer : Stop backup haproxy container] ***************** 2025-03-22 23:29:14.195563 | 
orchestrator | Saturday 22 March 2025 23:28:36 +0000 (0:00:01.414) 0:08:31.315 ******** 2025-03-22 23:29:14.195571 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:29:14.195579 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:29:14.195587 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:29:14.195595 | orchestrator | 2025-03-22 23:29:14.195603 | orchestrator | RUNNING HANDLER [loadbalancer : Stop backup proxysql container] **************** 2025-03-22 23:29:14.195611 | orchestrator | Saturday 22 March 2025 23:28:37 +0000 (0:00:01.331) 0:08:32.647 ******** 2025-03-22 23:29:14.195620 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:29:14.195628 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:29:14.195636 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:29:14.195644 | orchestrator | 2025-03-22 23:29:14.195652 | orchestrator | RUNNING HANDLER [loadbalancer : Start backup haproxy container] **************** 2025-03-22 23:29:14.195660 | orchestrator | Saturday 22 March 2025 23:28:38 +0000 (0:00:01.046) 0:08:33.693 ******** 2025-03-22 23:29:14.195668 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.195676 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.195684 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.195692 | orchestrator | 2025-03-22 23:29:14.195701 | orchestrator | RUNNING HANDLER [loadbalancer : Wait for backup haproxy to start] ************** 2025-03-22 23:29:14.195709 | orchestrator | Saturday 22 March 2025 23:28:45 +0000 (0:00:06.692) 0:08:40.385 ******** 2025-03-22 23:29:14.195717 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:29:14.195738 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:29:14.195747 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:29:14.195755 | orchestrator | 2025-03-22 23:29:14.195764 | orchestrator | RUNNING HANDLER [loadbalancer : Start backup proxysql container] *************** 2025-03-22 23:29:14.195772 | orchestrator | Saturday 22 March 2025 23:28:48 +0000 (0:00:03.196) 0:08:43.582 ******** 2025-03-22 23:29:14.195780 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.195788 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.195796 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.195804 | orchestrator | 2025-03-22 23:29:14.195812 | orchestrator | RUNNING HANDLER [loadbalancer : Wait for backup proxysql to start] ************* 2025-03-22 23:29:14.195820 | orchestrator | Saturday 22 March 2025 23:28:56 +0000 (0:00:07.878) 0:08:51.460 ******** 2025-03-22 23:29:14.195828 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:29:14.195836 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:29:14.195844 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:29:14.195852 | orchestrator | 2025-03-22 23:29:14.195860 | orchestrator | RUNNING HANDLER [loadbalancer : Start backup keepalived container] ************* 2025-03-22 23:29:14.195868 | orchestrator | Saturday 22 March 2025 23:29:00 +0000 (0:00:03.818) 0:08:55.279 ******** 2025-03-22 23:29:14.195876 | orchestrator | changed: [testbed-node-0] 2025-03-22 23:29:14.195884 | orchestrator | changed: [testbed-node-2] 2025-03-22 23:29:14.195892 | orchestrator | changed: [testbed-node-1] 2025-03-22 23:29:14.195900 | orchestrator | 2025-03-22 23:29:14.195911 | orchestrator | RUNNING HANDLER [loadbalancer : Stop master haproxy container] ***************** 2025-03-22 23:29:14.195923 | orchestrator | Saturday 22 March 2025 23:29:05 +0000 (0:00:05.310) 0:09:00.589 ******** 2025-03-22 23:29:14.195931 | orchestrator | skipping: 
[testbed-node-0] 2025-03-22 23:29:14.195939 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.195947 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.195955 | orchestrator | 2025-03-22 23:29:14.195963 | orchestrator | RUNNING HANDLER [loadbalancer : Stop master proxysql container] **************** 2025-03-22 23:29:14.195971 | orchestrator | Saturday 22 March 2025 23:29:06 +0000 (0:00:00.736) 0:09:01.326 ******** 2025-03-22 23:29:14.195979 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.195987 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.195995 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.196003 | orchestrator | 2025-03-22 23:29:14.196011 | orchestrator | RUNNING HANDLER [loadbalancer : Stop master keepalived container] ************** 2025-03-22 23:29:14.196019 | orchestrator | Saturday 22 March 2025 23:29:07 +0000 (0:00:00.777) 0:09:02.104 ******** 2025-03-22 23:29:14.196027 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.196035 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.196043 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.196051 | orchestrator | 2025-03-22 23:29:14.196059 | orchestrator | RUNNING HANDLER [loadbalancer : Start master haproxy container] **************** 2025-03-22 23:29:14.196067 | orchestrator | Saturday 22 March 2025 23:29:07 +0000 (0:00:00.404) 0:09:02.508 ******** 2025-03-22 23:29:14.196075 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.196083 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.196091 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.196099 | orchestrator | 2025-03-22 23:29:14.196107 | orchestrator | RUNNING HANDLER [loadbalancer : Start master proxysql container] *************** 2025-03-22 23:29:14.196115 | orchestrator | Saturday 22 March 2025 23:29:08 +0000 (0:00:00.694) 0:09:03.203 ******** 2025-03-22 23:29:14.196123 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.196131 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.196139 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.196147 | orchestrator | 2025-03-22 23:29:14.196155 | orchestrator | RUNNING HANDLER [loadbalancer : Start master keepalived container] ************* 2025-03-22 23:29:14.196163 | orchestrator | Saturday 22 March 2025 23:29:08 +0000 (0:00:00.731) 0:09:03.934 ******** 2025-03-22 23:29:14.196171 | orchestrator | skipping: [testbed-node-0] 2025-03-22 23:29:14.196179 | orchestrator | skipping: [testbed-node-1] 2025-03-22 23:29:14.196201 | orchestrator | skipping: [testbed-node-2] 2025-03-22 23:29:14.196209 | orchestrator | 2025-03-22 23:29:14.196217 | orchestrator | RUNNING HANDLER [loadbalancer : Wait for haproxy to listen on VIP] ************* 2025-03-22 23:29:14.196225 | orchestrator | Saturday 22 March 2025 23:29:09 +0000 (0:00:00.488) 0:09:04.423 ******** 2025-03-22 23:29:14.196233 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:29:14.196241 | orchestrator | ok: [testbed-node-1] 2025-03-22 23:29:14.196249 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:29:14.196257 | orchestrator | 2025-03-22 23:29:14.196265 | orchestrator | RUNNING HANDLER [loadbalancer : Wait for proxysql to listen on VIP] ************ 2025-03-22 23:29:14.196273 | orchestrator | Saturday 22 March 2025 23:29:10 +0000 (0:00:01.446) 0:09:05.869 ******** 2025-03-22 23:29:14.196281 | orchestrator | ok: [testbed-node-0] 2025-03-22 23:29:14.196289 | orchestrator | ok: [testbed-node-1] 
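The two handlers above ("Wait for haproxy to listen on VIP" and "Wait for proxysql to listen on VIP") boil down to polling the virtual IP until the service port accepts TCP connections. The sketch below is illustrative only and is not kolla-ansible or osism code; the address 192.168.16.9 (assumed here to be the internal VIP, taken from the no_proxy entries logged earlier) and port 9091 (the prometheus_server port from the items above) are placeholder values for this testbed, and wait_for_listen is a hypothetical helper.

import socket
import time


def wait_for_listen(host: str, port: int, timeout: float = 300.0, interval: float = 1.0) -> bool:
    """Poll host:port until a TCP connection succeeds or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # A successful connect means the service is listening on the VIP.
            with socket.create_connection((host, port), timeout=interval):
                return True
        except OSError:
            # Not listening yet; retry after a short pause.
            time.sleep(interval)
    return False


# Illustrative usage with the assumed testbed values:
if wait_for_listen("192.168.16.9", 9091):
    print("service is answering on the VIP")
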
2025-03-22 23:29:14.196297 | orchestrator | ok: [testbed-node-2] 2025-03-22 23:29:14.196305 | orchestrator | 2025-03-22 23:29:14.196313 | orchestrator | PLAY RECAP ********************************************************************* 2025-03-22 23:29:14.196321 | orchestrator | testbed-node-0 : ok=127  changed=79  unreachable=0 failed=0 skipped=92  rescued=0 ignored=0 2025-03-22 23:29:14.196330 | orchestrator | testbed-node-1 : ok=126  changed=79  unreachable=0 failed=0 skipped=92  rescued=0 ignored=0 2025-03-22 23:29:14.196338 | orchestrator | testbed-node-2 : ok=126  changed=79  unreachable=0 failed=0 skipped=92  rescued=0 ignored=0 2025-03-22 23:29:14.196346 | orchestrator | 2025-03-22 23:29:14.196354 | orchestrator | 2025-03-22 23:29:14.196362 | orchestrator | TASKS RECAP ******************************************************************** 2025-03-22 23:29:14.196370 | orchestrator | Saturday 22 March 2025 23:29:12 +0000 (0:00:01.249) 0:09:07.118 ******** 2025-03-22 23:29:14.196378 | orchestrator | =============================================================================== 2025-03-22 23:29:14.196386 | orchestrator | haproxy-config : Copying over skyline haproxy config ------------------- 10.47s 2025-03-22 23:29:14.196394 | orchestrator | haproxy-config : Copying over glance haproxy config --------------------- 9.77s 2025-03-22 23:29:14.196402 | orchestrator | haproxy-config : Copying over heat haproxy config ----------------------- 8.64s 2025-03-22 23:29:14.196410 | orchestrator | haproxy-config : Configuring firewall for glance ------------------------ 8.51s 2025-03-22 23:29:14.196418 | orchestrator | haproxy-config : Copying over opensearch haproxy config ----------------- 8.36s 2025-03-22 23:29:14.196426 | orchestrator | loadbalancer : Start backup proxysql container -------------------------- 7.88s 2025-03-22 23:29:14.196434 | orchestrator | haproxy-config : Copying over aodh haproxy config ----------------------- 7.61s 2025-03-22 23:29:14.196442 | orchestrator | loadbalancer : Copying checks for services which are enabled ------------ 7.05s 2025-03-22 23:29:14.196450 | orchestrator | haproxy-config : Add configuration for glance when using single external frontend --- 6.91s 2025-03-22 23:29:14.196458 | orchestrator | haproxy-config : Copying over nova-cell:nova-novncproxy haproxy config --- 6.81s 2025-03-22 23:29:14.196466 | orchestrator | haproxy-config : Copying over nova haproxy config ----------------------- 6.78s 2025-03-22 23:29:14.196473 | orchestrator | loadbalancer : Start backup haproxy container --------------------------- 6.69s 2025-03-22 23:29:14.196481 | orchestrator | haproxy-config : Copying over neutron haproxy config -------------------- 6.43s 2025-03-22 23:29:14.196493 | orchestrator | loadbalancer : Ensuring proxysql service config subdirectories exist ---- 6.43s 2025-03-22 23:29:14.196503 | orchestrator | haproxy-config : Copying over barbican haproxy config ------------------- 6.36s 2025-03-22 23:29:17.235691 | orchestrator | haproxy-config : Copying over designate haproxy config ------------------ 6.22s 2025-03-22 23:29:17.235792 | orchestrator | loadbalancer : Copying over proxysql config ----------------------------- 6.05s 2025-03-22 23:29:17.235808 | orchestrator | haproxy-config : Copying over keystone haproxy config ------------------- 5.93s 2025-03-22 23:29:17.235822 | orchestrator | haproxy-config : Copying over magnum haproxy config --------------------- 5.93s 2025-03-22 23:29:17.235862 | orchestrator | haproxy-config : Copying over prometheus 
haproxy config ----------------- 5.89s 2025-03-22 23:29:17.235876 | orchestrator | 2025-03-22 23:29:14 | INFO  | Task 2af155cb-2cbc-4944-aa5d-8b90d5ba549f is in state STARTED 2025-03-22 23:29:17.235889 | orchestrator | 2025-03-22 23:29:14 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:29:17.235902 | orchestrator | 2025-03-22 23:29:14 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:29:17.235931 | orchestrator | 2025-03-22 23:29:17 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:29:17.236568 | orchestrator | 2025-03-22 23:29:17 | INFO  | Task a0acdba7-ece3-4524-8358-03c7dcbf9399 is in state STARTED 2025-03-22 23:29:17.237636 | orchestrator | 2025-03-22 23:29:17 | INFO  | Task 2af155cb-2cbc-4944-aa5d-8b90d5ba549f is in state STARTED 2025-03-22 23:29:17.239420 | orchestrator | 2025-03-22 23:29:17 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:29:20.291829 | orchestrator | 2025-03-22 23:29:17 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:29:20.291971 | orchestrator | 2025-03-22 23:29:20 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:29:20.300236 | orchestrator | 2025-03-22 23:29:20 | INFO  | Task a0acdba7-ece3-4524-8358-03c7dcbf9399 is in state STARTED 2025-03-22 23:29:20.300265 | orchestrator | 2025-03-22 23:29:20 | INFO  | Task 2af155cb-2cbc-4944-aa5d-8b90d5ba549f is in state STARTED 2025-03-22 23:29:20.300287 | orchestrator | 2025-03-22 23:29:20 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:29:23.354618 | orchestrator | 2025-03-22 23:29:20 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:29:23.354735 | orchestrator | 2025-03-22 23:29:23 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:29:23.358421 | orchestrator | 2025-03-22 23:29:23 | INFO  | Task a0acdba7-ece3-4524-8358-03c7dcbf9399 is in state STARTED 2025-03-22 23:29:23.361141 | orchestrator | 2025-03-22 23:29:23 | INFO  | Task 2af155cb-2cbc-4944-aa5d-8b90d5ba549f is in state STARTED 2025-03-22 23:29:23.361173 | orchestrator | 2025-03-22 23:29:23 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:29:23.361464 | orchestrator | 2025-03-22 23:29:23 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:29:26.426794 | orchestrator | 2025-03-22 23:29:26 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:29:26.427235 | orchestrator | 2025-03-22 23:29:26 | INFO  | Task a0acdba7-ece3-4524-8358-03c7dcbf9399 is in state STARTED 2025-03-22 23:29:26.428291 | orchestrator | 2025-03-22 23:29:26 | INFO  | Task 2af155cb-2cbc-4944-aa5d-8b90d5ba549f is in state STARTED 2025-03-22 23:29:26.429434 | orchestrator | 2025-03-22 23:29:26 | INFO  | Task 1624f071-cf95-4c64-9d5a-3c1b7bf4493d is in state STARTED 2025-03-22 23:29:29.478425 | orchestrator | 2025-03-22 23:29:26 | INFO  | Wait 1 second(s) until the next check 2025-03-22 23:29:29.478597 | orchestrator | 2025-03-22 23:29:29 | INFO  | Task dae9416f-7b86-4a73-a36d-e40c86da8e45 is in state STARTED 2025-03-22 23:29:29.483014 | orchestrator | 2025-03-22 23:29:29 | INFO  | Task a0acdba7-ece3-4524-8358-03c7dcbf9399 is in state STARTED 2025-03-22 23:29:29.483928 | orchestrator | 2025-03-22 23:29:29 | INFO  | Task 2af155cb-2cbc-4944-aa5d-8b90d5ba549f is in state STARTED 2025-03-22 23:29:29.483963 | orchestrator | 2025-03-22 23:29:29 | INFO  | Task 
[Repetitive polling output condensed: from 2025-03-22 23:29:29 through 2025-03-22 23:30:52 the deploy playbook logs the same four checks roughly every three seconds. Tasks dae9416f-7b86-4a73-a36d-e40c86da8e45, a0acdba7-ece3-4524-8358-03c7dcbf9399, 2af155cb-2cbc-4944-aa5d-8b90d5ba549f and 1624f071-cf95-4c64-9d5a-3c1b7bf4493d are each reported as "is in state STARTED", followed by "Wait 1 second(s) until the next check". None of the four tasks reaches a terminal state before the job timeout below.]
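The messages above come from a task-watching loop on the manager: the client asks for the state of each task, prints it, sleeps, and checks again. The actual implementation is part of the osism tooling; purely as an illustration, a loop of the following shape (with a hypothetical client.get_task_state call and Celery-style state names) produces exactly this output, and without an explicit deadline it only stops when something external, here the Zuul job timeout, ends the playbook:

    import time


    def wait_for_tasks(client, task_ids, interval=1, deadline=None):
        """Poll task states until every task reaches a terminal state.

        Illustrative sketch only: client.get_task_state is a hypothetical
        stand-in for the real API call, and SUCCESS/FAILURE are Celery-style
        terminal states.
        """
        pending = set(task_ids)
        started = time.monotonic()
        while pending:
            for task_id in sorted(pending):
                state = client.get_task_state(task_id)   # hypothetical call
                print(f"Task {task_id} is in state {state}")
                if state in ("SUCCESS", "FAILURE"):
                    pending.discard(task_id)
            if not pending:
                break
            if deadline is not None and time.monotonic() - started > deadline:
                raise TimeoutError(f"{len(pending)} task(s) did not finish in time")
            print(f"Wait {interval} second(s) until the next check")
            time.sleep(interval)

A deadline inside the loop would surface the stuck tasks as an error; in this job the loop has no such limit, so the run is ended from outside with RESULT_TIMED_OUT.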
2025-03-22 23:30:53.184098 | RUN END RESULT_TIMED_OUT: [untrusted : github.com/osism/testbed/playbooks/deploy.yml@main]
2025-03-22 23:30:53.188715 | POST-RUN START: [untrusted : github.com/osism/testbed/playbooks/post.yml@main]
2025-03-22 23:30:53.912793 |
2025-03-22 23:30:53.912955 | PLAY [Post output play]
2025-03-22 23:30:53.941949 |
2025-03-22 23:30:53.942084 | LOOP [stage-output : Register sources]
2025-03-22 23:30:54.021692 |
2025-03-22 23:30:54.021952 | TASK [stage-output : Check sudo]
2025-03-22 23:30:54.696522 | orchestrator | sudo: a password is required
2025-03-22 23:30:55.066055 | orchestrator | ok: Runtime: 0:00:00.014984
2025-03-22 23:30:55.083162 |
2025-03-22 23:30:55.083368 | LOOP [stage-output : Set source and destination for files and folders]
2025-03-22 23:30:55.125431 |
2025-03-22 23:30:55.125683 | TASK [stage-output : Build a list of source, dest dictionaries]
2025-03-22 23:30:55.208939 | orchestrator | ok
2025-03-22 23:30:55.217443 |
2025-03-22 23:30:55.217560 | LOOP [stage-output : Ensure target folders exist]
2025-03-22 23:30:55.707505 | orchestrator | ok: "docs"
2025-03-22 23:30:55.707901 |
2025-03-22 23:30:55.922326 | orchestrator | ok: "artifacts"
2025-03-22 23:30:56.169721 | orchestrator | ok: "logs"
2025-03-22 23:30:56.190361 |
2025-03-22 23:30:56.190528 | LOOP [stage-output : Copy files and folders to staging folder]
2025-03-22 23:30:56.231493 |
2025-03-22 23:30:56.231759 | TASK [stage-output : Make all log files readable]
2025-03-22 23:30:56.501761 | orchestrator | ok
2025-03-22 23:30:56.509635 |
2025-03-22 23:30:56.509750 | TASK [stage-output : Rename log files that match extensions_to_txt]
2025-03-22 23:30:56.555260 | orchestrator | skipping: Conditional result was False
2025-03-22 23:30:56.572601 |
2025-03-22 23:30:56.572780 | TASK [stage-output : Discover log files for compression]
2025-03-22 23:30:56.609265 | orchestrator | skipping: Conditional result was False
2025-03-22 23:30:56.627936 |
2025-03-22 23:30:56.628088 | LOOP [stage-output : Archive everything from logs]
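The stage-output tasks above only gather whatever the job wrote into its output directories into per-type staging folders (logs, artifacts, docs) on the node. The role itself is Ansible (stage-output from zuul-jobs); as a rough illustration of the copy step, under an assumed zuul-output/{logs,artifacts,docs} layout, it amounts to:

    import shutil
    from pathlib import Path

    # Illustrative only: the real logic is the stage-output role in zuul-jobs.
    # Assumed layout: the job writes to <source_root>/{logs,artifacts,docs}.
    def stage_outputs(source_root: Path, staging_root: Path) -> None:
        for folder in ("logs", "artifacts", "docs"):
            src = source_root / folder
            dst = staging_root / folder
            dst.mkdir(parents=True, exist_ok=True)      # "Ensure target folders exist"
            if not src.exists():
                continue
            for item in src.iterdir():                  # "Copy files and folders to staging folder"
                if item.is_dir():
                    shutil.copytree(item, dst / item.name, dirs_exist_ok=True)
                else:
                    shutil.copy2(item, dst / item.name)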
2025-03-22 23:30:56.706174 |
2025-03-22 23:30:56.706415 | PLAY [Post cleanup play]
2025-03-22 23:30:56.730345 |
2025-03-22 23:30:56.730458 | TASK [Set cloud fact (Zuul deployment)]
2025-03-22 23:30:56.799094 | orchestrator | ok
2025-03-22 23:30:56.810777 |
2025-03-22 23:30:56.810888 | TASK [Set cloud fact (local deployment)]
2025-03-22 23:30:56.848138 | orchestrator | skipping: Conditional result was False
2025-03-22 23:30:56.866341 |
2025-03-22 23:30:56.866499 | TASK [Clean the cloud environment]
2025-03-22 23:30:57.510253 | orchestrator | 2025-03-22 23:30:57 - clean up servers
2025-03-22 23:30:58.340283 | orchestrator | 2025-03-22 23:30:58 - testbed-manager
2025-03-22 23:30:58.434825 | orchestrator | 2025-03-22 23:30:58 - testbed-node-3
2025-03-22 23:30:58.527645 | orchestrator | 2025-03-22 23:30:58 - testbed-node-0
2025-03-22 23:30:58.618997 | orchestrator | 2025-03-22 23:30:58 - testbed-node-2
2025-03-22 23:30:58.705247 | orchestrator | 2025-03-22 23:30:58 - testbed-node-4
2025-03-22 23:30:58.816206 | orchestrator | 2025-03-22 23:30:58 - testbed-node-5
2025-03-22 23:30:58.908718 | orchestrator | 2025-03-22 23:30:58 - testbed-node-1
2025-03-22 23:30:59.000138 | orchestrator | 2025-03-22 23:30:58 - clean up keypairs
2025-03-22 23:30:59.018337 | orchestrator | 2025-03-22 23:30:59 - testbed
2025-03-22 23:30:59.053786 | orchestrator | 2025-03-22 23:30:59 - wait for servers to be gone
2025-03-22 23:31:12.526285 | orchestrator | 2025-03-22 23:31:12 - clean up ports
2025-03-22 23:31:12.717131 | orchestrator | 2025-03-22 23:31:12 - 008a2b55-616b-48c1-97c9-c2abcfcd2feb
2025-03-22 23:31:12.904476 | orchestrator | 2025-03-22 23:31:12 - 2aab26da-cbfa-478c-b70e-9d26083f2a32
2025-03-22 23:31:13.106961 | orchestrator | 2025-03-22 23:31:13 - 5bbd6255-0712-4443-b855-56173551c8a6
2025-03-22 23:31:13.347132 | orchestrator | 2025-03-22 23:31:13 - 8288d4ef-f8ca-45e5-9861-b259c653c846
2025-03-22 23:31:13.527078 | orchestrator | 2025-03-22 23:31:13 - 9a373260-f4a9-48a3-8e17-72a8fd156a46
2025-03-22 23:31:13.723184 | orchestrator | 2025-03-22 23:31:13 - dea9137d-e420-47df-b702-87c71c47428d
2025-03-22 23:31:14.049609 | orchestrator | 2025-03-22 23:31:14 - e8504ac8-6108-487e-9fab-d053bc6b73f7
2025-03-22 23:31:14.232688 | orchestrator | 2025-03-22 23:31:14 - clean up volumes
2025-03-22 23:31:14.380620 | orchestrator | 2025-03-22 23:31:14 - testbed-volume-3-node-base
2025-03-22 23:31:14.420697 | orchestrator | 2025-03-22 23:31:14 - testbed-volume-4-node-base
2025-03-22 23:31:14.458915 | orchestrator | 2025-03-22 23:31:14 - testbed-volume-5-node-base
2025-03-22 23:31:14.499823 | orchestrator | 2025-03-22 23:31:14 - testbed-volume-1-node-base
2025-03-22 23:31:14.541963 | orchestrator | 2025-03-22 23:31:14 - testbed-volume-0-node-base
2025-03-22 23:31:14.585396 | orchestrator | 2025-03-22 23:31:14 - testbed-volume-2-node-base
2025-03-22 23:31:14.631915 | orchestrator | 2025-03-22 23:31:14 - testbed-volume-manager-base
2025-03-22 23:31:14.673879 | orchestrator | 2025-03-22 23:31:14 - testbed-volume-9-node-3
2025-03-22 23:31:14.714788 | orchestrator | 2025-03-22 23:31:14 - testbed-volume-12-node-0
2025-03-22 23:31:14.755005 | orchestrator | 2025-03-22 23:31:14 - testbed-volume-16-node-4
2025-03-22 23:31:14.792922 | orchestrator | 2025-03-22 23:31:14 - testbed-volume-5-node-5
2025-03-22 23:31:14.833851 | orchestrator | 2025-03-22 23:31:14 - testbed-volume-10-node-4
2025-03-22 23:31:14.873591 | orchestrator | 2025-03-22 23:31:14 - testbed-volume-3-node-3
2025-03-22 23:31:14.912090 | orchestrator | 2025-03-22 23:31:14 - testbed-volume-7-node-1
2025-03-22 23:31:14.952252 | orchestrator | 2025-03-22 23:31:14 - testbed-volume-1-node-1
2025-03-22 23:31:14.989438 | orchestrator | 2025-03-22 23:31:14 - testbed-volume-14-node-2
2025-03-22 23:31:15.030489 | orchestrator | 2025-03-22 23:31:15 - testbed-volume-2-node-2
2025-03-22 23:31:15.068560 | orchestrator | 2025-03-22 23:31:15 - testbed-volume-15-node-3
2025-03-22 23:31:15.108885 | orchestrator | 2025-03-22 23:31:15 - testbed-volume-11-node-5
2025-03-22 23:31:15.147572 | orchestrator | 2025-03-22 23:31:15 - testbed-volume-4-node-4
2025-03-22 23:31:15.190662 | orchestrator | 2025-03-22 23:31:15 - testbed-volume-13-node-1
2025-03-22 23:31:15.230196 | orchestrator | 2025-03-22 23:31:15 - testbed-volume-6-node-0
2025-03-22 23:31:15.272398 | orchestrator | 2025-03-22 23:31:15 - testbed-volume-8-node-2
2025-03-22 23:31:15.320756 | orchestrator | 2025-03-22 23:31:15 - testbed-volume-0-node-0
2025-03-22 23:31:15.364489 | orchestrator | 2025-03-22 23:31:15 - testbed-volume-17-node-5
2025-03-22 23:31:15.409109 | orchestrator | 2025-03-22 23:31:15 - disconnect routers
2025-03-22 23:31:15.467413 | orchestrator | 2025-03-22 23:31:15 - testbed
2025-03-22 23:31:16.279340 | orchestrator | 2025-03-22 23:31:16 - clean up subnets
2025-03-22 23:31:16.313047 | orchestrator | 2025-03-22 23:31:16 - subnet-testbed-management
2025-03-22 23:31:16.443028 | orchestrator | 2025-03-22 23:31:16 - clean up networks
2025-03-22 23:31:16.591417 | orchestrator | 2025-03-22 23:31:16 - net-testbed-management
2025-03-22 23:31:16.832017 | orchestrator | 2025-03-22 23:31:16 - clean up security groups
2025-03-22 23:31:16.861610 | orchestrator | 2025-03-22 23:31:16 - testbed-node
2025-03-22 23:31:16.942747 | orchestrator | 2025-03-22 23:31:16 - testbed-management
2025-03-22 23:31:17.025066 | orchestrator | 2025-03-22 23:31:17 - clean up floating ips
2025-03-22 23:31:17.049677 | orchestrator | 2025-03-22 23:31:17 - 81.163.192.83
2025-03-22 23:31:17.424961 | orchestrator | 2025-03-22 23:31:17 - clean up routers
2025-03-22 23:31:17.470585 | orchestrator | 2025-03-22 23:31:17 - testbed
2025-03-22 23:31:18.424492 | orchestrator | changed
2025-03-22 23:31:18.468889 |
2025-03-22 23:31:18.468990 | PLAY RECAP
2025-03-22 23:31:18.469043 | orchestrator | ok: 6 changed: 2 unreachable: 0 failed: 0 skipped: 7 rescued: 0 ignored: 0
2025-03-22 23:31:18.469068 |
2025-03-22 23:31:18.589794 | POST-RUN END RESULT_NORMAL: [untrusted : github.com/osism/testbed/playbooks/post.yml@main]
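The "Clean the cloud environment" task above tears the testbed down in dependency order: servers first, then keypairs, ports, volumes, router interfaces, subnets, networks, security groups, floating IPs and finally the router. The actual cleanup script ships with the osism/testbed repository; purely as a sketch of that ordering with openstacksdk (the cloud name and the "testbed" name prefix are assumptions taken from the log):

    import openstack

    # Sketch only: the real cleanup script lives in the osism/testbed repository.
    conn = openstack.connect(cloud="testbed")   # assumed clouds.yaml entry
    prefix = "testbed"

    # Servers first, then wait until they are gone so ports and volumes are free.
    servers = [s for s in conn.compute.servers() if s.name.startswith(prefix)]
    for server in servers:
        conn.compute.delete_server(server)
    for server in servers:
        conn.compute.wait_for_delete(server)

    for keypair in conn.compute.keypairs():
        if keypair.name.startswith(prefix):
            conn.compute.delete_keypair(keypair)

    # The real script removes only the ports that belong to the testbed network
    # (the log deletes them by UUID); deleting all ports would be far too broad.
    for volume in conn.block_storage.volumes():
        if volume.name.startswith(f"{prefix}-volume"):
            conn.block_storage.delete_volume(volume)

    # Router interfaces have to be detached before subnets and networks can go.
    for router in conn.network.routers():
        if router.name != prefix:
            continue
        for subnet in conn.network.subnets():
            if subnet.name.startswith(f"subnet-{prefix}"):
                conn.network.remove_interface_from_router(router, subnet_id=subnet.id)
                conn.network.delete_subnet(subnet)

    for network in conn.network.networks():
        if network.name.startswith(f"net-{prefix}"):
            conn.network.delete_network(network)
    for group in conn.network.security_groups():
        if group.name.startswith(prefix):
            conn.network.delete_security_group(group)
    # The real script restricts this to the testbed's own floating IPs.
    for ip in conn.network.ips():
        conn.network.delete_ip(ip)
    for router in conn.network.routers():
        if router.name == prefix:
            conn.network.delete_router(router)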
2025-03-22 23:31:18.596154 | POST-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/post-fetch.yaml@main]
2025-03-22 23:31:19.285823 |
2025-03-22 23:31:19.285975 | PLAY [Base post-fetch]
2025-03-22 23:31:19.316716 |
2025-03-22 23:31:19.316863 | TASK [fetch-output : Set log path for multiple nodes]
2025-03-22 23:31:19.394548 | orchestrator | skipping: Conditional result was False
2025-03-22 23:31:19.409792 |
2025-03-22 23:31:19.409960 | TASK [fetch-output : Set log path for single node]
2025-03-22 23:31:19.472721 | orchestrator | ok
2025-03-22 23:31:19.482765 |
2025-03-22 23:31:19.482882 | LOOP [fetch-output : Ensure local output dirs]
2025-03-22 23:31:19.974929 | orchestrator -> localhost | ok: "/var/lib/zuul/builds/0ca07ad8f44c4220a2f2e2b88aef6038/work/logs"
2025-03-22 23:31:20.243798 | orchestrator -> localhost | changed: "/var/lib/zuul/builds/0ca07ad8f44c4220a2f2e2b88aef6038/work/artifacts"
2025-03-22 23:31:20.508211 | orchestrator -> localhost | changed: "/var/lib/zuul/builds/0ca07ad8f44c4220a2f2e2b88aef6038/work/docs"
2025-03-22 23:31:20.526482 |
2025-03-22 23:31:20.526599 | LOOP [fetch-output : Collect logs, artifacts and docs]
2025-03-22 23:31:21.323831 | orchestrator | changed: .d..t...... ./
2025-03-22 23:31:21.324178 | orchestrator | changed: All items complete
2025-03-22 23:31:21.324232 |
2025-03-22 23:31:21.901366 | orchestrator | changed: .d..t...... ./
2025-03-22 23:31:22.506025 | orchestrator | changed: .d..t...... ./
2025-03-22 23:31:22.538615 |
2025-03-22 23:31:22.538734 | LOOP [merge-output-to-logs : Move artifacts and docs to logs dir]
2025-03-22 23:31:22.584601 | orchestrator | skipping: Conditional result was False
2025-03-22 23:31:22.598629 | orchestrator | skipping: Conditional result was False
2025-03-22 23:31:22.653410 |
2025-03-22 23:31:22.653504 | PLAY RECAP
2025-03-22 23:31:22.653555 | orchestrator | ok: 3 changed: 2 unreachable: 0 failed: 0 skipped: 2 rescued: 0 ignored: 0
2025-03-22 23:31:22.653582 |
2025-03-22 23:31:22.773040 | POST-RUN END RESULT_NORMAL: [trusted : github.com/osism/zuul-config/playbooks/base/post-fetch.yaml@main]
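The "Collect logs, artifacts and docs" loop pulls each staged folder back into the executor's work directory; the ".d..t...... ./" lines are rsync's itemized-changes output, as produced by Ansible's synchronize module. Roughly, and with host, paths and key as placeholders rather than values taken from this job:

    import subprocess

    def fetch_output(node: str, remote_dir: str, local_dir: str, ssh_key: str) -> None:
        """Pull one staged output directory back to the executor (sketch only;
        the real role drives rsync via the synchronize module)."""
        subprocess.run(
            [
                "rsync", "-a", "-i",                 # -i prints the .d..t...... itemized changes
                "-e", f"ssh -i {ssh_key}",
                f"{node}:{remote_dir}/",
                f"{local_dir}/",
            ],
            check=True,
        )

    # e.g. fetch_output("orchestrator", "~/zuul-output/logs",
    #                   "/var/lib/zuul/builds/<build>/work/logs", "<build>_id_rsa")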
2025-03-22 23:31:22.783592 | POST-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/post.yaml@main]
2025-03-22 23:31:23.555037 |
2025-03-22 23:31:23.555277 | PLAY [Base post]
2025-03-22 23:31:23.595015 |
2025-03-22 23:31:23.595213 | TASK [remove-build-sshkey : Remove the build SSH key from all nodes]
2025-03-22 23:31:24.408335 | orchestrator | changed
2025-03-22 23:31:24.447248 |
2025-03-22 23:31:24.447382 | PLAY RECAP
2025-03-22 23:31:24.447489 | orchestrator | ok: 1 changed: 1 unreachable: 0 failed: 0 skipped: 0 rescued: 0 ignored: 0
2025-03-22 23:31:24.447566 |
2025-03-22 23:31:24.564504 | POST-RUN END RESULT_NORMAL: [trusted : github.com/osism/zuul-config/playbooks/base/post.yaml@main]
2025-03-22 23:31:24.569744 | POST-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/post-logs.yaml@main]
2025-03-22 23:31:25.320830 |
2025-03-22 23:31:25.320990 | PLAY [Base post-logs]
2025-03-22 23:31:25.337100 |
2025-03-22 23:31:25.337233 | TASK [generate-zuul-manifest : Generate Zuul manifest]
2025-03-22 23:31:25.850528 | localhost | changed
2025-03-22 23:31:25.857704 |
2025-03-22 23:31:25.857848 | TASK [generate-zuul-manifest : Return Zuul manifest URL to Zuul]
2025-03-22 23:31:25.889197 | localhost | ok
2025-03-22 23:31:25.900020 |
2025-03-22 23:31:25.900124 | TASK [Set zuul-log-path fact]
2025-03-22 23:31:25.930547 | localhost | ok
2025-03-22 23:31:25.946219 |
2025-03-22 23:31:25.946408 | TASK [set-zuul-log-path-fact : Set log path for a build]
2025-03-22 23:31:25.977596 | localhost | ok
2025-03-22 23:31:25.984998 |
2025-03-22 23:31:25.985112 | TASK [upload-logs : Create log directories]
2025-03-22 23:31:26.485280 | localhost | changed
2025-03-22 23:31:26.489999 |
2025-03-22 23:31:26.490109 | TASK [upload-logs : Ensure logs are readable before uploading]
2025-03-22 23:31:26.978568 | localhost -> localhost | ok: Runtime: 0:00:00.005542
2025-03-22 23:31:26.988013 |
2025-03-22 23:31:26.988159 | TASK [upload-logs : Upload logs to log server]
2025-03-22 23:31:27.580566 | localhost | Output suppressed because no_log was given
2025-03-22 23:31:27.584197 |
2025-03-22 23:31:27.584527 | LOOP [upload-logs : Compress console log and json output]
2025-03-22 23:31:27.655850 | localhost | skipping: Conditional result was False
2025-03-22 23:31:27.674157 | localhost | skipping: Conditional result was False
2025-03-22 23:31:27.691246 |
2025-03-22 23:31:27.691421 | LOOP [upload-logs : Upload compressed console log and json output]
2025-03-22 23:31:27.756635 | localhost | skipping: Conditional result was False
2025-03-22 23:31:27.756992 |
2025-03-22 23:31:27.777054 | localhost | skipping: Conditional result was False
2025-03-22 23:31:27.784991 |
2025-03-22 23:31:27.785155 | LOOP [upload-logs : Upload console log and json output]
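After the build key is removed, the post-logs playbook generates a manifest of the collected log tree (so the Zuul web UI can browse it) and uploads everything to the log server. The manifest format is defined by the generate-zuul-manifest role in zuul-jobs; very roughly, it is a JSON description of the directory tree, along the lines of this sketch (field names here are illustrative, not the role's exact schema):

    import json
    from pathlib import Path

    def build_manifest(log_root: Path) -> dict:
        """Rough sketch of a browsable log-tree manifest; the real structure is
        produced by the generate-zuul-manifest role in zuul-jobs."""
        def node(path: Path) -> dict:
            if path.is_dir():
                return {
                    "name": path.name,
                    "mimetype": "application/directory",
                    "children": [node(child) for child in sorted(path.iterdir())],
                }
            return {"name": path.name, "mimetype": "text/plain", "size": path.stat().st_size}

        return {"tree": [node(child) for child in sorted(log_root.iterdir())]}

    # print(json.dumps(build_manifest(Path("work/logs")), indent=2))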