Apr 17 14:18:17.653659 ip-10-0-132-119 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 14:18:17.653664 ip-10-0-132-119 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 14:18:17.653672 ip-10-0-132-119 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 14:18:17.653950 ip-10-0-132-119 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 14:18:27.716561 ip-10-0-132-119 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 14:18:27.716575 ip-10-0-132-119 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot ece3f4bb3e5c414e9639d7816e942508 --
Apr 17 14:20:44.759558 ip-10-0-132-119 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 14:20:45.156096 ip-10-0-132-119 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:20:45.156096 ip-10-0-132-119 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 14:20:45.156096 ip-10-0-132-119 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:20:45.156096 ip-10-0-132-119 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 14:20:45.156096 ip-10-0-132-119 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
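The five deprecation warnings above all point at the same fix: move the values off the kubelet command line and into the file passed via --config (which, per the FLAG dump further down, is /etc/kubernetes/kubelet.conf here). Below is a minimal sketch of the equivalent config stanza, emitted from Python for illustration; the KubeletConfiguration field names are assumptions based on the upstream kubelet.config.k8s.io/v1beta1 API, not something this log confirms.

import textwrap

# Sketch: a KubeletConfiguration stanza that would replace the deprecated
# flags warned about above. Field names are assumed from the upstream
# kubelet.config.k8s.io/v1beta1 API; values are copied from the FLAG dump
# later in this log. --minimum-container-ttl-duration has no direct field
# (the warning says to use eviction settings instead), and
# --pod-infra-container-image is going away entirely (the image GC will get
# the sandbox image from the CRI runtime, CRI-O here).
stanza = textwrap.dedent("""\
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    containerRuntimeEndpoint: unix:///var/run/crio/crio.sock  # unix:// scheme assumed from upstream docs; the flag above uses a bare path
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
    systemReserved:
      cpu: 500m
      ephemeral-storage: 1Gi
      memory: 1Gi
""")
print(stanza)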
Apr 17 14:20:45.157510 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.157425 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 14:20:45.160486 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160472 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:20:45.160486 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160486 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:20:45.160545 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160491 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:20:45.160545 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160494 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:20:45.160545 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160498 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:20:45.160545 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160501 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:20:45.160545 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160504 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:20:45.160545 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160508 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:20:45.160545 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160511 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:20:45.160545 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160514 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:20:45.160545 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160517 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:20:45.160545 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160520 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:20:45.160545 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160534 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:20:45.160545 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160537 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:20:45.160545 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160539 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:20:45.160545 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160542 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:20:45.160545 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160544 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:20:45.160545 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160547 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:20:45.160545 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160549 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:20:45.160545 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160552 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:20:45.160545 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160555 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:20:45.160992 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160558 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:20:45.160992 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160560 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:20:45.160992 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160563 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:20:45.160992 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160565 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:20:45.160992 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160568 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:20:45.160992 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160570 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:20:45.160992 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160573 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:20:45.160992 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160575 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:20:45.160992 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160577 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:20:45.160992 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160580 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:20:45.160992 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160582 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:20:45.160992 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160584 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:20:45.160992 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160587 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:20:45.160992 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160589 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:20:45.160992 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160591 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:20:45.160992 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160596 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:20:45.160992 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160599 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:20:45.160992 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160602 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:20:45.160992 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160604 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:20:45.161468 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160606 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:20:45.161468 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160608 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:20:45.161468 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160611 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:20:45.161468 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160613 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:20:45.161468 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160616 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:20:45.161468 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160618 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:20:45.161468 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160621 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:20:45.161468 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160623 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:20:45.161468 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160627 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:20:45.161468 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160630 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:20:45.161468 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160633 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:20:45.161468 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160636 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:20:45.161468 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160639 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:20:45.161468 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160641 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:20:45.161468 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160645 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:20:45.161468 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160648 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:20:45.161468 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160651 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:20:45.161468 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160653 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:20:45.161468 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160656 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:20:45.161468 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160659 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:20:45.161975 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160662 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:20:45.161975 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160664 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:20:45.161975 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160667 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:20:45.161975 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160670 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:20:45.161975 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160672 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:20:45.161975 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160675 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:20:45.161975 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160677 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:20:45.161975 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160679 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:20:45.161975 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160682 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:20:45.161975 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160685 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:20:45.161975 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160687 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:20:45.161975 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160690 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:20:45.161975 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160692 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:20:45.161975 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160695 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:20:45.161975 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160697 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:20:45.161975 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160700 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:20:45.161975 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160702 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:20:45.161975 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160705 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:20:45.161975 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160708 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:20:45.161975 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160710 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:20:45.162465 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160713 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:20:45.162465 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160715 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:20:45.162465 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160718 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:20:45.162465 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160720 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:20:45.162465 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160723 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:20:45.162465 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.160726 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:20:45.162465 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161099 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:20:45.162465 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161104 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:20:45.162465 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161107 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:20:45.162465 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161110 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:20:45.162465 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161115 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
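The wall of "unrecognized feature gate" warnings here is the kubelet's own feature_gate.go rejecting OpenShift-level gate names it does not know, and the same set is logged again in further runs below. When reading a dump like this one, collapsing the warnings to a unique, counted list is the quickest way to see what is actually being rejected. A small sketch, assuming the journal output has been saved to a file (the path is illustrative):

import re
from collections import Counter

# Collapse the repeated "unrecognized feature gate: <Name>" warnings into a
# unique, counted list. "kubelet-journal.log" is a hypothetical dump of the
# journal shown here (e.g. saved from journalctl -u kubelet).
pattern = re.compile(r"unrecognized feature gate: (\S+)")
with open("kubelet-journal.log") as fh:
    counts = Counter(pattern.findall(fh.read()))

for name, seen in sorted(counts.items()):
    print(f"{seen:3d}  {name}")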
Apr 17 14:20:45.162465 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161118 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:20:45.162465 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161121 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:20:45.162465 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161124 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:20:45.162465 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161127 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:20:45.162465 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161130 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:20:45.162465 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161133 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:20:45.162465 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161136 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:20:45.162465 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161139 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:20:45.162947 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161141 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:20:45.162947 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161144 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:20:45.162947 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161147 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:20:45.162947 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161149 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:20:45.162947 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161151 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:20:45.162947 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161154 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:20:45.162947 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161156 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:20:45.162947 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161159 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:20:45.162947 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161169 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:20:45.162947 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161172 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:20:45.162947 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161174 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:20:45.162947 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161178 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:20:45.162947 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161180 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:20:45.162947 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161183 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:20:45.162947 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161185 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:20:45.162947 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161188 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:20:45.162947 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161191 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:20:45.162947 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161193 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:20:45.162947 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161197 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:20:45.162947 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161199 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:20:45.163472 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161202 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:20:45.163472 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161205 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:20:45.163472 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161207 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:20:45.163472 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161210 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:20:45.163472 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161212 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:20:45.163472 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161215 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:20:45.163472 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161217 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:20:45.163472 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161220 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:20:45.163472 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161223 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:20:45.163472 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161226 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:20:45.163472 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161228 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:20:45.163472 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161231 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:20:45.163472 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161233 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:20:45.163472 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161236 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:20:45.163472 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161238 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:20:45.163472 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161241 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:20:45.163472 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161243 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:20:45.163472 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161245 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:20:45.163472 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161248 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:20:45.163472 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161250 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:20:45.163953 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161252 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:20:45.163953 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161255 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:20:45.163953 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161257 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:20:45.163953 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161260 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:20:45.163953 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161262 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:20:45.163953 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161265 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:20:45.163953 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161281 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:20:45.163953 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161285 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:20:45.163953 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161287 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:20:45.163953 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161291 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:20:45.163953 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161293 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:20:45.163953 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161296 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:20:45.163953 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161299 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:20:45.163953 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161302 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:20:45.163953 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161304 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:20:45.163953 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161308 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:20:45.163953 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161310 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:20:45.163953 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161312 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:20:45.163953 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161315 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:20:45.164424 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161318 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:20:45.164424 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161320 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:20:45.164424 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161323 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:20:45.164424 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161326 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:20:45.164424 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161329 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:20:45.164424 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161331 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:20:45.164424 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161334 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:20:45.164424 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161336 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:20:45.164424 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161338 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:20:45.164424 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161341 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:20:45.164424 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161343 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:20:45.164424 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161345 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:20:45.164424 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161348 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:20:45.164424 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.161350 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:20:45.164424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161421 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 14:20:45.164424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161430 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 14:20:45.164424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161439 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 14:20:45.164424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161445 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 14:20:45.164424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161457 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 14:20:45.164424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161460 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 14:20:45.164424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161466 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 14:20:45.164424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161470 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161473 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161476 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161480 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161483 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161486 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161489 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161492 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161495 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161497 2577 flags.go:64] FLAG: --cloud-config=""
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161500 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161503 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161507 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161510 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161513 2577 flags.go:64] FLAG: --config-dir=""
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161516 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161519 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161523 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161526 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161529 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161532 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161535 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161538 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161541 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161544 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 14:20:45.164938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161546 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161551 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161553 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161556 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161559 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161563 2577 flags.go:64] FLAG: --enable-server="true"
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161568 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161573 2577 flags.go:64] FLAG: --event-burst="100"
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161576 2577 flags.go:64] FLAG: --event-qps="50"
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161579 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161582 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161584 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161589 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161591 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161594 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161597 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161600 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161603 2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161606 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161608 2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161612 2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161614 2577 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161617 2577 flags.go:64] FLAG: --feature-gates=""
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161621 2577 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161624 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 14:20:45.165555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161627 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 14:20:45.166141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161630 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 14:20:45.166141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161634 2577 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 14:20:45.166141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161637 2577 flags.go:64] FLAG: --help="false"
Apr 17 14:20:45.166141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161640 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-132-119.ec2.internal"
Apr 17 14:20:45.166141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161643 2577 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 17 14:20:45.166141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161646 2577 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 17 14:20:45.166141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161648 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 17 14:20:45.166141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161652 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 17 14:20:45.166141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161655 2577 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 17 14:20:45.166141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161658 2577 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 17 14:20:45.166141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161661 2577 flags.go:64] FLAG: --image-service-endpoint=""
Apr 17 14:20:45.166141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161663 2577 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 17 14:20:45.166141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161668 2577 flags.go:64] FLAG: --kube-api-burst="100"
Apr 17 14:20:45.166141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161671 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 17 14:20:45.166141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161674 2577 flags.go:64] FLAG: --kube-api-qps="50"
Apr 17 14:20:45.166141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161677 2577 flags.go:64] FLAG: --kube-reserved=""
Apr 17 14:20:45.166141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161679 2577 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 17 14:20:45.166141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161682 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 17 14:20:45.166141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161685 2577 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 17 14:20:45.166141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161688 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 17 14:20:45.166141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161690 2577 flags.go:64] FLAG: --lock-file=""
Apr 17 14:20:45.166141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161693 2577 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 17 14:20:45.166141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161696 2577 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 17 14:20:45.166141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161699 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 17 14:20:45.166754 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161704 2577 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 17 14:20:45.166754 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161707 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 17 14:20:45.166754 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161710 2577 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 17 14:20:45.166754 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161713 2577 flags.go:64] FLAG: --logging-format="text"
Apr 17 14:20:45.166754 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161715 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 17 14:20:45.166754 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161718 2577 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 17 14:20:45.166754 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161721 2577 flags.go:64] FLAG: --manifest-url=""
Apr 17 14:20:45.166754 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161724 2577 flags.go:64] FLAG: --manifest-url-header=""
Apr 17 14:20:45.166754 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161728 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 17 14:20:45.166754 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161732 2577 flags.go:64] FLAG: --max-open-files="1000000"
Apr 17 14:20:45.166754 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161736 2577 flags.go:64] FLAG: --max-pods="110"
Apr 17 14:20:45.166754 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161739 2577 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 17 14:20:45.166754 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161741 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 17 14:20:45.166754 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161744 2577 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 17 14:20:45.166754 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161747 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 17 14:20:45.166754 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161750 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 17 14:20:45.166754 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161753 2577 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 17 14:20:45.166754 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161756 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 17 14:20:45.166754 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161763 2577 flags.go:64] FLAG: --node-status-max-images="50"
Apr 17 14:20:45.166754 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161766 2577 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 17 14:20:45.166754 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161770 2577 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 17 14:20:45.166754 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161774 2577 flags.go:64] FLAG: --pod-cidr=""
Apr 17 14:20:45.166754 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161777 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161782 2577 flags.go:64] FLAG: --pod-manifest-path=""
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161785 2577 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161788 2577 flags.go:64] FLAG: --pods-per-core="0"
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161791 2577 flags.go:64] FLAG: --port="10250"
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161794 2577 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161797 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-09061000cbe0d7dd8"
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161800 2577 flags.go:64] FLAG: --qos-reserved=""
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161803 2577 flags.go:64] FLAG: --read-only-port="10255"
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161805 2577 flags.go:64] FLAG: --register-node="true"
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161808 2577 flags.go:64] FLAG: --register-schedulable="true"
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161811 2577 flags.go:64] FLAG: --register-with-taints=""
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161817 2577 flags.go:64] FLAG: --registry-burst="10"
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161820 2577 flags.go:64] FLAG: --registry-qps="5"
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161822 2577 flags.go:64] FLAG: --reserved-cpus=""
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161825 2577 flags.go:64] FLAG: --reserved-memory=""
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161828 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161832 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161834 2577 flags.go:64] FLAG: --rotate-certificates="false"
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161838 2577 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161840 2577 flags.go:64] FLAG: --runonce="false"
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161843 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161846 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161849 2577 flags.go:64] FLAG: --seccomp-default="false"
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161852 2577 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161855 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 17 14:20:45.167307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161857 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161860 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161863 2577 flags.go:64] FLAG: --storage-driver-password="root"
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161866 2577 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161871 2577 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161874 2577 flags.go:64] FLAG: --storage-driver-user="root"
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161878 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161881 2577 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161883 2577 flags.go:64] FLAG: --system-cgroups=""
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161886 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161892 2577 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161895 2577 flags.go:64] FLAG: --tls-cert-file=""
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161898 2577 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161901 2577 flags.go:64] FLAG: --tls-min-version=""
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161904 2577 flags.go:64] FLAG: --tls-private-key-file=""
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161907 2577 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161909 2577 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161912 2577 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161916 2577 flags.go:64] FLAG: --v="2"
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161920 2577 flags.go:64] FLAG: --version="false"
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161925 2577 flags.go:64] FLAG: --vmodule=""
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161929 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.161932 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162017 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162022 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:20:45.168000 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162025 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:20:45.168783 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162028 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:20:45.168783 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162030 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:20:45.168783 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162033 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:20:45.168783 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162037 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
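The flags.go:64 lines above are a complete dump of the command line the kubelet actually parsed, one FLAG per entry; they confirm, for example, that --config points at /etc/kubernetes/kubelet.conf and that the deprecated flags from startup are still being passed. A sketch that folds the dump into a dictionary (handy for diffing the effective command lines of two nodes), under the same assumption of a saved journal file:

import re

# Fold the 'FLAG: --name="value"' dump above into a dict. The input path is
# illustrative, as in the earlier sketch.
flag_re = re.compile(r'FLAG: (--[\w-]+)="(.*?)"')
with open("kubelet-journal.log") as fh:
    flags = dict(flag_re.findall(fh.read()))

print(flags["--config"])                      # /etc/kubernetes/kubelet.conf
print(flags["--container-runtime-endpoint"])  # /var/run/crio/crio.sock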
Apr 17 14:20:45.168783 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162040 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:20:45.168783 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162043 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:20:45.168783 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162046 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:20:45.168783 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162049 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:20:45.168783 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162052 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:20:45.168783 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162054 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:20:45.168783 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162058 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:20:45.168783 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162061 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:20:45.168783 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162064 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:20:45.168783 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162071 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:20:45.168783 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162075 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:20:45.168783 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162078 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:20:45.168783 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162081 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:20:45.168783 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162083 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:20:45.168783 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162086 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:20:45.169618 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162088 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:20:45.169618 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162091 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:20:45.169618 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162093 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:20:45.169618 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162096 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:20:45.169618 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162098 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:20:45.169618 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162102 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:20:45.169618 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162105 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:20:45.169618 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162107 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:20:45.169618 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162109 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:20:45.169618 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162112 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:20:45.169618 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162115 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:20:45.169618 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162118 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:20:45.169618 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162120 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:20:45.169618 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162123 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:20:45.169618 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162125 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:20:45.169618 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162128 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:20:45.169618 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162130 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:20:45.169618 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162133 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:20:45.169618 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162135 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:20:45.169618 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162138 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:20:45.170124 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162140 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:20:45.170124 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162143 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:20:45.170124 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162145 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:20:45.170124 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162148 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:20:45.170124 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162151 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:20:45.170124 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162154 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:20:45.170124 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162156 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:20:45.170124 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162159 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:20:45.170124 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162161 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:20:45.170124 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162164 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:20:45.170124 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162166 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:20:45.170124 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162169 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:20:45.170124 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162171 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:20:45.170124 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162174 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:20:45.170124 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162176 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:20:45.170124 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162179 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:20:45.170124 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162181 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:20:45.170124 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162185 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:20:45.170124 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162188 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:20:45.170124 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162190 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:20:45.170661 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162192 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:20:45.170661 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162195 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:20:45.170661 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162197 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:20:45.170661 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162200 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:20:45.170661 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162202 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:20:45.170661 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162205 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:20:45.170661 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162207 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:20:45.170661 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162210 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:20:45.170661 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162212 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:20:45.170661 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162215 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:20:45.170661 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162217 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:20:45.170661 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162219 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:20:45.170661 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162222 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:20:45.170661 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162225 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:20:45.170661 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162227 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:20:45.170661 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162231 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:20:45.170661 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162234 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:20:45.170661 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162236 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:20:45.170661 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162238 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:20:45.170661 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162241 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:20:45.171189 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162244 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:20:45.171189 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162246 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:20:45.171189 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162248 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:20:45.171189 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.162251 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:20:45.171189 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.162954 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 14:20:45.171189 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.171064 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 14:20:45.171189 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.171079 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 14:20:45.171189 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171123 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:20:45.171189 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171128 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:20:45.171189 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171131 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:20:45.171189 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171134 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:20:45.171189 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171137 2577 feature_gate.go:328]
unrecognized feature gate: NoRegistryClusterOperations Apr 17 14:20:45.171189 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171139 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 17 14:20:45.171189 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171142 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 14:20:45.171189 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171145 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 14:20:45.171189 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171147 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 14:20:45.171685 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171150 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 14:20:45.171685 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171152 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 14:20:45.171685 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171156 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 14:20:45.171685 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171159 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 14:20:45.171685 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171162 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 14:20:45.171685 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171164 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 14:20:45.171685 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171167 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 14:20:45.171685 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171169 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 14:20:45.171685 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171172 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 14:20:45.171685 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171174 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 14:20:45.171685 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171176 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 14:20:45.171685 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171179 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 14:20:45.171685 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171181 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 14:20:45.171685 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171184 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 14:20:45.171685 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171187 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 14:20:45.171685 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171190 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 14:20:45.171685 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171192 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 14:20:45.171685 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171195 2577 
feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 14:20:45.171685 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171198 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 14:20:45.172184 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171201 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 14:20:45.172184 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171203 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 14:20:45.172184 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171206 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 14:20:45.172184 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171208 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 14:20:45.172184 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171211 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 14:20:45.172184 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171213 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 14:20:45.172184 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171215 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 14:20:45.172184 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171218 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 14:20:45.172184 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171221 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 14:20:45.172184 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171224 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 14:20:45.172184 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171226 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 14:20:45.172184 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171228 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 14:20:45.172184 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171231 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 14:20:45.172184 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171233 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 14:20:45.172184 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171236 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 14:20:45.172184 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171239 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 14:20:45.172184 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171242 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 14:20:45.172184 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171244 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 14:20:45.172184 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171246 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 14:20:45.172184 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171249 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 14:20:45.172712 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171251 2577 feature_gate.go:328] unrecognized feature gate: 
DualReplica Apr 17 14:20:45.172712 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171254 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 14:20:45.172712 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171256 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 14:20:45.172712 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171258 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 14:20:45.172712 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171261 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 14:20:45.172712 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171264 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 14:20:45.172712 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171266 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 14:20:45.172712 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171269 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 14:20:45.172712 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171285 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 14:20:45.172712 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171287 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 14:20:45.172712 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171290 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 14:20:45.172712 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171293 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 14:20:45.172712 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171295 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 14:20:45.172712 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171298 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 14:20:45.172712 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171300 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 14:20:45.172712 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171303 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 14:20:45.172712 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171305 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 14:20:45.172712 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171308 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 14:20:45.172712 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171310 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 14:20:45.172712 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171313 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 14:20:45.173288 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171315 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 14:20:45.173288 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171318 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 14:20:45.173288 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171320 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 14:20:45.173288 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171323 2577 feature_gate.go:328] unrecognized feature 
gate: BootImageSkewEnforcement Apr 17 14:20:45.173288 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171325 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 14:20:45.173288 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171328 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 14:20:45.173288 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171332 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 14:20:45.173288 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171337 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 14:20:45.173288 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171340 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 14:20:45.173288 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171343 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 14:20:45.173288 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171345 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 14:20:45.173288 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171348 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 14:20:45.173288 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171351 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 14:20:45.173288 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171353 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 14:20:45.173288 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171356 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 14:20:45.173288 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171358 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 14:20:45.173288 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171361 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 14:20:45.173288 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171364 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 14:20:45.173770 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.171369 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 14:20:45.173770 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171458 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 14:20:45.173770 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171463 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 14:20:45.173770 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171466 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 14:20:45.173770 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171469 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 14:20:45.173770 
ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171471 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 14:20:45.173770 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171474 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 14:20:45.173770 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171478 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 14:20:45.173770 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171482 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 14:20:45.173770 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171486 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 14:20:45.173770 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171489 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 14:20:45.173770 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171491 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 14:20:45.173770 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171494 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 14:20:45.173770 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171497 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 14:20:45.173770 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171499 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 14:20:45.174159 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171502 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 14:20:45.174159 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171505 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 14:20:45.174159 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171508 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 14:20:45.174159 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171510 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 14:20:45.174159 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171513 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 14:20:45.174159 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171516 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 14:20:45.174159 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171519 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 14:20:45.174159 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171522 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 14:20:45.174159 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171525 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 14:20:45.174159 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171527 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 14:20:45.174159 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171530 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 14:20:45.174159 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171533 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 
14:20:45.174159 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171535 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 14:20:45.174159 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171538 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 14:20:45.174159 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171540 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 14:20:45.174159 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171543 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 17 14:20:45.174159 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171545 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 14:20:45.174159 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171548 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 14:20:45.174159 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171550 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 14:20:45.174159 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171552 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 14:20:45.174672 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171555 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 14:20:45.174672 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171557 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 14:20:45.174672 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171560 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 14:20:45.174672 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171562 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 14:20:45.174672 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171564 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 14:20:45.174672 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171567 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 14:20:45.174672 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171569 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 14:20:45.174672 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171572 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 14:20:45.174672 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171574 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 14:20:45.174672 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171577 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 14:20:45.174672 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171579 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 14:20:45.174672 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171582 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 14:20:45.174672 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171584 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 14:20:45.174672 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171586 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 14:20:45.174672 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171589 
2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 14:20:45.174672 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171591 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 14:20:45.174672 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171594 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 14:20:45.174672 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171596 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 14:20:45.174672 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171599 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 14:20:45.174672 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171601 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 14:20:45.175147 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171603 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 14:20:45.175147 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171608 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 14:20:45.175147 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171611 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 14:20:45.175147 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171613 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 14:20:45.175147 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171616 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 14:20:45.175147 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171618 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 14:20:45.175147 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171621 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 14:20:45.175147 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171623 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 14:20:45.175147 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171626 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 14:20:45.175147 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171628 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 14:20:45.175147 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171630 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 14:20:45.175147 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171633 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 14:20:45.175147 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171635 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 14:20:45.175147 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171638 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 14:20:45.175147 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171640 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 14:20:45.175147 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171642 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 14:20:45.175147 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171645 2577 feature_gate.go:328] 
unrecognized feature gate: DualReplica Apr 17 14:20:45.175147 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171647 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 14:20:45.175147 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171650 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 14:20:45.175147 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171652 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 14:20:45.175727 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171655 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 14:20:45.175727 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171657 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 14:20:45.175727 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171660 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 14:20:45.175727 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171662 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 14:20:45.175727 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171665 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 14:20:45.175727 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171667 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 14:20:45.175727 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171669 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 14:20:45.175727 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171672 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 14:20:45.175727 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171675 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 14:20:45.175727 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171678 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 14:20:45.175727 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171680 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 14:20:45.175727 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:45.171683 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 14:20:45.175727 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.171688 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 14:20:45.175727 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.172434 2577 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 14:20:45.175727 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.174597 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 14:20:45.175727 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.175451 2577 server.go:1019] "Starting client certificate rotation" Apr 17 14:20:45.176106 ip-10-0-132-119 kubenswrapper[2577]: I0417 
14:20:45.175551 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 14:20:45.176106 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.175590 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 14:20:45.197802 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.197784 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 14:20:45.200399 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.200372 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 14:20:45.211345 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.211324 2577 log.go:25] "Validated CRI v1 runtime API" Apr 17 14:20:45.216582 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.216550 2577 log.go:25] "Validated CRI v1 image API" Apr 17 14:20:45.218733 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.218709 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 14:20:45.220851 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.220823 2577 fs.go:135] Filesystem UUIDs: map[5dd96f39-511d-4a5a-8dda-5f1c5f3898fe:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 892c4875-4ba7-4631-b071-b5d863e5316f:/dev/nvme0n1p3] Apr 17 14:20:45.220935 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.220849 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 14:20:45.227429 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.227411 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 14:20:45.227826 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.227714 2577 manager.go:217] Machine: {Timestamp:2026-04-17 14:20:45.225822284 +0000 UTC m=+0.361829822 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3201486 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2dcd3f1fd8c009840f6bf2a741b70a SystemUUID:ec2dcd3f-1fd8-c009-840f-6bf2a741b70a BootID:ece3f4bb-3e5c-414e-9639-d7816e942508 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] 
DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:7a:69:c0:16:8f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:7a:69:c0:16:8f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3e:ac:3c:a3:31:0d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 14:20:45.228441 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.228429 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 17 14:20:45.228537 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.228524 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 14:20:45.231247 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.231223 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 14:20:45.231396 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.231249 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-132-119.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 14:20:45.231452 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.231406 2577 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 14:20:45.231452 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.231415 2577 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 14:20:45.231452 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.231428 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 14:20:45.232354 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.232343 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 14:20:45.234286 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.234264 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 17 14:20:45.234428 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.234419 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 14:20:45.236334 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.236325 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 17 14:20:45.236369 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.236337 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 14:20:45.236369 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.236348 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 14:20:45.236369 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.236357 2577 kubelet.go:397] "Adding apiserver pod source" Apr 17 14:20:45.236369 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.236365 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 14:20:45.237217 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.237205 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 
14:20:45.237257 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.237223 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 14:20:45.240035 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.240018 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 14:20:45.241457 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.241444 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 14:20:45.243064 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.243052 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 14:20:45.243107 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.243069 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 14:20:45.243107 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.243075 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 14:20:45.243107 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.243081 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 14:20:45.243107 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.243086 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 14:20:45.243107 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.243092 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 14:20:45.243107 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.243098 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 14:20:45.243107 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.243103 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 14:20:45.243107 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.243110 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 14:20:45.243329 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.243116 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 14:20:45.243329 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.243125 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 14:20:45.243329 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.243133 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 14:20:45.243866 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.243856 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 14:20:45.243896 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.243867 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 14:20:45.247415 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.247401 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 14:20:45.247458 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.247445 2577 server.go:1295] "Started kubelet" Apr 17 14:20:45.247605 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.247535 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 14:20:45.247664 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.247592 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 14:20:45.247664 
ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.247657 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 14:20:45.248089 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.248048 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-119.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 14:20:45.248188 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:45.248164 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-119.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 14:20:45.248236 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:45.248186 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 14:20:45.248432 ip-10-0-132-119 systemd[1]: Started Kubernetes Kubelet. Apr 17 14:20:45.248548 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.248537 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 17 14:20:45.251218 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.251202 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 14:20:45.254708 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:45.253909 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-119.ec2.internal.18a72aceb1506b7e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-119.ec2.internal,UID:ip-10-0-132-119.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-119.ec2.internal,},FirstTimestamp:2026-04-17 14:20:45.247417214 +0000 UTC m=+0.383424754,LastTimestamp:2026-04-17 14:20:45.247417214 +0000 UTC m=+0.383424754,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-119.ec2.internal,}" Apr 17 14:20:45.256370 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.256351 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 14:20:45.256766 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:45.256746 2577 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 14:20:45.256854 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.256792 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 14:20:45.257380 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.257364 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 14:20:45.257455 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.257367 2577 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 14:20:45.257455 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.257408 2577 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 14:20:45.257558 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.257529 2577 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 14:20:45.257558 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.257537 2577 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 14:20:45.257642 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.257584 2577 factory.go:55] Registering systemd factory
Apr 17 14:20:45.257642 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.257614 2577 factory.go:223] Registration of the systemd container factory successfully
Apr 17 14:20:45.257642 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:45.257617 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-119.ec2.internal\" not found"
Apr 17 14:20:45.257915 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.257899 2577 factory.go:153] Registering CRI-O factory
Apr 17 14:20:45.257915 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.257914 2577 factory.go:223] Registration of the crio container factory successfully
Apr 17 14:20:45.258034 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.257961 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 14:20:45.258034 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.257980 2577 factory.go:103] Registering Raw factory
Apr 17 14:20:45.258034 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.257992 2577 manager.go:1196] Started watching for new ooms in manager
Apr 17 14:20:45.258553 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.258531 2577 manager.go:319] Starting recovery of all containers
Apr 17 14:20:45.260081 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:45.260052 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 17 14:20:45.260252 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:45.260213 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-132-119.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 17 14:20:45.269455 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.269286 2577 manager.go:324] Recovery completed
Apr 17 14:20:45.273341 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.273324 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jssk4"
Apr 17 14:20:45.273730 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.273719 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 14:20:45.276063 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.276051 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-119.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 14:20:45.276135 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.276075 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-119.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 14:20:45.276135 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.276085 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-119.ec2.internal" event="NodeHasSufficientPID"
Apr 17 14:20:45.276595 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.276581 2577 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 14:20:45.276595 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.276595 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 14:20:45.276698 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.276611 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 14:20:45.278217 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:45.278155 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-119.ec2.internal.18a72aceb30583eb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-119.ec2.internal,UID:ip-10-0-132-119.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-132-119.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-132-119.ec2.internal,},FirstTimestamp:2026-04-17 14:20:45.276062699 +0000 UTC m=+0.412070237,LastTimestamp:2026-04-17 14:20:45.276062699 +0000 UTC m=+0.412070237,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-119.ec2.internal,}"
Apr 17 14:20:45.280389 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.280373 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jssk4"
Apr 17 14:20:45.280463 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.280424 2577 policy_none.go:49] "None policy: Start"
Apr 17 14:20:45.280463 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.280442 2577 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 14:20:45.280463 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.280455 2577 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 14:20:45.313842 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.313825 2577 manager.go:341] "Starting Device Plugin manager"
Apr 17 14:20:45.340715 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:45.313856 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 14:20:45.340715 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.313865 2577 server.go:85] "Starting device plugin registration server"
Apr 17 14:20:45.340715 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.314088 2577 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 14:20:45.340715 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.314131 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 14:20:45.340715 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.314292 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 14:20:45.340715 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.314395 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 14:20:45.340715 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.314407 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 14:20:45.340715 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:45.314778 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 14:20:45.340715 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:45.314816 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-119.ec2.internal\" not found"
Apr 17 14:20:45.366406 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.366363 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 14:20:45.367449 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.367432 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 14:20:45.367515 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.367457 2577 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 14:20:45.367550 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.367518 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 14:20:45.367550 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.367525 2577 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 14:20:45.367643 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:45.367560 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 14:20:45.370164 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.370144 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 14:20:45.414517 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.414477 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 14:20:45.415780 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.415756 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-119.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 14:20:45.415862 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.415785 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-119.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 14:20:45.415862 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.415795 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-119.ec2.internal" event="NodeHasSufficientPID"
Apr 17 14:20:45.415862 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.415826 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-119.ec2.internal"
Apr 17 14:20:45.426073 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.426052 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-119.ec2.internal"
Apr 17 14:20:45.426073 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:45.426074 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-119.ec2.internal\": node \"ip-10-0-132-119.ec2.internal\" not found"
Apr 17 14:20:45.440642 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:45.440625 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-119.ec2.internal\" not found"
Apr 17 14:20:45.468599 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.468576 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-119.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-119.ec2.internal"]
Apr 17 14:20:45.468697 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.468656 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 14:20:45.469517 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.469498 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-119.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 14:20:45.469594 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.469530 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-119.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 14:20:45.469594 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.469540 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-119.ec2.internal" event="NodeHasSufficientPID"
Apr 17 14:20:45.471907 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.471895 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 14:20:45.472049 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.472036 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-119.ec2.internal"
Apr 17 14:20:45.472092 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.472070 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 14:20:45.472663 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.472648 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-119.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 14:20:45.472723 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.472678 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-119.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 14:20:45.472723 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.472689 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-119.ec2.internal" event="NodeHasSufficientPID"
Apr 17 14:20:45.472723 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.472650 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-119.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 14:20:45.472805 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.472752 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-119.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 14:20:45.472805 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.472765 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-119.ec2.internal" event="NodeHasSufficientPID"
Apr 17 14:20:45.474997 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.474976 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-119.ec2.internal"
Apr 17 14:20:45.474997 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.474999 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 14:20:45.475607 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.475594 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-119.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 14:20:45.475669 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.475617 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-119.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 14:20:45.475669 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.475627 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-119.ec2.internal" event="NodeHasSufficientPID"
Apr 17 14:20:45.508909 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:45.508890 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-119.ec2.internal\" not found" node="ip-10-0-132-119.ec2.internal"
Apr 17 14:20:45.513127 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:45.513113 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-119.ec2.internal\" not found" node="ip-10-0-132-119.ec2.internal"
Apr 17 14:20:45.540821 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:45.540804 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-119.ec2.internal\" not found"
Apr 17 14:20:45.559119 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.559098 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f246f22bb4cc0196c51370027afd49f2-config\") pod \"kube-apiserver-proxy-ip-10-0-132-119.ec2.internal\" (UID: \"f246f22bb4cc0196c51370027afd49f2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-119.ec2.internal"
Apr 17 14:20:45.559183 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.559126 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e3b19c5aa18b811bd305847b1c45b39d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-119.ec2.internal\" (UID: \"e3b19c5aa18b811bd305847b1c45b39d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-119.ec2.internal"
Apr 17 14:20:45.559183 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.559145 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3b19c5aa18b811bd305847b1c45b39d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-119.ec2.internal\" (UID: \"e3b19c5aa18b811bd305847b1c45b39d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-119.ec2.internal"
Apr 17 14:20:45.641263 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:45.641234 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-119.ec2.internal\" not found"
Apr 17 14:20:45.659593 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.659560 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e3b19c5aa18b811bd305847b1c45b39d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-119.ec2.internal\" (UID: \"e3b19c5aa18b811bd305847b1c45b39d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-119.ec2.internal"
Apr 17 14:20:45.659668 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.659602 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3b19c5aa18b811bd305847b1c45b39d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-119.ec2.internal\" (UID: \"e3b19c5aa18b811bd305847b1c45b39d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-119.ec2.internal"
Apr 17 14:20:45.659668 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.659624 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f246f22bb4cc0196c51370027afd49f2-config\") pod \"kube-apiserver-proxy-ip-10-0-132-119.ec2.internal\" (UID: \"f246f22bb4cc0196c51370027afd49f2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-119.ec2.internal"
Apr 17 14:20:45.659668 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.659664 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f246f22bb4cc0196c51370027afd49f2-config\") pod \"kube-apiserver-proxy-ip-10-0-132-119.ec2.internal\" (UID: \"f246f22bb4cc0196c51370027afd49f2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-119.ec2.internal"
Apr 17 14:20:45.659751 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.659660 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3b19c5aa18b811bd305847b1c45b39d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-119.ec2.internal\" (UID: \"e3b19c5aa18b811bd305847b1c45b39d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-119.ec2.internal"
Apr 17 14:20:45.659751 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.659660 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e3b19c5aa18b811bd305847b1c45b39d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-119.ec2.internal\" (UID: \"e3b19c5aa18b811bd305847b1c45b39d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-119.ec2.internal"
Apr 17 14:20:45.742021 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:45.741948 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-119.ec2.internal\" not found"
Apr 17 14:20:45.811520 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.811482 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-119.ec2.internal"
Apr 17 14:20:45.816439 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:45.816417 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-119.ec2.internal"
Apr 17 14:20:45.842622 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:45.842597 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-119.ec2.internal\" not found"
Apr 17 14:20:45.943096 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:45.943064 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-119.ec2.internal\" not found"
Apr 17 14:20:46.043696 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:46.043624 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-119.ec2.internal\" not found"
Apr 17 14:20:46.144112 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:46.144085 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-119.ec2.internal\" not found"
Apr 17 14:20:46.175578 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:46.175554 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 14:20:46.176031 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:46.175691 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 14:20:46.245161 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:46.245135 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-119.ec2.internal\" not found"
Apr 17 14:20:46.256468 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:46.256444 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 14:20:46.271715 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:46.271693 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 14:20:46.283466 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:46.283443 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 14:15:45 +0000 UTC" deadline="2027-12-16 19:22:51.155182894 +0000 UTC"
Apr 17 14:20:46.283466 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:46.283464 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14597h2m4.871721842s"
Apr 17 14:20:46.293990 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:46.293930 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8df49"
Apr 17 14:20:46.301197 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:46.301180 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8df49"
Apr 17 14:20:46.346008 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:46.345985 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-119.ec2.internal\" not found"
Apr 17 14:20:46.446365 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:46.446335 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-119.ec2.internal\" not found"
Apr 17 14:20:46.546881 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:46.546801 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-119.ec2.internal\" not found"
Apr 17 14:20:46.602586 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:46.602562 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 14:20:46.636171 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:46.636137 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf246f22bb4cc0196c51370027afd49f2.slice/crio-bdcc3d02b33d16839d3525cd2446d9efe837a233a75559196762d22accda30d9 WatchSource:0}: Error finding container bdcc3d02b33d16839d3525cd2446d9efe837a233a75559196762d22accda30d9: Status 404 returned error can't find the container with id bdcc3d02b33d16839d3525cd2446d9efe837a233a75559196762d22accda30d9
Apr 17 14:20:46.636488 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:46.636467 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3b19c5aa18b811bd305847b1c45b39d.slice/crio-849bc663097117140a9077476bf643e48f5b053351e8dda3c81bc821ad1e97b0 WatchSource:0}: Error finding container 849bc663097117140a9077476bf643e48f5b053351e8dda3c81bc821ad1e97b0: Status 404 returned error can't find the container with id 849bc663097117140a9077476bf643e48f5b053351e8dda3c81bc821ad1e97b0
Apr 17 14:20:46.641265 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:46.641249 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 14:20:46.657440 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:46.657421 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-119.ec2.internal"
Apr 17 14:20:46.667302 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:46.667284 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 14:20:46.668673 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:46.668661 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-119.ec2.internal"
Apr 17 14:20:46.679720 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:46.679706 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 14:20:46.727847 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:46.727823 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 14:20:46.782838 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:46.782814 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 14:20:47.237980 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.237943 2577 apiserver.go:52] "Watching apiserver"
Apr 17 14:20:47.247631 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.247467 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 14:20:47.248787 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.248696 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-nz4xc","kube-system/kube-apiserver-proxy-ip-10-0-132-119.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh","openshift-cluster-node-tuning-operator/tuned-r8vfn","openshift-dns/node-resolver-gqkmm","openshift-image-registry/node-ca-d8pjn","openshift-multus/multus-s6jws","openshift-multus/network-metrics-daemon-9qc7k","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-119.ec2.internal","openshift-multus/multus-additional-cni-plugins-25j4z","openshift-network-diagnostics/network-check-target-xxhjb","openshift-network-operator/iptables-alerter-8mjfq","openshift-ovn-kubernetes/ovnkube-node-zns76"]
Apr 17 14:20:47.251901 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.251819 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.255224 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.254926 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 14:20:47.255224 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.255074 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-q5qqg\""
Apr 17 14:20:47.255224 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.255094 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 14:20:47.255224 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.255128 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 14:20:47.255518 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.255366 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 14:20:47.256493 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.256023 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.258750 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.258633 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh"
Apr 17 14:20:47.259683 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.259581 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 14:20:47.259789 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.259744 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 14:20:47.261187 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.260927 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gqkmm"
Apr 17 14:20:47.263526 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.263506 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nz4xc"
Apr 17 14:20:47.266015 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.265727 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 14:20:47.266015 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.265838 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 14:20:47.266015 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.265838 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 14:20:47.266015 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.265888 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 14:20:47.266015 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.265915 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-5nmjv\""
Apr 17 14:20:47.266015 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.266011 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-v2v5k\""
Apr 17 14:20:47.266828 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.266625 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-d8pjn"
Apr 17 14:20:47.266828 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.266741 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qc7k"
Apr 17 14:20:47.266959 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:47.266877 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9qc7k" podUID="3042fc33-2fc3-4d3d-a248-3855f7eb3a6a"
Apr 17 14:20:47.267870 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.267850 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-shpd6\""
Apr 17 14:20:47.268061 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268045 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 14:20:47.268376 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268360 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-fnm6z\""
Apr 17 14:20:47.268452 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268384 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-run\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.268452 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268406 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 14:20:47.268452 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268412 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-host\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.268452 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268439 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z77g\" (UniqueName: \"kubernetes.io/projected/29001604-5bfa-4625-8130-d60e9a67c29f-kube-api-access-5z77g\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.268642 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268465 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-host-run-netns\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.268642 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268488 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-host-var-lib-cni-multus\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.268642 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268510 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-etc-kubernetes\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.268642 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268531 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/29001604-5bfa-4625-8130-d60e9a67c29f-etc-tuned\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.268642 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268553 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2bv7\" (UniqueName: \"kubernetes.io/projected/565f7614-6003-428c-a0bd-ff0f395baa33-kube-api-access-r2bv7\") pod \"node-resolver-gqkmm\" (UID: \"565f7614-6003-428c-a0bd-ff0f395baa33\") " pod="openshift-dns/node-resolver-gqkmm"
Apr 17 14:20:47.268642 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268577 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-etc-modprobe-d\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.268642 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268598 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-host-var-lib-cni-bin\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.268642 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268620 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f5eccb1a-2f8e-47ca-8e8e-327f03634196-sys-fs\") pod \"aws-ebs-csi-driver-node-pvkwh\" (UID: \"f5eccb1a-2f8e-47ca-8e8e-327f03634196\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh"
Apr 17 14:20:47.268642 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268641 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-system-cni-dir\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.269047 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268682 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 14:20:47.269047 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268684 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-cni-binary-copy\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.269047 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268760 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-hostroot\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.269047 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268781 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99jz7\" (UniqueName: \"kubernetes.io/projected/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-kube-api-access-99jz7\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.269047 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268802 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-etc-sysctl-conf\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.269047 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268823 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/565f7614-6003-428c-a0bd-ff0f395baa33-tmp-dir\") pod \"node-resolver-gqkmm\" (UID: \"565f7614-6003-428c-a0bd-ff0f395baa33\") " pod="openshift-dns/node-resolver-gqkmm"
Apr 17 14:20:47.269047 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268843 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-cnibin\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.269047 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268864 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-multus-conf-dir\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.269047 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268885 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-multus-daemon-config\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.269047 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268906 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-sys\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.269047 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268924 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/565f7614-6003-428c-a0bd-ff0f395baa33-hosts-file\") pod \"node-resolver-gqkmm\" (UID: \"565f7614-6003-428c-a0bd-ff0f395baa33\") " pod="openshift-dns/node-resolver-gqkmm"
Apr 17 14:20:47.269047 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.268973 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-multus-socket-dir-parent\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.269047 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.269014 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-host-run-multus-certs\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.269047 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.269046 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgsgn\" (UniqueName: \"kubernetes.io/projected/f5eccb1a-2f8e-47ca-8e8e-327f03634196-kube-api-access-dgsgn\") pod \"aws-ebs-csi-driver-node-pvkwh\" (UID: \"f5eccb1a-2f8e-47ca-8e8e-327f03634196\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh"
Apr 17 14:20:47.269730 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.269050 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 14:20:47.269730 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.269070 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/29001604-5bfa-4625-8130-d60e9a67c29f-tmp\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.269730 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.269095 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f5eccb1a-2f8e-47ca-8e8e-327f03634196-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pvkwh\" (UID: \"f5eccb1a-2f8e-47ca-8e8e-327f03634196\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh"
Apr 17 14:20:47.269730 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.269126 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f5eccb1a-2f8e-47ca-8e8e-327f03634196-socket-dir\") pod \"aws-ebs-csi-driver-node-pvkwh\" (UID: \"f5eccb1a-2f8e-47ca-8e8e-327f03634196\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh"
Apr 17 14:20:47.269730 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.269156 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f5eccb1a-2f8e-47ca-8e8e-327f03634196-registration-dir\") pod \"aws-ebs-csi-driver-node-pvkwh\" (UID: \"f5eccb1a-2f8e-47ca-8e8e-327f03634196\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh"
Apr 17 14:20:47.269730 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.269178 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f5eccb1a-2f8e-47ca-8e8e-327f03634196-device-dir\") pod \"aws-ebs-csi-driver-node-pvkwh\" (UID: \"f5eccb1a-2f8e-47ca-8e8e-327f03634196\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh"
Apr 17 14:20:47.269730 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.269198 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f5eccb1a-2f8e-47ca-8e8e-327f03634196-etc-selinux\") pod \"aws-ebs-csi-driver-node-pvkwh\" (UID: \"f5eccb1a-2f8e-47ca-8e8e-327f03634196\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh"
Apr 17 14:20:47.269730 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.269226 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-etc-systemd\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.269730 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.269253 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-host-run-k8s-cni-cncf-io\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.269730 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.269298 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-etc-kubernetes\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.269730 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.269326 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-etc-sysconfig\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.269730 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.269346 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-lib-modules\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.269730 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.269368 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-var-lib-kubelet\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.269730 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.269392 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-multus-cni-dir\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.269730 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.269419 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-os-release\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.269730 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.269442 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-host-var-lib-kubelet\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.269730 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.269470 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-etc-sysctl-d\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.269730 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.269687 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 14:20:47.270562 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.269795 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-nkhvw\""
Apr 17 14:20:47.270562 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.269863 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 14:20:47.270562 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.270375 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-25j4z"
Apr 17 14:20:47.272539 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.272520 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xxhjb"
Apr 17 14:20:47.272632 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:47.272589 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xxhjb" podUID="d5fae135-d20e-469d-bb28-4d7236b5f86a"
Apr 17 14:20:47.273826 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.273701 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 14:20:47.273922 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.273906 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 14:20:47.274556 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.274538 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-6hqpg\""
Apr 17 14:20:47.275918 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.275892 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8mjfq"
Apr 17 14:20:47.278417 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.278381 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 14:20:47.278417 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.278400 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.278608 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.278591 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 14:20:47.278778 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.278761 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 14:20:47.278963 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.278947 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-j7vwj\""
Apr 17 14:20:47.283313 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.283290 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 14:20:47.283616 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.283600 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 14:20:47.283826 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.283810 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 14:20:47.283895 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.283848 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 14:20:47.283994 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.283979 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 14:20:47.284046 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.284030 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 14:20:47.284204 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.284189 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-b5swr\""
Apr 17 14:20:47.303557 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.303529 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 14:15:46 +0000 UTC" deadline="2028-01-03 08:47:14.989494622 +0000 UTC"
Apr 17 14:20:47.303557 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.303557 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15018h26m27.685940598s"
Apr 17 14:20:47.359101 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.359079 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 14:20:47.370177 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370151 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-cnibin\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.370314 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370191 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-sys\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.370314 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370217 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/565f7614-6003-428c-a0bd-ff0f395baa33-hosts-file\") pod \"node-resolver-gqkmm\" (UID: \"565f7614-6003-428c-a0bd-ff0f395baa33\") " pod="openshift-dns/node-resolver-gqkmm"
Apr 17 14:20:47.370314 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370230 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-cnibin\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.370314 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370247 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-host-cni-netd\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.370314 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370287 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26b0d440-2cba-4402-b258-ba4b4ac2f7dd-host\") pod \"node-ca-d8pjn\" (UID: \"26b0d440-2cba-4402-b258-ba4b4ac2f7dd\") " pod="openshift-image-registry/node-ca-d8pjn"
Apr 17 14:20:47.370314 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370314 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-sys\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.370617 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370341 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-multus-socket-dir-parent\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.370617 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370359 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/565f7614-6003-428c-a0bd-ff0f395baa33-hosts-file\") pod \"node-resolver-gqkmm\" (UID: \"565f7614-6003-428c-a0bd-ff0f395baa33\") " pod="openshift-dns/node-resolver-gqkmm"
Apr 17 14:20:47.370617 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370382 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-host-run-multus-certs\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.370617 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370407 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-multus-socket-dir-parent\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.370617 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370408 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgsgn\" (UniqueName: \"kubernetes.io/projected/f5eccb1a-2f8e-47ca-8e8e-327f03634196-kube-api-access-dgsgn\") pod \"aws-ebs-csi-driver-node-pvkwh\" (UID: \"f5eccb1a-2f8e-47ca-8e8e-327f03634196\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh"
Apr 17 14:20:47.370617 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370446 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/29001604-5bfa-4625-8130-d60e9a67c29f-tmp\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.370617 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370464 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-host-run-multus-certs\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.370617 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370485 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e40076ff-ba56-43e8-88a4-8c25998b6668-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-25j4z\" (UID: \"e40076ff-ba56-43e8-88a4-8c25998b6668\") " pod="openshift-multus/multus-additional-cni-plugins-25j4z"
Apr 17 14:20:47.370617 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370555 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-host-slash\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.370617 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370594 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f5eccb1a-2f8e-47ca-8e8e-327f03634196-socket-dir\") pod \"aws-ebs-csi-driver-node-pvkwh\" (UID: \"f5eccb1a-2f8e-47ca-8e8e-327f03634196\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh"
Apr 17 14:20:47.370617 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370617 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f5eccb1a-2f8e-47ca-8e8e-327f03634196-registration-dir\") pod \"aws-ebs-csi-driver-node-pvkwh\" (UID: \"f5eccb1a-2f8e-47ca-8e8e-327f03634196\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh"
Apr 17 14:20:47.371105 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370664 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f5eccb1a-2f8e-47ca-8e8e-327f03634196-registration-dir\") pod \"aws-ebs-csi-driver-node-pvkwh\" (UID: \"f5eccb1a-2f8e-47ca-8e8e-327f03634196\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh"
Apr 17 14:20:47.371105 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370731 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f5eccb1a-2f8e-47ca-8e8e-327f03634196-socket-dir\") pod \"aws-ebs-csi-driver-node-pvkwh\" (UID: \"f5eccb1a-2f8e-47ca-8e8e-327f03634196\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh"
Apr 17 14:20:47.371105 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370767 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f5eccb1a-2f8e-47ca-8e8e-327f03634196-device-dir\") pod \"aws-ebs-csi-driver-node-pvkwh\" (UID: \"f5eccb1a-2f8e-47ca-8e8e-327f03634196\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh"
Apr 17 14:20:47.371105 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370807 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e40076ff-ba56-43e8-88a4-8c25998b6668-tuning-conf-dir\") pod \"multus-additional-cni-plugins-25j4z\" (UID: \"e40076ff-ba56-43e8-88a4-8c25998b6668\") " pod="openshift-multus/multus-additional-cni-plugins-25j4z"
Apr 17 14:20:47.371105 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370824 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-host-kubelet\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.371105 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370841 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f5eccb1a-2f8e-47ca-8e8e-327f03634196-device-dir\") pod \"aws-ebs-csi-driver-node-pvkwh\" (UID: \"f5eccb1a-2f8e-47ca-8e8e-327f03634196\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh"
Apr 17 14:20:47.371105 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370864 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-etc-kubernetes\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.371105 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370877 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 14:20:47.371105 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370893 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-etc-sysconfig\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.371105 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370899 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-etc-kubernetes\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.371105 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370920 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e40076ff-ba56-43e8-88a4-8c25998b6668-system-cni-dir\") pod \"multus-additional-cni-plugins-25j4z\" (UID: \"e40076ff-ba56-43e8-88a4-8c25998b6668\") " pod="openshift-multus/multus-additional-cni-plugins-25j4z"
Apr 17 14:20:47.371105 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370945 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-etc-sysconfig\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.371105 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370946 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d030cbc2-cc1b-40c2-8101-cd9ed0460d1e-agent-certs\") pod \"konnectivity-agent-nz4xc\" (UID: \"d030cbc2-cc1b-40c2-8101-cd9ed0460d1e\") " pod="kube-system/konnectivity-agent-nz4xc"
Apr 17 14:20:47.371105 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.370972 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-os-release\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.371105 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371003 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-host-var-lib-kubelet\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.371105 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371029 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/419f4063-d963-4db2-b104-57f7214aaee2-iptables-alerter-script\") pod \"iptables-alerter-8mjfq\" (UID: \"419f4063-d963-4db2-b104-57f7214aaee2\") " pod="openshift-network-operator/iptables-alerter-8mjfq"
Apr 17 14:20:47.371105 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371038 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-os-release\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.371861 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371055 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.371861 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371075 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-host-var-lib-kubelet\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.371861 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371090 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/25426a81-851e-4273-9383-f90518fec0e7-ovnkube-config\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.371861 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371126 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2bv7\" (UniqueName: \"kubernetes.io/projected/565f7614-6003-428c-a0bd-ff0f395baa33-kube-api-access-r2bv7\") pod \"node-resolver-gqkmm\" (UID: \"565f7614-6003-428c-a0bd-ff0f395baa33\") " pod="openshift-dns/node-resolver-gqkmm"
Apr 17 14:20:47.371861 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371155 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-node-log\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.371861 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371185 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-host-run-ovn-kubernetes\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.371861 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371213 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/25426a81-851e-4273-9383-f90518fec0e7-ovn-node-metrics-cert\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.371861 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371251 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/25426a81-851e-4273-9383-f90518fec0e7-ovnkube-script-lib\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.371861 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371308 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs\") pod \"network-metrics-daemon-9qc7k\" (UID: \"3042fc33-2fc3-4d3d-a248-3855f7eb3a6a\") " pod="openshift-multus/network-metrics-daemon-9qc7k"
Apr 17 14:20:47.371861 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371338 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-etc-modprobe-d\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.371861 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371373 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-host-var-lib-cni-bin\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.371861 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371421 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-host-var-lib-cni-bin\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.371861 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371444 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-etc-modprobe-d\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.371861 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371443 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f5eccb1a-2f8e-47ca-8e8e-327f03634196-sys-fs\") pod \"aws-ebs-csi-driver-node-pvkwh\" (UID: \"f5eccb1a-2f8e-47ca-8e8e-327f03634196\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh"
Apr 17 14:20:47.371861 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371476 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d030cbc2-cc1b-40c2-8101-cd9ed0460d1e-konnectivity-ca\") pod \"konnectivity-agent-nz4xc\" (UID: \"d030cbc2-cc1b-40c2-8101-cd9ed0460d1e\") " pod="kube-system/konnectivity-agent-nz4xc"
Apr 17 14:20:47.371861 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371483 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f5eccb1a-2f8e-47ca-8e8e-327f03634196-sys-fs\") pod \"aws-ebs-csi-driver-node-pvkwh\" (UID: \"f5eccb1a-2f8e-47ca-8e8e-327f03634196\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh"
Apr 17 14:20:47.371861 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371502 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-system-cni-dir\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.372655 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371530 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-hostroot\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.372655 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371555 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-etc-sysctl-conf\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.372655 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371558 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-system-cni-dir\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.372655 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371595 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-var-lib-openvswitch\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.372655 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371619 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/26b0d440-2cba-4402-b258-ba4b4ac2f7dd-serviceca\") pod \"node-ca-d8pjn\" (UID: \"26b0d440-2cba-4402-b258-ba4b4ac2f7dd\") " pod="openshift-image-registry/node-ca-d8pjn"
Apr 17 14:20:47.372655 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371644 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-multus-conf-dir\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.372655 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371666 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-multus-daemon-config\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.372655 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371710 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e40076ff-ba56-43e8-88a4-8c25998b6668-os-release\") pod \"multus-additional-cni-plugins-25j4z\" (UID: \"e40076ff-ba56-43e8-88a4-8c25998b6668\") " pod="openshift-multus/multus-additional-cni-plugins-25j4z"
Apr 17 14:20:47.372655 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371713 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-etc-sysctl-conf\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.372655 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371733 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-multus-conf-dir\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.372655 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371735 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-systemd-units\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.372655 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371761 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-hostroot\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.372655 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371784 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-log-socket\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.372655 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371809 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zng67\" (UniqueName: \"kubernetes.io/projected/d5fae135-d20e-469d-bb28-4d7236b5f86a-kube-api-access-zng67\") pod \"network-check-target-xxhjb\" (UID: \"d5fae135-d20e-469d-bb28-4d7236b5f86a\") " pod="openshift-network-diagnostics/network-check-target-xxhjb"
Apr 17 14:20:47.372655 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371834 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-host-cni-bin\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.372655 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371862 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f5eccb1a-2f8e-47ca-8e8e-327f03634196-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pvkwh\" (UID: \"f5eccb1a-2f8e-47ca-8e8e-327f03634196\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh"
Apr 17 14:20:47.372655 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371884 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f5eccb1a-2f8e-47ca-8e8e-327f03634196-etc-selinux\") pod \"aws-ebs-csi-driver-node-pvkwh\" (UID: \"f5eccb1a-2f8e-47ca-8e8e-327f03634196\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh"
Apr 17 14:20:47.373424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371907 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-etc-systemd\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.373424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371918 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f5eccb1a-2f8e-47ca-8e8e-327f03634196-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pvkwh\" (UID: \"f5eccb1a-2f8e-47ca-8e8e-327f03634196\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh"
Apr 17 14:20:47.373424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371931 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk2r7\" (UniqueName: \"kubernetes.io/projected/e40076ff-ba56-43e8-88a4-8c25998b6668-kube-api-access-gk2r7\") pod \"multus-additional-cni-plugins-25j4z\" (UID: \"e40076ff-ba56-43e8-88a4-8c25998b6668\") " pod="openshift-multus/multus-additional-cni-plugins-25j4z"
Apr 17 14:20:47.373424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371958 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5pfh\" (UniqueName: \"kubernetes.io/projected/26b0d440-2cba-4402-b258-ba4b4ac2f7dd-kube-api-access-j5pfh\") pod \"node-ca-d8pjn\" (UID: \"26b0d440-2cba-4402-b258-ba4b4ac2f7dd\") " pod="openshift-image-registry/node-ca-d8pjn"
Apr 17 14:20:47.373424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.371984 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-host-run-k8s-cni-cncf-io\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.373424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372010 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-lib-modules\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.373424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372017 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f5eccb1a-2f8e-47ca-8e8e-327f03634196-etc-selinux\") pod \"aws-ebs-csi-driver-node-pvkwh\" (UID: \"f5eccb1a-2f8e-47ca-8e8e-327f03634196\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh"
Apr 17 14:20:47.373424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372034 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-var-lib-kubelet\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.373424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372059 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e40076ff-ba56-43e8-88a4-8c25998b6668-cnibin\") pod \"multus-additional-cni-plugins-25j4z\" (UID: \"e40076ff-ba56-43e8-88a4-8c25998b6668\") " pod="openshift-multus/multus-additional-cni-plugins-25j4z"
Apr 17 14:20:47.373424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372068 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-etc-systemd\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.373424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372083 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-run-systemd\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.373424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372108 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-multus-cni-dir\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.373424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372131 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-etc-sysctl-d\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.373424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372152 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-run\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.373424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372176 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-host\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.373424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372192 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-host-run-k8s-cni-cncf-io\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.373424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372210 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-var-lib-kubelet\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.374161 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372210 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-lib-modules\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.374161 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372200 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5z77g\" (UniqueName: \"kubernetes.io/projected/29001604-5bfa-4625-8130-d60e9a67c29f-kube-api-access-5z77g\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.374161 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372297 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-multus-daemon-config\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.374161 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372307 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-host\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.374161 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372333 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-etc-sysctl-d\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.374161 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372297 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f67n5\" (UniqueName: \"kubernetes.io/projected/419f4063-d963-4db2-b104-57f7214aaee2-kube-api-access-f67n5\") pod \"iptables-alerter-8mjfq\" (UID: \"419f4063-d963-4db2-b104-57f7214aaee2\") " pod="openshift-network-operator/iptables-alerter-8mjfq"
Apr 17 14:20:47.374161 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372351 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-multus-cni-dir\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.374161 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372355 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-run\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.374161 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372389 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-run-ovn\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.374161 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372417 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/25426a81-851e-4273-9383-f90518fec0e7-env-overrides\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.374161 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372445 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-host-run-netns\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.374161 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372470 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-host-var-lib-cni-multus\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.374161 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372492 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-etc-kubernetes\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.374161 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372518 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-host-run-netns\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.374161 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372532 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-host-var-lib-cni-multus\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.374161 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372550 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/29001604-5bfa-4625-8130-d60e9a67c29f-etc-tuned\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.374161 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372560 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29001604-5bfa-4625-8130-d60e9a67c29f-etc-kubernetes\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.374161 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372578 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e40076ff-ba56-43e8-88a4-8c25998b6668-cni-binary-copy\") pod \"multus-additional-cni-plugins-25j4z\" (UID: \"e40076ff-ba56-43e8-88a4-8c25998b6668\") " pod="openshift-multus/multus-additional-cni-plugins-25j4z"
Apr 17 14:20:47.374955 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372605 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e40076ff-ba56-43e8-88a4-8c25998b6668-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-25j4z\" (UID: \"e40076ff-ba56-43e8-88a4-8c25998b6668\") " pod="openshift-multus/multus-additional-cni-plugins-25j4z"
Apr 17 14:20:47.374955 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372643 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-host-run-netns\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.374955 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372674 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-etc-openvswitch\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.374955 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372699 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkz2g\" (UniqueName: \"kubernetes.io/projected/25426a81-851e-4273-9383-f90518fec0e7-kube-api-access-mkz2g\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.374955 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372728 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqkc8\" (UniqueName: \"kubernetes.io/projected/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-kube-api-access-vqkc8\") pod \"network-metrics-daemon-9qc7k\" (UID: \"3042fc33-2fc3-4d3d-a248-3855f7eb3a6a\") " pod="openshift-multus/network-metrics-daemon-9qc7k"
Apr 17 14:20:47.374955 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372758 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-cni-binary-copy\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.374955 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372783 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99jz7\" (UniqueName: \"kubernetes.io/projected/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-kube-api-access-99jz7\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.374955 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372806 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/565f7614-6003-428c-a0bd-ff0f395baa33-tmp-dir\") pod \"node-resolver-gqkmm\" (UID: \"565f7614-6003-428c-a0bd-ff0f395baa33\") " pod="openshift-dns/node-resolver-gqkmm"
Apr 17 14:20:47.374955 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372831 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/419f4063-d963-4db2-b104-57f7214aaee2-host-slash\") pod \"iptables-alerter-8mjfq\" (UID: \"419f4063-d963-4db2-b104-57f7214aaee2\") " pod="openshift-network-operator/iptables-alerter-8mjfq"
Apr 17 14:20:47.374955 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.372854 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-run-openvswitch\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.374955 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.373067 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-119.ec2.internal" event={"ID":"f246f22bb4cc0196c51370027afd49f2","Type":"ContainerStarted","Data":"bdcc3d02b33d16839d3525cd2446d9efe837a233a75559196762d22accda30d9"}
Apr 17 14:20:47.374955 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.373302 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/565f7614-6003-428c-a0bd-ff0f395baa33-tmp-dir\") pod \"node-resolver-gqkmm\" (UID: \"565f7614-6003-428c-a0bd-ff0f395baa33\") " pod="openshift-dns/node-resolver-gqkmm"
Apr 17 14:20:47.374955 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.373557 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-cni-binary-copy\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.374955 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.374511 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-119.ec2.internal" event={"ID":"e3b19c5aa18b811bd305847b1c45b39d","Type":"ContainerStarted","Data":"849bc663097117140a9077476bf643e48f5b053351e8dda3c81bc821ad1e97b0"}
Apr 17 14:20:47.376379 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.376359 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/29001604-5bfa-4625-8130-d60e9a67c29f-tmp\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.377028 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.377011 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/29001604-5bfa-4625-8130-d60e9a67c29f-etc-tuned\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.382458 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.382435 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgsgn\" (UniqueName: \"kubernetes.io/projected/f5eccb1a-2f8e-47ca-8e8e-327f03634196-kube-api-access-dgsgn\") pod \"aws-ebs-csi-driver-node-pvkwh\" (UID: \"f5eccb1a-2f8e-47ca-8e8e-327f03634196\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh"
Apr 17 14:20:47.382558 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.382443 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z77g\" (UniqueName: \"kubernetes.io/projected/29001604-5bfa-4625-8130-d60e9a67c29f-kube-api-access-5z77g\") pod \"tuned-r8vfn\" (UID: \"29001604-5bfa-4625-8130-d60e9a67c29f\") " pod="openshift-cluster-node-tuning-operator/tuned-r8vfn"
Apr 17 14:20:47.383635 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.383537 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99jz7\" (UniqueName: \"kubernetes.io/projected/60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3-kube-api-access-99jz7\") pod \"multus-s6jws\" (UID: \"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3\") " pod="openshift-multus/multus-s6jws"
Apr 17 14:20:47.384186 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.384053 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2bv7\" (UniqueName: \"kubernetes.io/projected/565f7614-6003-428c-a0bd-ff0f395baa33-kube-api-access-r2bv7\") pod \"node-resolver-gqkmm\" (UID: \"565f7614-6003-428c-a0bd-ff0f395baa33\") " pod="openshift-dns/node-resolver-gqkmm"
Apr 17 14:20:47.475524 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473224 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e40076ff-ba56-43e8-88a4-8c25998b6668-os-release\") pod \"multus-additional-cni-plugins-25j4z\" (UID: \"e40076ff-ba56-43e8-88a4-8c25998b6668\") " pod="openshift-multus/multus-additional-cni-plugins-25j4z"
Apr 17 14:20:47.475524 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473364 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e40076ff-ba56-43e8-88a4-8c25998b6668-os-release\") pod \"multus-additional-cni-plugins-25j4z\" (UID: \"e40076ff-ba56-43e8-88a4-8c25998b6668\") " pod="openshift-multus/multus-additional-cni-plugins-25j4z"
Apr 17 14:20:47.475524 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473268 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-systemd-units\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.475524 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473417 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-systemd-units\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.475524 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473426 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-log-socket\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.475524 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473452 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zng67\" (UniqueName: \"kubernetes.io/projected/d5fae135-d20e-469d-bb28-4d7236b5f86a-kube-api-access-zng67\") pod \"network-check-target-xxhjb\" (UID: \"d5fae135-d20e-469d-bb28-4d7236b5f86a\") " pod="openshift-network-diagnostics/network-check-target-xxhjb"
Apr 17 14:20:47.475524 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473467 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-log-socket\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.475524 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473483 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-host-cni-bin\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.475524 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473511 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gk2r7\" (UniqueName: \"kubernetes.io/projected/e40076ff-ba56-43e8-88a4-8c25998b6668-kube-api-access-gk2r7\") pod \"multus-additional-cni-plugins-25j4z\" (UID: \"e40076ff-ba56-43e8-88a4-8c25998b6668\") " pod="openshift-multus/multus-additional-cni-plugins-25j4z"
Apr 17 14:20:47.475524 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473536 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5pfh\" (UniqueName: \"kubernetes.io/projected/26b0d440-2cba-4402-b258-ba4b4ac2f7dd-kube-api-access-j5pfh\") pod \"node-ca-d8pjn\" (UID: \"26b0d440-2cba-4402-b258-ba4b4ac2f7dd\") " pod="openshift-image-registry/node-ca-d8pjn"
Apr 17 14:20:47.475524 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473562 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e40076ff-ba56-43e8-88a4-8c25998b6668-cnibin\") pod \"multus-additional-cni-plugins-25j4z\" (UID: \"e40076ff-ba56-43e8-88a4-8c25998b6668\") " pod="openshift-multus/multus-additional-cni-plugins-25j4z"
Apr 17 14:20:47.475524 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473599 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-run-systemd\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.475524 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473634 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f67n5\" (UniqueName: \"kubernetes.io/projected/419f4063-d963-4db2-b104-57f7214aaee2-kube-api-access-f67n5\") pod \"iptables-alerter-8mjfq\" (UID: \"419f4063-d963-4db2-b104-57f7214aaee2\") " pod="openshift-network-operator/iptables-alerter-8mjfq"
Apr 17 14:20:47.475524 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473660 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-run-ovn\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.475524 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473686 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/25426a81-851e-4273-9383-f90518fec0e7-env-overrides\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.475524 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473716 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e40076ff-ba56-43e8-88a4-8c25998b6668-cni-binary-copy\") pod \"multus-additional-cni-plugins-25j4z\" (UID: \"e40076ff-ba56-43e8-88a4-8c25998b6668\") " pod="openshift-multus/multus-additional-cni-plugins-25j4z"
Apr 17 14:20:47.475524 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473742 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e40076ff-ba56-43e8-88a4-8c25998b6668-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-25j4z\" (UID: \"e40076ff-ba56-43e8-88a4-8c25998b6668\") " pod="openshift-multus/multus-additional-cni-plugins-25j4z"
Apr 17 14:20:47.476415 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473762 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e40076ff-ba56-43e8-88a4-8c25998b6668-cnibin\") pod \"multus-additional-cni-plugins-25j4z\" (UID: \"e40076ff-ba56-43e8-88a4-8c25998b6668\") " pod="openshift-multus/multus-additional-cni-plugins-25j4z"
Apr 17 14:20:47.476415 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473766 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-host-run-netns\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.476415 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473806 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-host-run-netns\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.476415 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473810 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-etc-openvswitch\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.476415 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473849 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-etc-openvswitch\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.476415 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473857 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mkz2g\" (UniqueName: \"kubernetes.io/projected/25426a81-851e-4273-9383-f90518fec0e7-kube-api-access-mkz2g\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.476415 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473884 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-host-cni-bin\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.476415 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473884 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqkc8\" (UniqueName: \"kubernetes.io/projected/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-kube-api-access-vqkc8\") pod \"network-metrics-daemon-9qc7k\" (UID: \"3042fc33-2fc3-4d3d-a248-3855f7eb3a6a\") " pod="openshift-multus/network-metrics-daemon-9qc7k"
Apr 17 14:20:47.476415 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473920 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/419f4063-d963-4db2-b104-57f7214aaee2-host-slash\") pod \"iptables-alerter-8mjfq\" (UID: \"419f4063-d963-4db2-b104-57f7214aaee2\") " pod="openshift-network-operator/iptables-alerter-8mjfq"
Apr 17 14:20:47.476415 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473949 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-run-openvswitch\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.476415 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.473978 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-host-cni-netd\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.476415 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474002 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26b0d440-2cba-4402-b258-ba4b4ac2f7dd-host\") pod \"node-ca-d8pjn\" (UID: \"26b0d440-2cba-4402-b258-ba4b4ac2f7dd\") " pod="openshift-image-registry/node-ca-d8pjn"
Apr 17 14:20:47.476415 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474031 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e40076ff-ba56-43e8-88a4-8c25998b6668-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-25j4z\" (UID: \"e40076ff-ba56-43e8-88a4-8c25998b6668\") " pod="openshift-multus/multus-additional-cni-plugins-25j4z"
Apr 17 14:20:47.476415 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474056 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-host-slash\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.476415 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474087 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e40076ff-ba56-43e8-88a4-8c25998b6668-tuning-conf-dir\") pod \"multus-additional-cni-plugins-25j4z\" (UID: \"e40076ff-ba56-43e8-88a4-8c25998b6668\") " pod="openshift-multus/multus-additional-cni-plugins-25j4z"
Apr 17 14:20:47.476415 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474122 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-host-kubelet\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.476415 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474149 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e40076ff-ba56-43e8-88a4-8c25998b6668-system-cni-dir\") pod \"multus-additional-cni-plugins-25j4z\" (UID: \"e40076ff-ba56-43e8-88a4-8c25998b6668\") " pod="openshift-multus/multus-additional-cni-plugins-25j4z"
Apr 17 14:20:47.477160 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474186 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d030cbc2-cc1b-40c2-8101-cd9ed0460d1e-agent-certs\") pod \"konnectivity-agent-nz4xc\" (UID: \"d030cbc2-cc1b-40c2-8101-cd9ed0460d1e\") " pod="kube-system/konnectivity-agent-nz4xc"
Apr 17 14:20:47.477160 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474200 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-run-systemd\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.477160 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474238 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/419f4063-d963-4db2-b104-57f7214aaee2-iptables-alerter-script\") pod \"iptables-alerter-8mjfq\" (UID: \"419f4063-d963-4db2-b104-57f7214aaee2\") " pod="openshift-network-operator/iptables-alerter-8mjfq"
Apr 17 14:20:47.477160 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474292 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.477160 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474322 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/25426a81-851e-4273-9383-f90518fec0e7-ovnkube-config\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.477160 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474348 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-node-log\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.477160 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474375 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-host-run-ovn-kubernetes\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:20:47.477160 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474402 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/25426a81-851e-4273-9383-f90518fec0e7-ovn-node-metrics-cert\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76" Apr 17 14:20:47.477160 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474437 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/25426a81-851e-4273-9383-f90518fec0e7-ovnkube-script-lib\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76" Apr 17 14:20:47.477160 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474463 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs\") pod \"network-metrics-daemon-9qc7k\" (UID: \"3042fc33-2fc3-4d3d-a248-3855f7eb3a6a\") " pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:20:47.477160 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474491 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d030cbc2-cc1b-40c2-8101-cd9ed0460d1e-konnectivity-ca\") pod \"konnectivity-agent-nz4xc\" (UID: \"d030cbc2-cc1b-40c2-8101-cd9ed0460d1e\") " pod="kube-system/konnectivity-agent-nz4xc" Apr 17 14:20:47.477160 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474512 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/419f4063-d963-4db2-b104-57f7214aaee2-host-slash\") pod \"iptables-alerter-8mjfq\" (UID: \"419f4063-d963-4db2-b104-57f7214aaee2\") " pod="openshift-network-operator/iptables-alerter-8mjfq" Apr 17 14:20:47.477160 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474530 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-var-lib-openvswitch\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76" Apr 17 14:20:47.477160 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474556 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/26b0d440-2cba-4402-b258-ba4b4ac2f7dd-serviceca\") pod \"node-ca-d8pjn\" (UID: \"26b0d440-2cba-4402-b258-ba4b4ac2f7dd\") " pod="openshift-image-registry/node-ca-d8pjn" Apr 17 14:20:47.477160 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474561 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-run-openvswitch\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76" Apr 17 14:20:47.477160 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474809 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-host-slash\") pod \"ovnkube-node-zns76\" (UID: 
\"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76" Apr 17 14:20:47.477160 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474852 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76" Apr 17 14:20:47.477899 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474859 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/25426a81-851e-4273-9383-f90518fec0e7-env-overrides\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76" Apr 17 14:20:47.477899 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474387 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-run-ovn\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76" Apr 17 14:20:47.477899 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474941 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e40076ff-ba56-43e8-88a4-8c25998b6668-tuning-conf-dir\") pod \"multus-additional-cni-plugins-25j4z\" (UID: \"e40076ff-ba56-43e8-88a4-8c25998b6668\") " pod="openshift-multus/multus-additional-cni-plugins-25j4z" Apr 17 14:20:47.477899 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.474987 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e40076ff-ba56-43e8-88a4-8c25998b6668-cni-binary-copy\") pod \"multus-additional-cni-plugins-25j4z\" (UID: \"e40076ff-ba56-43e8-88a4-8c25998b6668\") " pod="openshift-multus/multus-additional-cni-plugins-25j4z" Apr 17 14:20:47.477899 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.475011 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26b0d440-2cba-4402-b258-ba4b4ac2f7dd-host\") pod \"node-ca-d8pjn\" (UID: \"26b0d440-2cba-4402-b258-ba4b4ac2f7dd\") " pod="openshift-image-registry/node-ca-d8pjn" Apr 17 14:20:47.477899 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.475044 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-host-cni-netd\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76" Apr 17 14:20:47.477899 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.475044 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-node-log\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76" Apr 17 14:20:47.477899 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.475093 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/e40076ff-ba56-43e8-88a4-8c25998b6668-system-cni-dir\") pod \"multus-additional-cni-plugins-25j4z\" (UID: \"e40076ff-ba56-43e8-88a4-8c25998b6668\") " pod="openshift-multus/multus-additional-cni-plugins-25j4z" Apr 17 14:20:47.477899 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.475158 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-host-run-ovn-kubernetes\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76" Apr 17 14:20:47.477899 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.475187 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-host-kubelet\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76" Apr 17 14:20:47.477899 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:47.475264 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:20:47.477899 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:47.475463 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs podName:3042fc33-2fc3-4d3d-a248-3855f7eb3a6a nodeName:}" failed. No retries permitted until 2026-04-17 14:20:47.975430282 +0000 UTC m=+3.111437825 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs") pod "network-metrics-daemon-9qc7k" (UID: "3042fc33-2fc3-4d3d-a248-3855f7eb3a6a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:20:47.481135 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.479905 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/25426a81-851e-4273-9383-f90518fec0e7-ovnkube-config\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76" Apr 17 14:20:47.481135 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.480709 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25426a81-851e-4273-9383-f90518fec0e7-var-lib-openvswitch\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76" Apr 17 14:20:47.481345 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.481086 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e40076ff-ba56-43e8-88a4-8c25998b6668-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-25j4z\" (UID: \"e40076ff-ba56-43e8-88a4-8c25998b6668\") " pod="openshift-multus/multus-additional-cni-plugins-25j4z" Apr 17 14:20:47.481345 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.481288 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d030cbc2-cc1b-40c2-8101-cd9ed0460d1e-konnectivity-ca\") pod \"konnectivity-agent-nz4xc\" (UID: \"d030cbc2-cc1b-40c2-8101-cd9ed0460d1e\") " 
pod="kube-system/konnectivity-agent-nz4xc" Apr 17 14:20:47.481661 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.481627 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/419f4063-d963-4db2-b104-57f7214aaee2-iptables-alerter-script\") pod \"iptables-alerter-8mjfq\" (UID: \"419f4063-d963-4db2-b104-57f7214aaee2\") " pod="openshift-network-operator/iptables-alerter-8mjfq" Apr 17 14:20:47.482029 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.481999 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d030cbc2-cc1b-40c2-8101-cd9ed0460d1e-agent-certs\") pod \"konnectivity-agent-nz4xc\" (UID: \"d030cbc2-cc1b-40c2-8101-cd9ed0460d1e\") " pod="kube-system/konnectivity-agent-nz4xc" Apr 17 14:20:47.482168 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.482135 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e40076ff-ba56-43e8-88a4-8c25998b6668-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-25j4z\" (UID: \"e40076ff-ba56-43e8-88a4-8c25998b6668\") " pod="openshift-multus/multus-additional-cni-plugins-25j4z" Apr 17 14:20:47.483453 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:47.483433 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:20:47.483538 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:47.483462 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:20:47.483538 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.475438 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/26b0d440-2cba-4402-b258-ba4b4ac2f7dd-serviceca\") pod \"node-ca-d8pjn\" (UID: \"26b0d440-2cba-4402-b258-ba4b4ac2f7dd\") " pod="openshift-image-registry/node-ca-d8pjn" Apr 17 14:20:47.483538 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:47.483477 2577 projected.go:194] Error preparing data for projected volume kube-api-access-zng67 for pod openshift-network-diagnostics/network-check-target-xxhjb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:20:47.483904 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.483877 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/25426a81-851e-4273-9383-f90518fec0e7-ovnkube-script-lib\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76" Apr 17 14:20:47.484054 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:47.484039 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5fae135-d20e-469d-bb28-4d7236b5f86a-kube-api-access-zng67 podName:d5fae135-d20e-469d-bb28-4d7236b5f86a nodeName:}" failed. No retries permitted until 2026-04-17 14:20:47.984019575 +0000 UTC m=+3.120027112 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-zng67" (UniqueName: "kubernetes.io/projected/d5fae135-d20e-469d-bb28-4d7236b5f86a-kube-api-access-zng67") pod "network-check-target-xxhjb" (UID: "d5fae135-d20e-469d-bb28-4d7236b5f86a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:20:47.485041 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.485017 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/25426a81-851e-4273-9383-f90518fec0e7-ovn-node-metrics-cert\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76" Apr 17 14:20:47.488307 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.488214 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqkc8\" (UniqueName: \"kubernetes.io/projected/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-kube-api-access-vqkc8\") pod \"network-metrics-daemon-9qc7k\" (UID: \"3042fc33-2fc3-4d3d-a248-3855f7eb3a6a\") " pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:20:47.489179 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.489156 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk2r7\" (UniqueName: \"kubernetes.io/projected/e40076ff-ba56-43e8-88a4-8c25998b6668-kube-api-access-gk2r7\") pod \"multus-additional-cni-plugins-25j4z\" (UID: \"e40076ff-ba56-43e8-88a4-8c25998b6668\") " pod="openshift-multus/multus-additional-cni-plugins-25j4z" Apr 17 14:20:47.489667 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.489638 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f67n5\" (UniqueName: \"kubernetes.io/projected/419f4063-d963-4db2-b104-57f7214aaee2-kube-api-access-f67n5\") pod \"iptables-alerter-8mjfq\" (UID: \"419f4063-d963-4db2-b104-57f7214aaee2\") " pod="openshift-network-operator/iptables-alerter-8mjfq" Apr 17 14:20:47.490465 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.490445 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5pfh\" (UniqueName: \"kubernetes.io/projected/26b0d440-2cba-4402-b258-ba4b4ac2f7dd-kube-api-access-j5pfh\") pod \"node-ca-d8pjn\" (UID: \"26b0d440-2cba-4402-b258-ba4b4ac2f7dd\") " pod="openshift-image-registry/node-ca-d8pjn" Apr 17 14:20:47.491733 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.491716 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkz2g\" (UniqueName: \"kubernetes.io/projected/25426a81-851e-4273-9383-f90518fec0e7-kube-api-access-mkz2g\") pod \"ovnkube-node-zns76\" (UID: \"25426a81-851e-4273-9383-f90518fec0e7\") " pod="openshift-ovn-kubernetes/ovnkube-node-zns76" Apr 17 14:20:47.566912 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.566884 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-s6jws" Apr 17 14:20:47.574003 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.573980 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:20:47.574475 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:47.574447 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60e3021c_3d8c_46bc_b4d0_5b2c13f6f6e3.slice/crio-e7b3029846d571e4141b9877d15dc8f759957f78714fde00fba1ed53254574be WatchSource:0}: Error finding container e7b3029846d571e4141b9877d15dc8f759957f78714fde00fba1ed53254574be: Status 404 returned error can't find the container with id e7b3029846d571e4141b9877d15dc8f759957f78714fde00fba1ed53254574be Apr 17 14:20:47.577820 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.577799 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-r8vfn" Apr 17 14:20:47.586803 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.586522 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh" Apr 17 14:20:47.587669 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:47.587645 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29001604_5bfa_4625_8130_d60e9a67c29f.slice/crio-1d54ff4716b83c4f33ec3db6211ee72d16ac1b56f286bc94daa819d8a4c75e81 WatchSource:0}: Error finding container 1d54ff4716b83c4f33ec3db6211ee72d16ac1b56f286bc94daa819d8a4c75e81: Status 404 returned error can't find the container with id 1d54ff4716b83c4f33ec3db6211ee72d16ac1b56f286bc94daa819d8a4c75e81 Apr 17 14:20:47.594267 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.594247 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gqkmm" Apr 17 14:20:47.594559 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:47.594527 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5eccb1a_2f8e_47ca_8e8e_327f03634196.slice/crio-c7808e366a7be4744922bebdb1e08ed016c241e9c2d62aa5a8479ff2e2ff3322 WatchSource:0}: Error finding container c7808e366a7be4744922bebdb1e08ed016c241e9c2d62aa5a8479ff2e2ff3322: Status 404 returned error can't find the container with id c7808e366a7be4744922bebdb1e08ed016c241e9c2d62aa5a8479ff2e2ff3322 Apr 17 14:20:47.602245 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:47.602224 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod565f7614_6003_428c_a0bd_ff0f395baa33.slice/crio-7ec7a71738db7a060a16713291354983ee977c20308d67f86a08bec4d1f5770b WatchSource:0}: Error finding container 7ec7a71738db7a060a16713291354983ee977c20308d67f86a08bec4d1f5770b: Status 404 returned error can't find the container with id 7ec7a71738db7a060a16713291354983ee977c20308d67f86a08bec4d1f5770b Apr 17 14:20:47.604074 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.604048 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nz4xc" Apr 17 14:20:47.610600 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.610580 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-d8pjn" Apr 17 14:20:47.611402 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:47.611378 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd030cbc2_cc1b_40c2_8101_cd9ed0460d1e.slice/crio-254d2fbcdda0bdfd5c574b68a2b75342e0e3980ecc87916622c4c22867d7468b WatchSource:0}: Error finding container 254d2fbcdda0bdfd5c574b68a2b75342e0e3980ecc87916622c4c22867d7468b: Status 404 returned error can't find the container with id 254d2fbcdda0bdfd5c574b68a2b75342e0e3980ecc87916622c4c22867d7468b Apr 17 14:20:47.618977 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:47.618949 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26b0d440_2cba_4402_b258_ba4b4ac2f7dd.slice/crio-3372e1c91298ee55bf025961481eab7e1d5958dfd07ca32948de9943e18ba5a8 WatchSource:0}: Error finding container 3372e1c91298ee55bf025961481eab7e1d5958dfd07ca32948de9943e18ba5a8: Status 404 returned error can't find the container with id 3372e1c91298ee55bf025961481eab7e1d5958dfd07ca32948de9943e18ba5a8 Apr 17 14:20:47.619790 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.619749 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-25j4z" Apr 17 14:20:47.627443 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:47.627423 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode40076ff_ba56_43e8_88a4_8c25998b6668.slice/crio-d588934e9d07f1f12b9061bdeca5bba325ac43ab214fd5ab227681f891179b9c WatchSource:0}: Error finding container d588934e9d07f1f12b9061bdeca5bba325ac43ab214fd5ab227681f891179b9c: Status 404 returned error can't find the container with id d588934e9d07f1f12b9061bdeca5bba325ac43ab214fd5ab227681f891179b9c Apr 17 14:20:47.629376 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.629359 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8mjfq" Apr 17 14:20:47.637682 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.637661 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zns76" Apr 17 14:20:47.978168 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:47.978132 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs\") pod \"network-metrics-daemon-9qc7k\" (UID: \"3042fc33-2fc3-4d3d-a248-3855f7eb3a6a\") " pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:20:47.978351 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:47.978320 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:20:47.978422 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:47.978391 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs podName:3042fc33-2fc3-4d3d-a248-3855f7eb3a6a nodeName:}" failed. No retries permitted until 2026-04-17 14:20:48.978372226 +0000 UTC m=+4.114379768 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs") pod "network-metrics-daemon-9qc7k" (UID: "3042fc33-2fc3-4d3d-a248-3855f7eb3a6a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:20:48.079326 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:48.079294 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zng67\" (UniqueName: \"kubernetes.io/projected/d5fae135-d20e-469d-bb28-4d7236b5f86a-kube-api-access-zng67\") pod \"network-check-target-xxhjb\" (UID: \"d5fae135-d20e-469d-bb28-4d7236b5f86a\") " pod="openshift-network-diagnostics/network-check-target-xxhjb" Apr 17 14:20:48.079503 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:48.079446 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:20:48.079503 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:48.079467 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:20:48.079503 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:48.079480 2577 projected.go:194] Error preparing data for projected volume kube-api-access-zng67 for pod openshift-network-diagnostics/network-check-target-xxhjb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:20:48.079653 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:48.079536 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5fae135-d20e-469d-bb28-4d7236b5f86a-kube-api-access-zng67 podName:d5fae135-d20e-469d-bb28-4d7236b5f86a nodeName:}" failed. No retries permitted until 2026-04-17 14:20:49.079519255 +0000 UTC m=+4.215526824 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-zng67" (UniqueName: "kubernetes.io/projected/d5fae135-d20e-469d-bb28-4d7236b5f86a-kube-api-access-zng67") pod "network-check-target-xxhjb" (UID: "d5fae135-d20e-469d-bb28-4d7236b5f86a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:20:48.304247 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:48.304168 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 14:15:46 +0000 UTC" deadline="2027-10-15 16:51:36.98364139 +0000 UTC" Apr 17 14:20:48.304247 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:48.304206 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13106h30m48.679439166s" Apr 17 14:20:48.377039 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:48.376999 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25j4z" event={"ID":"e40076ff-ba56-43e8-88a4-8c25998b6668","Type":"ContainerStarted","Data":"d588934e9d07f1f12b9061bdeca5bba325ac43ab214fd5ab227681f891179b9c"} Apr 17 14:20:48.378209 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:48.378153 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nz4xc" event={"ID":"d030cbc2-cc1b-40c2-8101-cd9ed0460d1e","Type":"ContainerStarted","Data":"254d2fbcdda0bdfd5c574b68a2b75342e0e3980ecc87916622c4c22867d7468b"} Apr 17 14:20:48.379493 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:48.379443 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gqkmm" event={"ID":"565f7614-6003-428c-a0bd-ff0f395baa33","Type":"ContainerStarted","Data":"7ec7a71738db7a060a16713291354983ee977c20308d67f86a08bec4d1f5770b"} Apr 17 14:20:48.380563 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:48.380536 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh" event={"ID":"f5eccb1a-2f8e-47ca-8e8e-327f03634196","Type":"ContainerStarted","Data":"c7808e366a7be4744922bebdb1e08ed016c241e9c2d62aa5a8479ff2e2ff3322"} Apr 17 14:20:48.381702 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:48.381660 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-r8vfn" event={"ID":"29001604-5bfa-4625-8130-d60e9a67c29f","Type":"ContainerStarted","Data":"1d54ff4716b83c4f33ec3db6211ee72d16ac1b56f286bc94daa819d8a4c75e81"} Apr 17 14:20:48.382835 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:48.382812 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s6jws" event={"ID":"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3","Type":"ContainerStarted","Data":"e7b3029846d571e4141b9877d15dc8f759957f78714fde00fba1ed53254574be"} Apr 17 14:20:48.384241 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:48.384214 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d8pjn" event={"ID":"26b0d440-2cba-4402-b258-ba4b4ac2f7dd","Type":"ContainerStarted","Data":"3372e1c91298ee55bf025961481eab7e1d5958dfd07ca32948de9943e18ba5a8"} Apr 17 14:20:48.686730 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:20:48.686683 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod419f4063_d963_4db2_b104_57f7214aaee2.slice/crio-3b141e0f476cb8b54b1c3abadbea78bc308bb8e8760009823cce8ce807797a55 WatchSource:0}: Error finding container 3b141e0f476cb8b54b1c3abadbea78bc308bb8e8760009823cce8ce807797a55: Status 404 returned error can't find the container with id 3b141e0f476cb8b54b1c3abadbea78bc308bb8e8760009823cce8ce807797a55 Apr 17 14:20:48.988956 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:48.988104 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs\") pod \"network-metrics-daemon-9qc7k\" (UID: \"3042fc33-2fc3-4d3d-a248-3855f7eb3a6a\") " pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:20:48.988956 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:48.988338 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:20:48.988956 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:48.988534 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs podName:3042fc33-2fc3-4d3d-a248-3855f7eb3a6a nodeName:}" failed. No retries permitted until 2026-04-17 14:20:50.98851482 +0000 UTC m=+6.124522359 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs") pod "network-metrics-daemon-9qc7k" (UID: "3042fc33-2fc3-4d3d-a248-3855f7eb3a6a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:20:49.089773 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:49.089735 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zng67\" (UniqueName: \"kubernetes.io/projected/d5fae135-d20e-469d-bb28-4d7236b5f86a-kube-api-access-zng67\") pod \"network-check-target-xxhjb\" (UID: \"d5fae135-d20e-469d-bb28-4d7236b5f86a\") " pod="openshift-network-diagnostics/network-check-target-xxhjb" Apr 17 14:20:49.089953 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:49.089891 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:20:49.089953 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:49.089910 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:20:49.089953 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:49.089922 2577 projected.go:194] Error preparing data for projected volume kube-api-access-zng67 for pod openshift-network-diagnostics/network-check-target-xxhjb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:20:49.090103 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:49.089976 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5fae135-d20e-469d-bb28-4d7236b5f86a-kube-api-access-zng67 podName:d5fae135-d20e-469d-bb28-4d7236b5f86a nodeName:}" failed. No retries permitted until 2026-04-17 14:20:51.089959127 +0000 UTC m=+6.225966670 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-zng67" (UniqueName: "kubernetes.io/projected/d5fae135-d20e-469d-bb28-4d7236b5f86a-kube-api-access-zng67") pod "network-check-target-xxhjb" (UID: "d5fae135-d20e-469d-bb28-4d7236b5f86a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:20:49.371901 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:49.371858 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:20:49.372401 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:49.371994 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9qc7k" podUID="3042fc33-2fc3-4d3d-a248-3855f7eb3a6a" Apr 17 14:20:49.372468 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:49.372415 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xxhjb" Apr 17 14:20:49.372517 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:49.372498 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xxhjb" podUID="d5fae135-d20e-469d-bb28-4d7236b5f86a" Apr 17 14:20:49.404409 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:49.402713 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-119.ec2.internal" event={"ID":"f246f22bb4cc0196c51370027afd49f2","Type":"ContainerStarted","Data":"c5b75e97777d53eb1cb72f6a4fd3000050e6350e163a70f507ed60fac5ab1ee3"} Apr 17 14:20:49.407118 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:49.407071 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zns76" event={"ID":"25426a81-851e-4273-9383-f90518fec0e7","Type":"ContainerStarted","Data":"423bb6c6c5a3b592f9adcd4a4a10434bd36fe9bc573c33584161ddd13d60dd13"} Apr 17 14:20:49.412912 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:49.412886 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8mjfq" event={"ID":"419f4063-d963-4db2-b104-57f7214aaee2","Type":"ContainerStarted","Data":"3b141e0f476cb8b54b1c3abadbea78bc308bb8e8760009823cce8ce807797a55"} Apr 17 14:20:50.425055 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:50.424985 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-119.ec2.internal" event={"ID":"e3b19c5aa18b811bd305847b1c45b39d","Type":"ContainerStarted","Data":"e20ac99c9b460af95e4035c4d08ddab6ba84fe7063e34a0d8542b1b1aec71a81"} Apr 17 14:20:50.437860 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:50.437797 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-119.ec2.internal" podStartSLOduration=4.437779238 podStartE2EDuration="4.437779238s" podCreationTimestamp="2026-04-17 14:20:46 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:20:49.416709888 +0000 UTC m=+4.552717432" watchObservedRunningTime="2026-04-17 14:20:50.437779238 +0000 UTC m=+5.573786786" Apr 17 14:20:51.008007 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:51.007969 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs\") pod \"network-metrics-daemon-9qc7k\" (UID: \"3042fc33-2fc3-4d3d-a248-3855f7eb3a6a\") " pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:20:51.008191 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:51.008157 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:20:51.008246 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:51.008218 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs podName:3042fc33-2fc3-4d3d-a248-3855f7eb3a6a nodeName:}" failed. No retries permitted until 2026-04-17 14:20:55.008200176 +0000 UTC m=+10.144207708 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs") pod "network-metrics-daemon-9qc7k" (UID: "3042fc33-2fc3-4d3d-a248-3855f7eb3a6a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:20:51.109950 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:51.109318 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zng67\" (UniqueName: \"kubernetes.io/projected/d5fae135-d20e-469d-bb28-4d7236b5f86a-kube-api-access-zng67\") pod \"network-check-target-xxhjb\" (UID: \"d5fae135-d20e-469d-bb28-4d7236b5f86a\") " pod="openshift-network-diagnostics/network-check-target-xxhjb" Apr 17 14:20:51.109950 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:51.109516 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:20:51.109950 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:51.109535 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:20:51.109950 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:51.109548 2577 projected.go:194] Error preparing data for projected volume kube-api-access-zng67 for pod openshift-network-diagnostics/network-check-target-xxhjb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:20:51.109950 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:51.109606 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5fae135-d20e-469d-bb28-4d7236b5f86a-kube-api-access-zng67 podName:d5fae135-d20e-469d-bb28-4d7236b5f86a nodeName:}" failed. No retries permitted until 2026-04-17 14:20:55.109587766 +0000 UTC m=+10.245595298 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-zng67" (UniqueName: "kubernetes.io/projected/d5fae135-d20e-469d-bb28-4d7236b5f86a-kube-api-access-zng67") pod "network-check-target-xxhjb" (UID: "d5fae135-d20e-469d-bb28-4d7236b5f86a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:20:51.369480 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:51.368457 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:20:51.369480 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:51.368594 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9qc7k" podUID="3042fc33-2fc3-4d3d-a248-3855f7eb3a6a" Apr 17 14:20:51.369480 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:51.369333 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xxhjb" Apr 17 14:20:51.369480 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:51.369439 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xxhjb" podUID="d5fae135-d20e-469d-bb28-4d7236b5f86a" Apr 17 14:20:52.429266 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:52.429231 2577 generic.go:358] "Generic (PLEG): container finished" podID="e3b19c5aa18b811bd305847b1c45b39d" containerID="e20ac99c9b460af95e4035c4d08ddab6ba84fe7063e34a0d8542b1b1aec71a81" exitCode=0 Apr 17 14:20:52.429671 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:52.429308 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-119.ec2.internal" event={"ID":"e3b19c5aa18b811bd305847b1c45b39d","Type":"ContainerDied","Data":"e20ac99c9b460af95e4035c4d08ddab6ba84fe7063e34a0d8542b1b1aec71a81"} Apr 17 14:20:53.368167 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:53.368133 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xxhjb" Apr 17 14:20:53.368345 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:53.368133 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:20:53.368345 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:53.368267 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xxhjb" podUID="d5fae135-d20e-469d-bb28-4d7236b5f86a" Apr 17 14:20:53.368535 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:53.368409 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9qc7k" podUID="3042fc33-2fc3-4d3d-a248-3855f7eb3a6a" Apr 17 14:20:55.039065 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:55.038971 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs\") pod \"network-metrics-daemon-9qc7k\" (UID: \"3042fc33-2fc3-4d3d-a248-3855f7eb3a6a\") " pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:20:55.039546 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:55.039115 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:20:55.039546 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:55.039180 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs podName:3042fc33-2fc3-4d3d-a248-3855f7eb3a6a nodeName:}" failed. No retries permitted until 2026-04-17 14:21:03.039160675 +0000 UTC m=+18.175168219 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs") pod "network-metrics-daemon-9qc7k" (UID: "3042fc33-2fc3-4d3d-a248-3855f7eb3a6a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:20:55.140732 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:55.140367 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zng67\" (UniqueName: \"kubernetes.io/projected/d5fae135-d20e-469d-bb28-4d7236b5f86a-kube-api-access-zng67\") pod \"network-check-target-xxhjb\" (UID: \"d5fae135-d20e-469d-bb28-4d7236b5f86a\") " pod="openshift-network-diagnostics/network-check-target-xxhjb" Apr 17 14:20:55.140732 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:55.140552 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:20:55.140732 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:55.140577 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:20:55.140732 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:55.140592 2577 projected.go:194] Error preparing data for projected volume kube-api-access-zng67 for pod openshift-network-diagnostics/network-check-target-xxhjb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:20:55.140732 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:55.140649 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5fae135-d20e-469d-bb28-4d7236b5f86a-kube-api-access-zng67 podName:d5fae135-d20e-469d-bb28-4d7236b5f86a nodeName:}" failed. 
No retries permitted until 2026-04-17 14:21:03.140632038 +0000 UTC m=+18.276639566 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-zng67" (UniqueName: "kubernetes.io/projected/d5fae135-d20e-469d-bb28-4d7236b5f86a-kube-api-access-zng67") pod "network-check-target-xxhjb" (UID: "d5fae135-d20e-469d-bb28-4d7236b5f86a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:20:55.369581 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:55.369111 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:20:55.369581 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:55.369231 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9qc7k" podUID="3042fc33-2fc3-4d3d-a248-3855f7eb3a6a" Apr 17 14:20:55.369581 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:55.369505 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xxhjb" Apr 17 14:20:55.369856 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:55.369614 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xxhjb" podUID="d5fae135-d20e-469d-bb28-4d7236b5f86a" Apr 17 14:20:57.368243 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:57.368132 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:20:57.368243 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:57.368188 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xxhjb" Apr 17 14:20:57.368728 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:57.368295 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9qc7k" podUID="3042fc33-2fc3-4d3d-a248-3855f7eb3a6a" Apr 17 14:20:57.368728 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:57.368415 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xxhjb" podUID="d5fae135-d20e-469d-bb28-4d7236b5f86a" Apr 17 14:20:59.368381 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:59.368340 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:20:59.368893 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:20:59.368340 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xxhjb" Apr 17 14:20:59.368893 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:59.368487 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9qc7k" podUID="3042fc33-2fc3-4d3d-a248-3855f7eb3a6a" Apr 17 14:20:59.368893 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:20:59.368557 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xxhjb" podUID="d5fae135-d20e-469d-bb28-4d7236b5f86a" Apr 17 14:21:00.444156 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:00.443908 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-119.ec2.internal" event={"ID":"e3b19c5aa18b811bd305847b1c45b39d","Type":"ContainerStarted","Data":"2106d3dfb226b8113072b6d3ca9f56f78925daae4684d3e3744c36cbb73f7d07"} Apr 17 14:21:00.445341 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:00.445311 2577 generic.go:358] "Generic (PLEG): container finished" podID="e40076ff-ba56-43e8-88a4-8c25998b6668" containerID="5a9cfcc7f222bc6177c33c50bc8462d6ba677f508913db50352654144757e161" exitCode=0 Apr 17 14:21:00.445471 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:00.445392 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25j4z" event={"ID":"e40076ff-ba56-43e8-88a4-8c25998b6668","Type":"ContainerDied","Data":"5a9cfcc7f222bc6177c33c50bc8462d6ba677f508913db50352654144757e161"} Apr 17 14:21:00.446835 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:00.446761 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nz4xc" event={"ID":"d030cbc2-cc1b-40c2-8101-cd9ed0460d1e","Type":"ContainerStarted","Data":"d2fc5cdda2e9f587576f498aa54cb0e8e0b97deec0605fc31a3f0bb246abc22a"} Apr 17 14:21:00.448142 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:00.448116 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gqkmm" event={"ID":"565f7614-6003-428c-a0bd-ff0f395baa33","Type":"ContainerStarted","Data":"dcd33ee66de7002e4362b61fdbf4648b7870da12b5202ba6ed607c0a338face5"} Apr 17 14:21:00.449508 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:00.449486 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh" event={"ID":"f5eccb1a-2f8e-47ca-8e8e-327f03634196","Type":"ContainerStarted","Data":"e859a15d7fb034bb356d9388fc1b6c6a23be56b2ba85845b2bbd848bb825e0c1"} Apr 17 14:21:00.450777 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:00.450752 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-r8vfn" 
event={"ID":"29001604-5bfa-4625-8130-d60e9a67c29f","Type":"ContainerStarted","Data":"371f8f95c06924bd9ece28b61617d15eefba296b519cc7b16873c6611b38f19e"} Apr 17 14:21:00.452027 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:00.452005 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s6jws" event={"ID":"60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3","Type":"ContainerStarted","Data":"a210bac8e8a3e37fbbe5654583fc2cc759ecd61c346fd676103657267800468c"} Apr 17 14:21:00.453220 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:00.453202 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d8pjn" event={"ID":"26b0d440-2cba-4402-b258-ba4b4ac2f7dd","Type":"ContainerStarted","Data":"6ccbbe59ea433e132fec3c9d314471d754076df3525a223a211b19e49dba1d91"} Apr 17 14:21:00.458221 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:00.458184 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-119.ec2.internal" podStartSLOduration=14.458174061 podStartE2EDuration="14.458174061s" podCreationTimestamp="2026-04-17 14:20:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:21:00.457929332 +0000 UTC m=+15.593936881" watchObservedRunningTime="2026-04-17 14:21:00.458174061 +0000 UTC m=+15.594181609" Apr 17 14:21:00.475233 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:00.475188 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-r8vfn" podStartSLOduration=3.293663664 podStartE2EDuration="15.475176727s" podCreationTimestamp="2026-04-17 14:20:45 +0000 UTC" firstStartedPulling="2026-04-17 14:20:47.590335566 +0000 UTC m=+2.726343108" lastFinishedPulling="2026-04-17 14:20:59.771848629 +0000 UTC m=+14.907856171" observedRunningTime="2026-04-17 14:21:00.474615568 +0000 UTC m=+15.610623116" watchObservedRunningTime="2026-04-17 14:21:00.475176727 +0000 UTC m=+15.611184278" Apr 17 14:21:00.487507 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:00.487463 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-nz4xc" podStartSLOduration=3.36695114 podStartE2EDuration="15.487448771s" podCreationTimestamp="2026-04-17 14:20:45 +0000 UTC" firstStartedPulling="2026-04-17 14:20:47.615408005 +0000 UTC m=+2.751415545" lastFinishedPulling="2026-04-17 14:20:59.735905631 +0000 UTC m=+14.871913176" observedRunningTime="2026-04-17 14:21:00.486799675 +0000 UTC m=+15.622807222" watchObservedRunningTime="2026-04-17 14:21:00.487448771 +0000 UTC m=+15.623456319" Apr 17 14:21:00.531691 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:00.531653 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-s6jws" podStartSLOduration=3.276855998 podStartE2EDuration="15.531639903s" podCreationTimestamp="2026-04-17 14:20:45 +0000 UTC" firstStartedPulling="2026-04-17 14:20:47.576365518 +0000 UTC m=+2.712373050" lastFinishedPulling="2026-04-17 14:20:59.831149416 +0000 UTC m=+14.967156955" observedRunningTime="2026-04-17 14:21:00.531316765 +0000 UTC m=+15.667324314" watchObservedRunningTime="2026-04-17 14:21:00.531639903 +0000 UTC m=+15.667647450" Apr 17 14:21:00.543964 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:00.543915 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-gqkmm" 
podStartSLOduration=3.412205254 podStartE2EDuration="15.543899556s" podCreationTimestamp="2026-04-17 14:20:45 +0000 UTC" firstStartedPulling="2026-04-17 14:20:47.604196105 +0000 UTC m=+2.740203635" lastFinishedPulling="2026-04-17 14:20:59.735890406 +0000 UTC m=+14.871897937" observedRunningTime="2026-04-17 14:21:00.543790151 +0000 UTC m=+15.679797699" watchObservedRunningTime="2026-04-17 14:21:00.543899556 +0000 UTC m=+15.679907105" Apr 17 14:21:00.556298 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:00.556234 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-d8pjn" podStartSLOduration=3.441362894 podStartE2EDuration="15.556219906s" podCreationTimestamp="2026-04-17 14:20:45 +0000 UTC" firstStartedPulling="2026-04-17 14:20:47.621029724 +0000 UTC m=+2.757037255" lastFinishedPulling="2026-04-17 14:20:59.73588673 +0000 UTC m=+14.871894267" observedRunningTime="2026-04-17 14:21:00.556102113 +0000 UTC m=+15.692109660" watchObservedRunningTime="2026-04-17 14:21:00.556219906 +0000 UTC m=+15.692227510" Apr 17 14:21:01.367899 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:01.367870 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xxhjb" Apr 17 14:21:01.368080 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:01.367870 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:21:01.368080 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:01.367988 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xxhjb" podUID="d5fae135-d20e-469d-bb28-4d7236b5f86a" Apr 17 14:21:01.368193 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:01.368092 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9qc7k" podUID="3042fc33-2fc3-4d3d-a248-3855f7eb3a6a" Apr 17 14:21:01.455753 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:01.455672 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8mjfq" event={"ID":"419f4063-d963-4db2-b104-57f7214aaee2","Type":"ContainerStarted","Data":"1b9a76f5bffc51a47e1f6cf339bb11b904ea3f1c63f1cf12c4a537406db123f9"} Apr 17 14:21:01.469446 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:01.469397 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-8mjfq" podStartSLOduration=5.419425242 podStartE2EDuration="16.469382951s" podCreationTimestamp="2026-04-17 14:20:45 +0000 UTC" firstStartedPulling="2026-04-17 14:20:48.692708433 +0000 UTC m=+3.828715970" lastFinishedPulling="2026-04-17 14:20:59.742666152 +0000 UTC m=+14.878673679" observedRunningTime="2026-04-17 14:21:01.468793881 +0000 UTC m=+16.604801430" watchObservedRunningTime="2026-04-17 14:21:01.469382951 +0000 UTC m=+16.605390500" Apr 17 14:21:03.100416 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:03.100380 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs\") pod \"network-metrics-daemon-9qc7k\" (UID: \"3042fc33-2fc3-4d3d-a248-3855f7eb3a6a\") " pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:21:03.100971 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:03.100534 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:03.100971 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:03.100602 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs podName:3042fc33-2fc3-4d3d-a248-3855f7eb3a6a nodeName:}" failed. No retries permitted until 2026-04-17 14:21:19.100587434 +0000 UTC m=+34.236594965 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs") pod "network-metrics-daemon-9qc7k" (UID: "3042fc33-2fc3-4d3d-a248-3855f7eb3a6a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:03.201167 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:03.201134 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zng67\" (UniqueName: \"kubernetes.io/projected/d5fae135-d20e-469d-bb28-4d7236b5f86a-kube-api-access-zng67\") pod \"network-check-target-xxhjb\" (UID: \"d5fae135-d20e-469d-bb28-4d7236b5f86a\") " pod="openshift-network-diagnostics/network-check-target-xxhjb" Apr 17 14:21:03.201446 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:03.201302 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:21:03.201446 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:03.201322 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:21:03.201446 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:03.201331 2577 projected.go:194] Error preparing data for projected volume kube-api-access-zng67 for pod openshift-network-diagnostics/network-check-target-xxhjb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:03.201446 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:03.201381 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5fae135-d20e-469d-bb28-4d7236b5f86a-kube-api-access-zng67 podName:d5fae135-d20e-469d-bb28-4d7236b5f86a nodeName:}" failed. No retries permitted until 2026-04-17 14:21:19.201367353 +0000 UTC m=+34.337374880 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-zng67" (UniqueName: "kubernetes.io/projected/d5fae135-d20e-469d-bb28-4d7236b5f86a-kube-api-access-zng67") pod "network-check-target-xxhjb" (UID: "d5fae135-d20e-469d-bb28-4d7236b5f86a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:03.368518 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:03.368437 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:21:03.368518 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:03.368465 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xxhjb" Apr 17 14:21:03.368736 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:03.368558 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9qc7k" podUID="3042fc33-2fc3-4d3d-a248-3855f7eb3a6a" Apr 17 14:21:03.368736 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:03.368686 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xxhjb" podUID="d5fae135-d20e-469d-bb28-4d7236b5f86a" Apr 17 14:21:05.103615 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:05.103589 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 14:21:05.301622 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:05.301378 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-nz4xc" Apr 17 14:21:05.302000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:05.301985 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-nz4xc" Apr 17 14:21:05.324042 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:05.323967 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T14:21:05.103609836Z","UUID":"9b92cb2c-382e-4187-bb5e-4048b66b1365","Handler":null,"Name":"","Endpoint":""} Apr 17 14:21:05.325660 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:05.325640 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 14:21:05.325759 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:05.325666 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 14:21:05.369533 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:05.369508 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xxhjb" Apr 17 14:21:05.369650 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:05.369550 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:21:05.369650 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:05.369620 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xxhjb" podUID="d5fae135-d20e-469d-bb28-4d7236b5f86a" Apr 17 14:21:05.369751 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:05.369727 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9qc7k" podUID="3042fc33-2fc3-4d3d-a248-3855f7eb3a6a" Apr 17 14:21:05.464384 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:05.464358 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zns76" event={"ID":"25426a81-851e-4273-9383-f90518fec0e7","Type":"ContainerStarted","Data":"d8346101783fc533d65abd9d00336199eff76aa5ff2411d1ed389995975a1437"} Apr 17 14:21:05.464487 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:05.464390 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zns76" event={"ID":"25426a81-851e-4273-9383-f90518fec0e7","Type":"ContainerStarted","Data":"0fe7f2d8256fca3807eba61013360058457d893b1d8e9bed38879958058cdae2"} Apr 17 14:21:05.464487 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:05.464401 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zns76" event={"ID":"25426a81-851e-4273-9383-f90518fec0e7","Type":"ContainerStarted","Data":"a432b6544fcb566337919b4191762858595d116a03039377013c6bf06717ff75"} Apr 17 14:21:05.464487 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:05.464409 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zns76" event={"ID":"25426a81-851e-4273-9383-f90518fec0e7","Type":"ContainerStarted","Data":"47d2ada148720e7fdca5c685ba53eec80cd219b42fe3954435e1fe081b916a64"} Apr 17 14:21:05.464487 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:05.464418 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zns76" event={"ID":"25426a81-851e-4273-9383-f90518fec0e7","Type":"ContainerStarted","Data":"cee0ab54047a401712b22438ab9534460e65d172a6e608082a92a9d0935700ef"} Apr 17 14:21:05.465908 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:05.465840 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh" event={"ID":"f5eccb1a-2f8e-47ca-8e8e-327f03634196","Type":"ContainerStarted","Data":"9fbbfc5a3ec758685780462638f08b805366e8fd992f1722df87ae36440d9066"} Apr 17 14:21:06.469296 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:06.469195 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh" event={"ID":"f5eccb1a-2f8e-47ca-8e8e-327f03634196","Type":"ContainerStarted","Data":"a7c90a6e03cfc9ad9ca53a150e25538db22705e3de25145ec6bb365537a23319"} Apr 17 14:21:06.471434 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:06.471410 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zns76" event={"ID":"25426a81-851e-4273-9383-f90518fec0e7","Type":"ContainerStarted","Data":"bf83820f21e8799361abffbc94e0e20a4ff37ed08b60a1ade929799420ab27c9"} Apr 17 14:21:06.484229 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:06.484179 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvkwh" podStartSLOduration=2.915756157 podStartE2EDuration="21.484163936s" podCreationTimestamp="2026-04-17 14:20:45 +0000 UTC" firstStartedPulling="2026-04-17 14:20:47.598265185 +0000 UTC m=+2.734272717" lastFinishedPulling="2026-04-17 14:21:06.166672956 +0000 UTC m=+21.302680496" observedRunningTime="2026-04-17 14:21:06.48405394 +0000 UTC m=+21.620061487" watchObservedRunningTime="2026-04-17 14:21:06.484163936 +0000 UTC m=+21.620171485" Apr 17 14:21:07.275478 ip-10-0-132-119 
kubenswrapper[2577]: I0417 14:21:07.275435 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-nz4xc" Apr 17 14:21:07.275769 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:07.275563 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 14:21:07.276077 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:07.276048 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-nz4xc" Apr 17 14:21:07.368137 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:07.368105 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xxhjb" Apr 17 14:21:07.368137 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:07.368136 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:21:07.368361 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:07.368235 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xxhjb" podUID="d5fae135-d20e-469d-bb28-4d7236b5f86a" Apr 17 14:21:07.368436 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:07.368413 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9qc7k" podUID="3042fc33-2fc3-4d3d-a248-3855f7eb3a6a" Apr 17 14:21:08.478805 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:08.478768 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zns76" event={"ID":"25426a81-851e-4273-9383-f90518fec0e7","Type":"ContainerStarted","Data":"5f21dc7c6d7d700affea6af6e3330e83cc5c081c13da551609a92d77f6597893"} Apr 17 14:21:09.368014 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:09.367989 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xxhjb" Apr 17 14:21:09.368168 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:09.367989 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:21:09.368168 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:09.368089 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xxhjb" podUID="d5fae135-d20e-469d-bb28-4d7236b5f86a" Apr 17 14:21:09.368240 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:09.368173 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9qc7k" podUID="3042fc33-2fc3-4d3d-a248-3855f7eb3a6a" Apr 17 14:21:10.484327 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:10.484071 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zns76" event={"ID":"25426a81-851e-4273-9383-f90518fec0e7","Type":"ContainerStarted","Data":"ac918878c06361c658c3dce62a328b208868196089bbceded465765a09db8af7"} Apr 17 14:21:10.484327 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:10.484342 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zns76" Apr 17 14:21:10.485786 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:10.485762 2577 generic.go:358] "Generic (PLEG): container finished" podID="e40076ff-ba56-43e8-88a4-8c25998b6668" containerID="e6deb5d4c77555a59eab822b22e697479a478fb934b916905166e2c6eda9a8e9" exitCode=0 Apr 17 14:21:10.485904 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:10.485803 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25j4z" event={"ID":"e40076ff-ba56-43e8-88a4-8c25998b6668","Type":"ContainerDied","Data":"e6deb5d4c77555a59eab822b22e697479a478fb934b916905166e2c6eda9a8e9"} Apr 17 14:21:10.499301 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:10.499260 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zns76" Apr 17 14:21:10.514389 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:10.513304 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zns76" podStartSLOduration=9.308440379 podStartE2EDuration="25.51326641s" podCreationTimestamp="2026-04-17 14:20:45 +0000 UTC" firstStartedPulling="2026-04-17 14:20:48.719922855 +0000 UTC m=+3.855930387" lastFinishedPulling="2026-04-17 14:21:04.924748892 +0000 UTC m=+20.060756418" observedRunningTime="2026-04-17 14:21:10.512402329 +0000 UTC m=+25.648409877" watchObservedRunningTime="2026-04-17 14:21:10.51326641 +0000 UTC m=+25.649273958" Apr 17 14:21:11.368676 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:11.368649 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:21:11.368831 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:11.368656 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xxhjb" Apr 17 14:21:11.368831 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:11.368783 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9qc7k" podUID="3042fc33-2fc3-4d3d-a248-3855f7eb3a6a" Apr 17 14:21:11.369021 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:11.368856 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xxhjb" podUID="d5fae135-d20e-469d-bb28-4d7236b5f86a" Apr 17 14:21:11.488566 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:11.488535 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zns76" Apr 17 14:21:11.488909 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:11.488577 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zns76" Apr 17 14:21:11.501169 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:11.501145 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zns76" Apr 17 14:21:11.702681 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:11.702610 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9qc7k"] Apr 17 14:21:11.702826 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:11.702726 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:21:11.702905 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:11.702844 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9qc7k" podUID="3042fc33-2fc3-4d3d-a248-3855f7eb3a6a" Apr 17 14:21:11.705367 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:11.705331 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xxhjb"] Apr 17 14:21:11.705463 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:11.705417 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xxhjb" Apr 17 14:21:11.705531 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:11.705507 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xxhjb" podUID="d5fae135-d20e-469d-bb28-4d7236b5f86a" Apr 17 14:21:12.492079 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:12.492046 2577 generic.go:358] "Generic (PLEG): container finished" podID="e40076ff-ba56-43e8-88a4-8c25998b6668" containerID="f2a5065712d178daadd1c52a9650ea310ae1bcbe7927c3222f5e4562cd90c665" exitCode=0 Apr 17 14:21:12.492523 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:12.492132 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25j4z" event={"ID":"e40076ff-ba56-43e8-88a4-8c25998b6668","Type":"ContainerDied","Data":"f2a5065712d178daadd1c52a9650ea310ae1bcbe7927c3222f5e4562cd90c665"} Apr 17 14:21:13.367973 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:13.367940 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xxhjb" Apr 17 14:21:13.368148 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:13.368042 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xxhjb" podUID="d5fae135-d20e-469d-bb28-4d7236b5f86a" Apr 17 14:21:13.368148 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:13.368095 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:21:13.368241 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:13.368179 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9qc7k" podUID="3042fc33-2fc3-4d3d-a248-3855f7eb3a6a" Apr 17 14:21:14.498036 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:14.498002 2577 generic.go:358] "Generic (PLEG): container finished" podID="e40076ff-ba56-43e8-88a4-8c25998b6668" containerID="a156fc8636de37e2a20d03d48458a9c5b4d1ffb2da1b6cae26ae1d89c3be8a96" exitCode=0 Apr 17 14:21:14.498412 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:14.498053 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25j4z" event={"ID":"e40076ff-ba56-43e8-88a4-8c25998b6668","Type":"ContainerDied","Data":"a156fc8636de37e2a20d03d48458a9c5b4d1ffb2da1b6cae26ae1d89c3be8a96"} Apr 17 14:21:15.368848 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:15.368664 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:21:15.369016 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:15.368744 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xxhjb" Apr 17 14:21:15.369016 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:15.368937 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9qc7k" podUID="3042fc33-2fc3-4d3d-a248-3855f7eb3a6a" Apr 17 14:21:15.369016 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:15.368982 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xxhjb" podUID="d5fae135-d20e-469d-bb28-4d7236b5f86a" Apr 17 14:21:17.368034 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.368004 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:21:17.368576 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:17.368114 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9qc7k" podUID="3042fc33-2fc3-4d3d-a248-3855f7eb3a6a" Apr 17 14:21:17.368576 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.368168 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xxhjb" Apr 17 14:21:17.368576 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:17.368249 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xxhjb" podUID="d5fae135-d20e-469d-bb28-4d7236b5f86a" Apr 17 14:21:17.683722 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.683694 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-119.ec2.internal" event="NodeReady" Apr 17 14:21:17.683891 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.683845 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 14:21:17.715528 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.715498 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv"] Apr 17 14:21:17.736435 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.736400 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-74f7cd55bf-bgvtb"] Apr 17 14:21:17.736656 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.736630 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv" Apr 17 14:21:17.739598 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.739516 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-h8hrw\"" Apr 17 14:21:17.739598 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.739516 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 17 14:21:17.739828 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.739811 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 17 14:21:17.750731 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.750693 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-67c4d4c694-tkblr"] Apr 17 14:21:17.766020 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.765998 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b854b576f-dnw4v"] Apr 17 14:21:17.766159 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.766129 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:17.766236 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.766137 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:17.768971 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.768945 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 14:21:17.769496 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.769264 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 14:21:17.769602 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.769556 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 14:21:17.769882 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.769815 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-5dpr9\"" Apr 17 14:21:17.780109 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.780083 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-77f7fb958d-plbgs"] Apr 17 14:21:17.780249 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.780232 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b854b576f-dnw4v" Apr 17 14:21:17.783286 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.783243 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 14:21:17.783385 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.783251 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 14:21:17.783385 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.783256 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 14:21:17.783491 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.783455 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 14:21:17.783593 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.783566 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-7kbdf\"" Apr 17 14:21:17.786899 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.786814 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 14:21:17.798477 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.798456 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl"] Apr 17 14:21:17.798607 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.798586 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77f7fb958d-plbgs" Apr 17 14:21:17.801733 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.801685 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 17 14:21:17.813203 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.813180 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4a326186-1f33-45e3-bf03-b51a1846d9da-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-ch6nv\" (UID: \"4a326186-1f33-45e3-bf03-b51a1846d9da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv" Apr 17 14:21:17.813321 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.813221 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ch6nv\" (UID: \"4a326186-1f33-45e3-bf03-b51a1846d9da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv" Apr 17 14:21:17.821289 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.821257 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv"] Apr 17 14:21:17.821289 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.821290 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b854b576f-dnw4v"] Apr 17 14:21:17.821424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.821300 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dxkfs"] Apr 17 14:21:17.821424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.821401 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" Apr 17 14:21:17.824355 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.824338 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 17 14:21:17.824446 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.824375 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 17 14:21:17.824446 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.824412 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 17 14:21:17.824711 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.824691 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 17 14:21:17.834993 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.834974 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wf85s"] Apr 17 14:21:17.835322 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.835119 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dxkfs" Apr 17 14:21:17.837689 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.837672 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-w9wr2\"" Apr 17 14:21:17.837912 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.837894 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 14:21:17.837912 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.837905 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 14:21:17.839827 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.839810 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 14:21:17.852284 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.852254 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-74f7cd55bf-bgvtb"] Apr 17 14:21:17.852350 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.852299 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-67c4d4c694-tkblr"] Apr 17 14:21:17.852350 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.852346 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dxkfs"] Apr 17 14:21:17.852408 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.852357 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-77f7fb958d-plbgs"] Apr 17 14:21:17.852408 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.852369 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wf85s"] Apr 17 14:21:17.852408 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.852381 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl"] Apr 17 14:21:17.852408 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.852384 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-wf85s" Apr 17 14:21:17.855143 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.855124 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 14:21:17.855221 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.855187 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rkhth\"" Apr 17 14:21:17.855295 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.855219 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 14:21:17.913763 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.913730 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2a3933bd-a176-4139-9240-d7c0a91457a1-hub\") pod \"cluster-proxy-proxy-agent-f845cc5fd-kb5zl\" (UID: \"2a3933bd-a176-4139-9240-d7c0a91457a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" Apr 17 14:21:17.913931 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.913778 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-bound-sa-token\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:17.913931 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.913818 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/57c9c507-2456-4cd3-8dab-9666a43e11af-image-registry-private-configuration\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:17.913931 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.913839 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4a326186-1f33-45e3-bf03-b51a1846d9da-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-ch6nv\" (UID: \"4a326186-1f33-45e3-bf03-b51a1846d9da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv" Apr 17 14:21:17.913931 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.913864 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2a3933bd-a176-4139-9240-d7c0a91457a1-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f845cc5fd-kb5zl\" (UID: \"2a3933bd-a176-4139-9240-d7c0a91457a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" Apr 17 14:21:17.913931 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.913900 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xr65\" (UniqueName: \"kubernetes.io/projected/2a3933bd-a176-4139-9240-d7c0a91457a1-kube-api-access-6xr65\") pod \"cluster-proxy-proxy-agent-f845cc5fd-kb5zl\" (UID: \"2a3933bd-a176-4139-9240-d7c0a91457a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" Apr 17 14:21:17.913931 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.913928 
2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-bound-sa-token\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:17.914206 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.913980 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:17.914206 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.914018 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3619d63b-c5a8-490c-89ac-40affc59fe8b-tmp\") pod \"klusterlet-addon-workmgr-77f7fb958d-plbgs\" (UID: \"3619d63b-c5a8-490c-89ac-40affc59fe8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77f7fb958d-plbgs" Apr 17 14:21:17.914206 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.914053 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/2a3933bd-a176-4139-9240-d7c0a91457a1-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f845cc5fd-kb5zl\" (UID: \"2a3933bd-a176-4139-9240-d7c0a91457a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" Apr 17 14:21:17.914206 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.914082 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xtss\" (UniqueName: \"kubernetes.io/projected/1eeecdb3-0f9a-46a9-ad05-83febe93c64d-kube-api-access-9xtss\") pod \"managed-serviceaccount-addon-agent-b854b576f-dnw4v\" (UID: \"1eeecdb3-0f9a-46a9-ad05-83febe93c64d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b854b576f-dnw4v" Apr 17 14:21:17.914206 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.914105 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp8dl\" (UniqueName: \"kubernetes.io/projected/3619d63b-c5a8-490c-89ac-40affc59fe8b-kube-api-access-vp8dl\") pod \"klusterlet-addon-workmgr-77f7fb958d-plbgs\" (UID: \"3619d63b-c5a8-490c-89ac-40affc59fe8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77f7fb958d-plbgs" Apr 17 14:21:17.914206 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.914150 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-certificates\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:17.914206 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.914196 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-trusted-ca\") pod 
\"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:17.914568 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.914227 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-certificates\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:17.914568 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.914258 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:17.914568 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.914320 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-installation-pull-secrets\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:17.914568 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.914356 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ch6nv\" (UID: \"4a326186-1f33-45e3-bf03-b51a1846d9da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv" Apr 17 14:21:17.914568 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.914383 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3619d63b-c5a8-490c-89ac-40affc59fe8b-klusterlet-config\") pod \"klusterlet-addon-workmgr-77f7fb958d-plbgs\" (UID: \"3619d63b-c5a8-490c-89ac-40affc59fe8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77f7fb958d-plbgs" Apr 17 14:21:17.914568 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.914410 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cf8j\" (UniqueName: \"kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-kube-api-access-6cf8j\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:17.914568 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.914438 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert\") pod \"ingress-canary-dxkfs\" (UID: \"da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e\") " pod="openshift-ingress-canary/ingress-canary-dxkfs" Apr 17 14:21:17.914568 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:17.914467 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret 
"networking-console-plugin-cert" not found Apr 17 14:21:17.914568 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.914471 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vv4p\" (UniqueName: \"kubernetes.io/projected/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-kube-api-access-2vv4p\") pod \"ingress-canary-dxkfs\" (UID: \"da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e\") " pod="openshift-ingress-canary/ingress-canary-dxkfs" Apr 17 14:21:17.914568 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.914522 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57c9c507-2456-4cd3-8dab-9666a43e11af-trusted-ca\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:17.914568 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:17.914557 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert podName:4a326186-1f33-45e3-bf03-b51a1846d9da nodeName:}" failed. No retries permitted until 2026-04-17 14:21:18.414536803 +0000 UTC m=+33.550544338 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ch6nv" (UID: "4a326186-1f33-45e3-bf03-b51a1846d9da") : secret "networking-console-plugin-cert" not found Apr 17 14:21:17.915081 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.914596 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4a326186-1f33-45e3-bf03-b51a1846d9da-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-ch6nv\" (UID: \"4a326186-1f33-45e3-bf03-b51a1846d9da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv" Apr 17 14:21:17.915081 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.914642 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-image-registry-private-configuration\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:17.915081 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.914671 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57c9c507-2456-4cd3-8dab-9666a43e11af-installation-pull-secrets\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:17.915081 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.914695 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57c9c507-2456-4cd3-8dab-9666a43e11af-ca-trust-extracted\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:17.915081 ip-10-0-132-119 
kubenswrapper[2577]: I0417 14:21:17.914728 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rks5l\" (UniqueName: \"kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-kube-api-access-rks5l\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:17.915081 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.914783 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1eeecdb3-0f9a-46a9-ad05-83febe93c64d-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-b854b576f-dnw4v\" (UID: \"1eeecdb3-0f9a-46a9-ad05-83febe93c64d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b854b576f-dnw4v" Apr 17 14:21:17.915081 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.914810 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2a3933bd-a176-4139-9240-d7c0a91457a1-ca\") pod \"cluster-proxy-proxy-agent-f845cc5fd-kb5zl\" (UID: \"2a3933bd-a176-4139-9240-d7c0a91457a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" Apr 17 14:21:17.915081 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.914836 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-ca-trust-extracted\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:17.915081 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:17.914900 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2a3933bd-a176-4139-9240-d7c0a91457a1-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f845cc5fd-kb5zl\" (UID: \"2a3933bd-a176-4139-9240-d7c0a91457a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" Apr 17 14:21:18.016316 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016217 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-certificates\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:18.016316 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016285 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g44jf\" (UniqueName: \"kubernetes.io/projected/059af305-b8f4-4631-aba4-e42fe75f0259-kube-api-access-g44jf\") pod \"dns-default-wf85s\" (UID: \"059af305-b8f4-4631-aba4-e42fe75f0259\") " pod="openshift-dns/dns-default-wf85s" Apr 17 14:21:18.016316 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016314 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-trusted-ca\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " 
pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:18.016583 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016342 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-certificates\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:18.016583 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016392 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:18.016583 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016428 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-installation-pull-secrets\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:18.016583 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016471 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3619d63b-c5a8-490c-89ac-40affc59fe8b-klusterlet-config\") pod \"klusterlet-addon-workmgr-77f7fb958d-plbgs\" (UID: \"3619d63b-c5a8-490c-89ac-40affc59fe8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77f7fb958d-plbgs" Apr 17 14:21:18.016583 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016499 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cf8j\" (UniqueName: \"kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-kube-api-access-6cf8j\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:18.016583 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016525 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert\") pod \"ingress-canary-dxkfs\" (UID: \"da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e\") " pod="openshift-ingress-canary/ingress-canary-dxkfs" Apr 17 14:21:18.016583 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016550 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vv4p\" (UniqueName: \"kubernetes.io/projected/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-kube-api-access-2vv4p\") pod \"ingress-canary-dxkfs\" (UID: \"da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e\") " pod="openshift-ingress-canary/ingress-canary-dxkfs" Apr 17 14:21:18.016583 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:18.016564 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:21:18.016583 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016574 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/57c9c507-2456-4cd3-8dab-9666a43e11af-trusted-ca\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:18.016583 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:18.016585 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67c4d4c694-tkblr: secret "image-registry-tls" not found Apr 17 14:21:18.017072 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016605 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-image-registry-private-configuration\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:18.017072 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:18.016640 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls podName:7e1725cb-b462-4434-a5db-26e9c8fe0a6d nodeName:}" failed. No retries permitted until 2026-04-17 14:21:18.516621261 +0000 UTC m=+33.652628789 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls") pod "image-registry-67c4d4c694-tkblr" (UID: "7e1725cb-b462-4434-a5db-26e9c8fe0a6d") : secret "image-registry-tls" not found Apr 17 14:21:18.017072 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016663 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57c9c507-2456-4cd3-8dab-9666a43e11af-installation-pull-secrets\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:18.017072 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016694 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57c9c507-2456-4cd3-8dab-9666a43e11af-ca-trust-extracted\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:18.017072 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016723 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rks5l\" (UniqueName: \"kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-kube-api-access-rks5l\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:18.017072 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016754 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1eeecdb3-0f9a-46a9-ad05-83febe93c64d-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-b854b576f-dnw4v\" (UID: \"1eeecdb3-0f9a-46a9-ad05-83febe93c64d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b854b576f-dnw4v" Apr 17 14:21:18.017072 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016781 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2a3933bd-a176-4139-9240-d7c0a91457a1-ca\") pod \"cluster-proxy-proxy-agent-f845cc5fd-kb5zl\" (UID: \"2a3933bd-a176-4139-9240-d7c0a91457a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" Apr 17 14:21:18.017072 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016807 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-ca-trust-extracted\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:18.017072 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016835 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2a3933bd-a176-4139-9240-d7c0a91457a1-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f845cc5fd-kb5zl\" (UID: \"2a3933bd-a176-4139-9240-d7c0a91457a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" Apr 17 14:21:18.017072 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016846 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-certificates\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:18.017072 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016865 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls\") pod \"dns-default-wf85s\" (UID: \"059af305-b8f4-4631-aba4-e42fe75f0259\") " pod="openshift-dns/dns-default-wf85s" Apr 17 14:21:18.017072 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016891 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2a3933bd-a176-4139-9240-d7c0a91457a1-hub\") pod \"cluster-proxy-proxy-agent-f845cc5fd-kb5zl\" (UID: \"2a3933bd-a176-4139-9240-d7c0a91457a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" Apr 17 14:21:18.017072 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016917 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-bound-sa-token\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:18.017072 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016926 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-certificates\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:18.017072 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016956 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" 
(UniqueName: \"kubernetes.io/secret/57c9c507-2456-4cd3-8dab-9666a43e11af-image-registry-private-configuration\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:18.017795 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:18.016974 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:21:18.017795 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.016983 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2a3933bd-a176-4139-9240-d7c0a91457a1-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f845cc5fd-kb5zl\" (UID: \"2a3933bd-a176-4139-9240-d7c0a91457a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" Apr 17 14:21:18.017795 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:18.017031 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert podName:da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e nodeName:}" failed. No retries permitted until 2026-04-17 14:21:18.517014683 +0000 UTC m=+33.653022225 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert") pod "ingress-canary-dxkfs" (UID: "da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e") : secret "canary-serving-cert" not found Apr 17 14:21:18.017795 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.017068 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6xr65\" (UniqueName: \"kubernetes.io/projected/2a3933bd-a176-4139-9240-d7c0a91457a1-kube-api-access-6xr65\") pod \"cluster-proxy-proxy-agent-f845cc5fd-kb5zl\" (UID: \"2a3933bd-a176-4139-9240-d7c0a91457a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" Apr 17 14:21:18.017795 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.017097 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-bound-sa-token\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:18.017795 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.017136 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:18.017795 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.017165 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3619d63b-c5a8-490c-89ac-40affc59fe8b-tmp\") pod \"klusterlet-addon-workmgr-77f7fb958d-plbgs\" (UID: \"3619d63b-c5a8-490c-89ac-40affc59fe8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77f7fb958d-plbgs" Apr 17 14:21:18.017795 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.017207 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/059af305-b8f4-4631-aba4-e42fe75f0259-config-volume\") pod \"dns-default-wf85s\" (UID: \"059af305-b8f4-4631-aba4-e42fe75f0259\") " pod="openshift-dns/dns-default-wf85s" Apr 17 14:21:18.017795 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.017237 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/059af305-b8f4-4631-aba4-e42fe75f0259-tmp-dir\") pod \"dns-default-wf85s\" (UID: \"059af305-b8f4-4631-aba4-e42fe75f0259\") " pod="openshift-dns/dns-default-wf85s" Apr 17 14:21:18.017795 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.017293 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/2a3933bd-a176-4139-9240-d7c0a91457a1-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f845cc5fd-kb5zl\" (UID: \"2a3933bd-a176-4139-9240-d7c0a91457a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" Apr 17 14:21:18.017795 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.017322 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xtss\" (UniqueName: \"kubernetes.io/projected/1eeecdb3-0f9a-46a9-ad05-83febe93c64d-kube-api-access-9xtss\") pod \"managed-serviceaccount-addon-agent-b854b576f-dnw4v\" (UID: \"1eeecdb3-0f9a-46a9-ad05-83febe93c64d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b854b576f-dnw4v" Apr 17 14:21:18.017795 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.017346 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vp8dl\" (UniqueName: \"kubernetes.io/projected/3619d63b-c5a8-490c-89ac-40affc59fe8b-kube-api-access-vp8dl\") pod \"klusterlet-addon-workmgr-77f7fb958d-plbgs\" (UID: \"3619d63b-c5a8-490c-89ac-40affc59fe8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77f7fb958d-plbgs" Apr 17 14:21:18.017795 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.017548 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-trusted-ca\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:18.018412 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.017801 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57c9c507-2456-4cd3-8dab-9666a43e11af-trusted-ca\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:18.018412 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.017801 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57c9c507-2456-4cd3-8dab-9666a43e11af-ca-trust-extracted\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:18.018412 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.017930 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3619d63b-c5a8-490c-89ac-40affc59fe8b-tmp\") pod 
\"klusterlet-addon-workmgr-77f7fb958d-plbgs\" (UID: \"3619d63b-c5a8-490c-89ac-40affc59fe8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77f7fb958d-plbgs" Apr 17 14:21:18.018548 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.018487 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2a3933bd-a176-4139-9240-d7c0a91457a1-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f845cc5fd-kb5zl\" (UID: \"2a3933bd-a176-4139-9240-d7c0a91457a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" Apr 17 14:21:18.018829 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:18.018643 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:21:18.018829 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:18.018663 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74f7cd55bf-bgvtb: secret "image-registry-tls" not found Apr 17 14:21:18.018829 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.018765 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-ca-trust-extracted\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:18.019237 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:18.019088 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls podName:57c9c507-2456-4cd3-8dab-9666a43e11af nodeName:}" failed. No retries permitted until 2026-04-17 14:21:18.519066285 +0000 UTC m=+33.655073825 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls") pod "image-registry-74f7cd55bf-bgvtb" (UID: "57c9c507-2456-4cd3-8dab-9666a43e11af") : secret "image-registry-tls" not found Apr 17 14:21:18.022235 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.022210 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2a3933bd-a176-4139-9240-d7c0a91457a1-ca\") pod \"cluster-proxy-proxy-agent-f845cc5fd-kb5zl\" (UID: \"2a3933bd-a176-4139-9240-d7c0a91457a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" Apr 17 14:21:18.022235 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.022226 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/2a3933bd-a176-4139-9240-d7c0a91457a1-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f845cc5fd-kb5zl\" (UID: \"2a3933bd-a176-4139-9240-d7c0a91457a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" Apr 17 14:21:18.022753 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.022735 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3619d63b-c5a8-490c-89ac-40affc59fe8b-klusterlet-config\") pod \"klusterlet-addon-workmgr-77f7fb958d-plbgs\" (UID: \"3619d63b-c5a8-490c-89ac-40affc59fe8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77f7fb958d-plbgs" Apr 17 14:21:18.023136 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.023114 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-installation-pull-secrets\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:18.023243 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.023223 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/57c9c507-2456-4cd3-8dab-9666a43e11af-image-registry-private-configuration\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:18.023443 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.023417 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2a3933bd-a176-4139-9240-d7c0a91457a1-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f845cc5fd-kb5zl\" (UID: \"2a3933bd-a176-4139-9240-d7c0a91457a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" Apr 17 14:21:18.023551 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.023486 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57c9c507-2456-4cd3-8dab-9666a43e11af-installation-pull-secrets\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:18.024249 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.024228 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2a3933bd-a176-4139-9240-d7c0a91457a1-hub\") pod \"cluster-proxy-proxy-agent-f845cc5fd-kb5zl\" (UID: \"2a3933bd-a176-4139-9240-d7c0a91457a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" Apr 17 14:21:18.027724 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.027682 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-image-registry-private-configuration\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:18.028514 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.028475 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rks5l\" (UniqueName: \"kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-kube-api-access-rks5l\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:18.029124 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.028979 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vv4p\" (UniqueName: \"kubernetes.io/projected/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-kube-api-access-2vv4p\") pod \"ingress-canary-dxkfs\" (UID: \"da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e\") " pod="openshift-ingress-canary/ingress-canary-dxkfs" Apr 17 14:21:18.029418 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.029399 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-bound-sa-token\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:18.029534 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.029513 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp8dl\" (UniqueName: \"kubernetes.io/projected/3619d63b-c5a8-490c-89ac-40affc59fe8b-kube-api-access-vp8dl\") pod \"klusterlet-addon-workmgr-77f7fb958d-plbgs\" (UID: \"3619d63b-c5a8-490c-89ac-40affc59fe8b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77f7fb958d-plbgs" Apr 17 14:21:18.029666 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.029638 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xr65\" (UniqueName: \"kubernetes.io/projected/2a3933bd-a176-4139-9240-d7c0a91457a1-kube-api-access-6xr65\") pod \"cluster-proxy-proxy-agent-f845cc5fd-kb5zl\" (UID: \"2a3933bd-a176-4139-9240-d7c0a91457a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" Apr 17 14:21:18.031895 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.031695 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1eeecdb3-0f9a-46a9-ad05-83febe93c64d-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-b854b576f-dnw4v\" (UID: \"1eeecdb3-0f9a-46a9-ad05-83febe93c64d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b854b576f-dnw4v" Apr 17 14:21:18.031895 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.031712 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9xtss\" (UniqueName: \"kubernetes.io/projected/1eeecdb3-0f9a-46a9-ad05-83febe93c64d-kube-api-access-9xtss\") pod \"managed-serviceaccount-addon-agent-b854b576f-dnw4v\" (UID: \"1eeecdb3-0f9a-46a9-ad05-83febe93c64d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b854b576f-dnw4v" Apr 17 14:21:18.033874 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.033828 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-bound-sa-token\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:18.034123 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.034105 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cf8j\" (UniqueName: \"kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-kube-api-access-6cf8j\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:18.106444 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.106412 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b854b576f-dnw4v" Apr 17 14:21:18.114138 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.114110 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77f7fb958d-plbgs" Apr 17 14:21:18.118474 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.118450 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls\") pod \"dns-default-wf85s\" (UID: \"059af305-b8f4-4631-aba4-e42fe75f0259\") " pod="openshift-dns/dns-default-wf85s" Apr 17 14:21:18.118575 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.118550 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/059af305-b8f4-4631-aba4-e42fe75f0259-config-volume\") pod \"dns-default-wf85s\" (UID: \"059af305-b8f4-4631-aba4-e42fe75f0259\") " pod="openshift-dns/dns-default-wf85s" Apr 17 14:21:18.118635 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.118578 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/059af305-b8f4-4631-aba4-e42fe75f0259-tmp-dir\") pod \"dns-default-wf85s\" (UID: \"059af305-b8f4-4631-aba4-e42fe75f0259\") " pod="openshift-dns/dns-default-wf85s" Apr 17 14:21:18.118635 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.118616 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g44jf\" (UniqueName: \"kubernetes.io/projected/059af305-b8f4-4631-aba4-e42fe75f0259-kube-api-access-g44jf\") pod \"dns-default-wf85s\" (UID: \"059af305-b8f4-4631-aba4-e42fe75f0259\") " pod="openshift-dns/dns-default-wf85s" Apr 17 14:21:18.118734 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:18.118617 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:21:18.118787 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:18.118756 
2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls podName:059af305-b8f4-4631-aba4-e42fe75f0259 nodeName:}" failed. No retries permitted until 2026-04-17 14:21:18.618736528 +0000 UTC m=+33.754744055 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls") pod "dns-default-wf85s" (UID: "059af305-b8f4-4631-aba4-e42fe75f0259") : secret "dns-default-metrics-tls" not found Apr 17 14:21:18.119151 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.119128 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/059af305-b8f4-4631-aba4-e42fe75f0259-tmp-dir\") pod \"dns-default-wf85s\" (UID: \"059af305-b8f4-4631-aba4-e42fe75f0259\") " pod="openshift-dns/dns-default-wf85s" Apr 17 14:21:18.119543 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.119495 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/059af305-b8f4-4631-aba4-e42fe75f0259-config-volume\") pod \"dns-default-wf85s\" (UID: \"059af305-b8f4-4631-aba4-e42fe75f0259\") " pod="openshift-dns/dns-default-wf85s" Apr 17 14:21:18.128173 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.128141 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g44jf\" (UniqueName: \"kubernetes.io/projected/059af305-b8f4-4631-aba4-e42fe75f0259-kube-api-access-g44jf\") pod \"dns-default-wf85s\" (UID: \"059af305-b8f4-4631-aba4-e42fe75f0259\") " pod="openshift-dns/dns-default-wf85s" Apr 17 14:21:18.130072 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.130057 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" Apr 17 14:21:18.277126 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.277074 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-77f7fb958d-plbgs"] Apr 17 14:21:18.278942 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:21:18.278916 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3619d63b_c5a8_490c_89ac_40affc59fe8b.slice/crio-633c504e248fa0bce9268b3e2e055f0d9e1e8d4a3d8897fbb36e6e1851943768 WatchSource:0}: Error finding container 633c504e248fa0bce9268b3e2e055f0d9e1e8d4a3d8897fbb36e6e1851943768: Status 404 returned error can't find the container with id 633c504e248fa0bce9268b3e2e055f0d9e1e8d4a3d8897fbb36e6e1851943768 Apr 17 14:21:18.284220 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.284194 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b854b576f-dnw4v"] Apr 17 14:21:18.289433 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:21:18.288238 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1eeecdb3_0f9a_46a9_ad05_83febe93c64d.slice/crio-40fb14d48ccf0659346848f3f4f10b7f3c6d0d235a6ff77a25c5430f2dc9f25c WatchSource:0}: Error finding container 40fb14d48ccf0659346848f3f4f10b7f3c6d0d235a6ff77a25c5430f2dc9f25c: Status 404 returned error can't find the container with id 40fb14d48ccf0659346848f3f4f10b7f3c6d0d235a6ff77a25c5430f2dc9f25c Apr 17 14:21:18.304508 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.304487 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl"] Apr 17 14:21:18.306834 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:21:18.306813 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a3933bd_a176_4139_9240_d7c0a91457a1.slice/crio-5ac0b51af253c4b6b9c3c9e77eb75840d72c09420260c6d8d999e7669fcbd68a WatchSource:0}: Error finding container 5ac0b51af253c4b6b9c3c9e77eb75840d72c09420260c6d8d999e7669fcbd68a: Status 404 returned error can't find the container with id 5ac0b51af253c4b6b9c3c9e77eb75840d72c09420260c6d8d999e7669fcbd68a Apr 17 14:21:18.422534 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.422495 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ch6nv\" (UID: \"4a326186-1f33-45e3-bf03-b51a1846d9da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv" Apr 17 14:21:18.422922 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:18.422641 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 14:21:18.422922 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:18.422705 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert podName:4a326186-1f33-45e3-bf03-b51a1846d9da nodeName:}" failed. 
No retries permitted until 2026-04-17 14:21:19.422689069 +0000 UTC m=+34.558696595 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ch6nv" (UID: "4a326186-1f33-45e3-bf03-b51a1846d9da") : secret "networking-console-plugin-cert" not found Apr 17 14:21:18.506524 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.506480 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" event={"ID":"2a3933bd-a176-4139-9240-d7c0a91457a1","Type":"ContainerStarted","Data":"5ac0b51af253c4b6b9c3c9e77eb75840d72c09420260c6d8d999e7669fcbd68a"} Apr 17 14:21:18.507580 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.507549 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77f7fb958d-plbgs" event={"ID":"3619d63b-c5a8-490c-89ac-40affc59fe8b","Type":"ContainerStarted","Data":"633c504e248fa0bce9268b3e2e055f0d9e1e8d4a3d8897fbb36e6e1851943768"} Apr 17 14:21:18.508613 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.508591 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b854b576f-dnw4v" event={"ID":"1eeecdb3-0f9a-46a9-ad05-83febe93c64d","Type":"ContainerStarted","Data":"40fb14d48ccf0659346848f3f4f10b7f3c6d0d235a6ff77a25c5430f2dc9f25c"} Apr 17 14:21:18.523103 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.523077 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:18.523251 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.523131 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:18.523251 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.523167 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert\") pod \"ingress-canary-dxkfs\" (UID: \"da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e\") " pod="openshift-ingress-canary/ingress-canary-dxkfs" Apr 17 14:21:18.523251 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:18.523242 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:21:18.523448 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:18.523257 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74f7cd55bf-bgvtb: secret "image-registry-tls" not found Apr 17 14:21:18.523448 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:18.523293 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:21:18.523448 ip-10-0-132-119 kubenswrapper[2577]: E0417 
14:21:18.523296 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:21:18.523448 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:18.523328 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67c4d4c694-tkblr: secret "image-registry-tls" not found Apr 17 14:21:18.523448 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:18.523343 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls podName:57c9c507-2456-4cd3-8dab-9666a43e11af nodeName:}" failed. No retries permitted until 2026-04-17 14:21:19.523309881 +0000 UTC m=+34.659317424 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls") pod "image-registry-74f7cd55bf-bgvtb" (UID: "57c9c507-2456-4cd3-8dab-9666a43e11af") : secret "image-registry-tls" not found Apr 17 14:21:18.523448 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:18.523392 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls podName:7e1725cb-b462-4434-a5db-26e9c8fe0a6d nodeName:}" failed. No retries permitted until 2026-04-17 14:21:19.523379107 +0000 UTC m=+34.659386646 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls") pod "image-registry-67c4d4c694-tkblr" (UID: "7e1725cb-b462-4434-a5db-26e9c8fe0a6d") : secret "image-registry-tls" not found Apr 17 14:21:18.523448 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:18.523414 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert podName:da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e nodeName:}" failed. No retries permitted until 2026-04-17 14:21:19.523405403 +0000 UTC m=+34.659412942 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert") pod "ingress-canary-dxkfs" (UID: "da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e") : secret "canary-serving-cert" not found Apr 17 14:21:18.624629 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:18.624598 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls\") pod \"dns-default-wf85s\" (UID: \"059af305-b8f4-4631-aba4-e42fe75f0259\") " pod="openshift-dns/dns-default-wf85s" Apr 17 14:21:18.624784 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:18.624738 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:21:18.624822 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:18.624800 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls podName:059af305-b8f4-4631-aba4-e42fe75f0259 nodeName:}" failed. No retries permitted until 2026-04-17 14:21:19.624784444 +0000 UTC m=+34.760791985 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls") pod "dns-default-wf85s" (UID: "059af305-b8f4-4631-aba4-e42fe75f0259") : secret "dns-default-metrics-tls" not found Apr 17 14:21:19.129583 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:19.129551 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs\") pod \"network-metrics-daemon-9qc7k\" (UID: \"3042fc33-2fc3-4d3d-a248-3855f7eb3a6a\") " pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:21:19.129753 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:19.129685 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:19.129753 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:19.129738 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs podName:3042fc33-2fc3-4d3d-a248-3855f7eb3a6a nodeName:}" failed. No retries permitted until 2026-04-17 14:21:51.129725068 +0000 UTC m=+66.265732594 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs") pod "network-metrics-daemon-9qc7k" (UID: "3042fc33-2fc3-4d3d-a248-3855f7eb3a6a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:21:19.230084 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:19.230046 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zng67\" (UniqueName: \"kubernetes.io/projected/d5fae135-d20e-469d-bb28-4d7236b5f86a-kube-api-access-zng67\") pod \"network-check-target-xxhjb\" (UID: \"d5fae135-d20e-469d-bb28-4d7236b5f86a\") " pod="openshift-network-diagnostics/network-check-target-xxhjb" Apr 17 14:21:19.230300 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:19.230254 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:21:19.230395 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:19.230303 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:21:19.230395 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:19.230318 2577 projected.go:194] Error preparing data for projected volume kube-api-access-zng67 for pod openshift-network-diagnostics/network-check-target-xxhjb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:19.230491 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:19.230432 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5fae135-d20e-469d-bb28-4d7236b5f86a-kube-api-access-zng67 podName:d5fae135-d20e-469d-bb28-4d7236b5f86a nodeName:}" failed. No retries permitted until 2026-04-17 14:21:51.230412095 +0000 UTC m=+66.366419637 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-zng67" (UniqueName: "kubernetes.io/projected/d5fae135-d20e-469d-bb28-4d7236b5f86a-kube-api-access-zng67") pod "network-check-target-xxhjb" (UID: "d5fae135-d20e-469d-bb28-4d7236b5f86a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:21:19.372841 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:19.372808 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xxhjb" Apr 17 14:21:19.373013 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:19.372926 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:21:19.378331 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:19.378306 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 14:21:19.378555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:19.378538 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 14:21:19.378787 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:19.378740 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qbxgh\"" Apr 17 14:21:19.379869 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:19.378947 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 14:21:19.379869 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:19.379134 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hxvwz\"" Apr 17 14:21:19.432803 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:19.432766 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ch6nv\" (UID: \"4a326186-1f33-45e3-bf03-b51a1846d9da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv" Apr 17 14:21:19.433230 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:19.432914 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 14:21:19.433230 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:19.432987 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert podName:4a326186-1f33-45e3-bf03-b51a1846d9da nodeName:}" failed. No retries permitted until 2026-04-17 14:21:21.432965978 +0000 UTC m=+36.568973517 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ch6nv" (UID: "4a326186-1f33-45e3-bf03-b51a1846d9da") : secret "networking-console-plugin-cert" not found Apr 17 14:21:19.535480 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:19.534181 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:19.535480 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:19.534268 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:19.535480 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:19.534327 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert\") pod \"ingress-canary-dxkfs\" (UID: \"da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e\") " pod="openshift-ingress-canary/ingress-canary-dxkfs" Apr 17 14:21:19.535480 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:19.534512 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:21:19.535480 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:19.534570 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert podName:da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e nodeName:}" failed. No retries permitted until 2026-04-17 14:21:21.53455081 +0000 UTC m=+36.670558360 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert") pod "ingress-canary-dxkfs" (UID: "da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e") : secret "canary-serving-cert" not found Apr 17 14:21:19.535480 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:19.534954 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:21:19.535480 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:19.534969 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74f7cd55bf-bgvtb: secret "image-registry-tls" not found Apr 17 14:21:19.535480 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:19.535012 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls podName:57c9c507-2456-4cd3-8dab-9666a43e11af nodeName:}" failed. No retries permitted until 2026-04-17 14:21:21.534997006 +0000 UTC m=+36.671004536 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls") pod "image-registry-74f7cd55bf-bgvtb" (UID: "57c9c507-2456-4cd3-8dab-9666a43e11af") : secret "image-registry-tls" not found Apr 17 14:21:19.535480 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:19.535388 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:21:19.535480 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:19.535400 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67c4d4c694-tkblr: secret "image-registry-tls" not found Apr 17 14:21:19.535480 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:19.535439 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls podName:7e1725cb-b462-4434-a5db-26e9c8fe0a6d nodeName:}" failed. No retries permitted until 2026-04-17 14:21:21.535426591 +0000 UTC m=+36.671434120 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls") pod "image-registry-67c4d4c694-tkblr" (UID: "7e1725cb-b462-4434-a5db-26e9c8fe0a6d") : secret "image-registry-tls" not found Apr 17 14:21:19.636010 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:19.635854 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls\") pod \"dns-default-wf85s\" (UID: \"059af305-b8f4-4631-aba4-e42fe75f0259\") " pod="openshift-dns/dns-default-wf85s" Apr 17 14:21:19.636166 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:19.636015 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:21:19.636166 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:19.636090 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls podName:059af305-b8f4-4631-aba4-e42fe75f0259 nodeName:}" failed. No retries permitted until 2026-04-17 14:21:21.636070949 +0000 UTC m=+36.772078475 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls") pod "dns-default-wf85s" (UID: "059af305-b8f4-4631-aba4-e42fe75f0259") : secret "dns-default-metrics-tls" not found Apr 17 14:21:21.451876 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:21.451832 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ch6nv\" (UID: \"4a326186-1f33-45e3-bf03-b51a1846d9da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv" Apr 17 14:21:21.452333 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:21.452035 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 14:21:21.452333 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:21.452138 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert podName:4a326186-1f33-45e3-bf03-b51a1846d9da nodeName:}" failed. No retries permitted until 2026-04-17 14:21:25.452114901 +0000 UTC m=+40.588122427 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ch6nv" (UID: "4a326186-1f33-45e3-bf03-b51a1846d9da") : secret "networking-console-plugin-cert" not found Apr 17 14:21:21.553362 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:21.553293 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:21:21.553362 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:21.553349 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:21:21.553554 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:21.553376 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert\") pod \"ingress-canary-dxkfs\" (UID: \"da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e\") " pod="openshift-ingress-canary/ingress-canary-dxkfs" Apr 17 14:21:21.553554 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:21.553442 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:21:21.553554 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:21.553467 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74f7cd55bf-bgvtb: secret "image-registry-tls" not found Apr 17 14:21:21.553554 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:21.553477 2577 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:21:21.553554 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:21.553514 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:21:21.553554 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:21.553533 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert podName:da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e nodeName:}" failed. No retries permitted until 2026-04-17 14:21:25.553519721 +0000 UTC m=+40.689527247 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert") pod "ingress-canary-dxkfs" (UID: "da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e") : secret "canary-serving-cert" not found Apr 17 14:21:21.553554 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:21.553534 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67c4d4c694-tkblr: secret "image-registry-tls" not found Apr 17 14:21:21.553554 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:21.553549 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls podName:57c9c507-2456-4cd3-8dab-9666a43e11af nodeName:}" failed. No retries permitted until 2026-04-17 14:21:25.553541209 +0000 UTC m=+40.689548737 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls") pod "image-registry-74f7cd55bf-bgvtb" (UID: "57c9c507-2456-4cd3-8dab-9666a43e11af") : secret "image-registry-tls" not found Apr 17 14:21:21.553830 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:21.553600 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls podName:7e1725cb-b462-4434-a5db-26e9c8fe0a6d nodeName:}" failed. No retries permitted until 2026-04-17 14:21:25.553581002 +0000 UTC m=+40.689588544 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls") pod "image-registry-67c4d4c694-tkblr" (UID: "7e1725cb-b462-4434-a5db-26e9c8fe0a6d") : secret "image-registry-tls" not found Apr 17 14:21:21.654068 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:21.654031 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls\") pod \"dns-default-wf85s\" (UID: \"059af305-b8f4-4631-aba4-e42fe75f0259\") " pod="openshift-dns/dns-default-wf85s" Apr 17 14:21:21.654243 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:21.654205 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:21:21.654326 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:21.654299 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls podName:059af305-b8f4-4631-aba4-e42fe75f0259 nodeName:}" failed. No retries permitted until 2026-04-17 14:21:25.65426368 +0000 UTC m=+40.790271211 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls") pod "dns-default-wf85s" (UID: "059af305-b8f4-4631-aba4-e42fe75f0259") : secret "dns-default-metrics-tls" not found Apr 17 14:21:22.791701 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:22.791590 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-zjwv7"] Apr 17 14:21:22.810295 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:22.810245 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zjwv7" Apr 17 14:21:22.810989 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:22.810940 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-zjwv7"] Apr 17 14:21:22.813538 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:22.813518 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 14:21:22.965305 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:22.965259 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c7c4ea84-7290-4b62-b947-cbedb375e0f9-kubelet-config\") pod \"global-pull-secret-syncer-zjwv7\" (UID: \"c7c4ea84-7290-4b62-b947-cbedb375e0f9\") " pod="kube-system/global-pull-secret-syncer-zjwv7" Apr 17 14:21:22.965461 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:22.965325 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c7c4ea84-7290-4b62-b947-cbedb375e0f9-dbus\") pod \"global-pull-secret-syncer-zjwv7\" (UID: \"c7c4ea84-7290-4b62-b947-cbedb375e0f9\") " pod="kube-system/global-pull-secret-syncer-zjwv7" Apr 17 14:21:22.965461 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:22.965369 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c7c4ea84-7290-4b62-b947-cbedb375e0f9-original-pull-secret\") pod \"global-pull-secret-syncer-zjwv7\" (UID: \"c7c4ea84-7290-4b62-b947-cbedb375e0f9\") " pod="kube-system/global-pull-secret-syncer-zjwv7" Apr 17 14:21:23.066898 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:23.066814 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c7c4ea84-7290-4b62-b947-cbedb375e0f9-kubelet-config\") pod \"global-pull-secret-syncer-zjwv7\" (UID: \"c7c4ea84-7290-4b62-b947-cbedb375e0f9\") " pod="kube-system/global-pull-secret-syncer-zjwv7" Apr 17 14:21:23.066898 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:23.066876 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c7c4ea84-7290-4b62-b947-cbedb375e0f9-dbus\") pod \"global-pull-secret-syncer-zjwv7\" (UID: \"c7c4ea84-7290-4b62-b947-cbedb375e0f9\") " pod="kube-system/global-pull-secret-syncer-zjwv7" Apr 17 14:21:23.067113 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:23.066941 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c7c4ea84-7290-4b62-b947-cbedb375e0f9-kubelet-config\") pod \"global-pull-secret-syncer-zjwv7\" (UID: \"c7c4ea84-7290-4b62-b947-cbedb375e0f9\") " pod="kube-system/global-pull-secret-syncer-zjwv7" 
Apr 17 14:21:23.067113 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:23.067046 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c7c4ea84-7290-4b62-b947-cbedb375e0f9-original-pull-secret\") pod \"global-pull-secret-syncer-zjwv7\" (UID: \"c7c4ea84-7290-4b62-b947-cbedb375e0f9\") " pod="kube-system/global-pull-secret-syncer-zjwv7"
Apr 17 14:21:23.067113 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:23.067102 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c7c4ea84-7290-4b62-b947-cbedb375e0f9-dbus\") pod \"global-pull-secret-syncer-zjwv7\" (UID: \"c7c4ea84-7290-4b62-b947-cbedb375e0f9\") " pod="kube-system/global-pull-secret-syncer-zjwv7"
Apr 17 14:21:23.069527 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:23.069508 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c7c4ea84-7290-4b62-b947-cbedb375e0f9-original-pull-secret\") pod \"global-pull-secret-syncer-zjwv7\" (UID: \"c7c4ea84-7290-4b62-b947-cbedb375e0f9\") " pod="kube-system/global-pull-secret-syncer-zjwv7"
Apr 17 14:21:23.122629 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:23.122598 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-zjwv7"
Apr 17 14:21:25.486169 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:25.486133 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ch6nv\" (UID: \"4a326186-1f33-45e3-bf03-b51a1846d9da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv"
Apr 17 14:21:25.486665 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:25.486313 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 14:21:25.486665 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:25.486390 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert podName:4a326186-1f33-45e3-bf03-b51a1846d9da nodeName:}" failed. No retries permitted until 2026-04-17 14:21:33.48636986 +0000 UTC m=+48.622377390 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ch6nv" (UID: "4a326186-1f33-45e3-bf03-b51a1846d9da") : secret "networking-console-plugin-cert" not found
Apr 17 14:21:25.587439 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:25.587396 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr"
Apr 17 14:21:25.587635 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:25.587462 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert\") pod \"ingress-canary-dxkfs\" (UID: \"da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e\") " pod="openshift-ingress-canary/ingress-canary-dxkfs"
Apr 17 14:21:25.587635 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:25.587564 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:21:25.587635 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:25.587575 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb"
Apr 17 14:21:25.587635 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:25.587574 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 14:21:25.587809 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:25.587634 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 14:21:25.587809 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:25.587653 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74f7cd55bf-bgvtb: secret "image-registry-tls" not found
Apr 17 14:21:25.587809 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:25.587636 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67c4d4c694-tkblr: secret "image-registry-tls" not found
Apr 17 14:21:25.587809 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:25.587643 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert podName:da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e nodeName:}" failed. No retries permitted until 2026-04-17 14:21:33.587625524 +0000 UTC m=+48.723633071 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert") pod "ingress-canary-dxkfs" (UID: "da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e") : secret "canary-serving-cert" not found
Apr 17 14:21:25.587809 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:25.587737 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls podName:57c9c507-2456-4cd3-8dab-9666a43e11af nodeName:}" failed. No retries permitted until 2026-04-17 14:21:33.587724701 +0000 UTC m=+48.723732233 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls") pod "image-registry-74f7cd55bf-bgvtb" (UID: "57c9c507-2456-4cd3-8dab-9666a43e11af") : secret "image-registry-tls" not found
Apr 17 14:21:25.587809 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:25.587753 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls podName:7e1725cb-b462-4434-a5db-26e9c8fe0a6d nodeName:}" failed. No retries permitted until 2026-04-17 14:21:33.587745101 +0000 UTC m=+48.723752641 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls") pod "image-registry-67c4d4c694-tkblr" (UID: "7e1725cb-b462-4434-a5db-26e9c8fe0a6d") : secret "image-registry-tls" not found
Apr 17 14:21:25.688766 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:25.688729 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls\") pod \"dns-default-wf85s\" (UID: \"059af305-b8f4-4631-aba4-e42fe75f0259\") " pod="openshift-dns/dns-default-wf85s"
Apr 17 14:21:25.688923 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:25.688897 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:21:25.688987 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:25.688964 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls podName:059af305-b8f4-4631-aba4-e42fe75f0259 nodeName:}" failed. No retries permitted until 2026-04-17 14:21:33.688943483 +0000 UTC m=+48.824951019 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls") pod "dns-default-wf85s" (UID: "059af305-b8f4-4631-aba4-e42fe75f0259") : secret "dns-default-metrics-tls" not found
Apr 17 14:21:25.829867 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:25.829838 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-zjwv7"]
Apr 17 14:21:25.833903 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:21:25.833866 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7c4ea84_7290_4b62_b947_cbedb375e0f9.slice/crio-c28d823f8ad724c3eec47973522bcf9ca2fef8f6eeb06c3c3eabf36721faa449 WatchSource:0}: Error finding container c28d823f8ad724c3eec47973522bcf9ca2fef8f6eeb06c3c3eabf36721faa449: Status 404 returned error can't find the container with id c28d823f8ad724c3eec47973522bcf9ca2fef8f6eeb06c3c3eabf36721faa449
Apr 17 14:21:26.526906 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:26.526861 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-zjwv7" event={"ID":"c7c4ea84-7290-4b62-b947-cbedb375e0f9","Type":"ContainerStarted","Data":"c28d823f8ad724c3eec47973522bcf9ca2fef8f6eeb06c3c3eabf36721faa449"}
Apr 17 14:21:26.528207 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:26.528179 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" event={"ID":"2a3933bd-a176-4139-9240-d7c0a91457a1","Type":"ContainerStarted","Data":"e65ca0074e62ace400df80b6d5fe16c2bbe7211aff0397f2f49071e35123c48c"}
Apr 17 14:21:26.529323 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:26.529294 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77f7fb958d-plbgs" event={"ID":"3619d63b-c5a8-490c-89ac-40affc59fe8b","Type":"ContainerStarted","Data":"02e451cb0d96ee061118918246e55cda63379c5b115019ec419de02a1e79cc3e"}
Apr 17 14:21:26.529568 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:26.529543 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77f7fb958d-plbgs"
Apr 17 14:21:26.530623 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:26.530601 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b854b576f-dnw4v" event={"ID":"1eeecdb3-0f9a-46a9-ad05-83febe93c64d","Type":"ContainerStarted","Data":"4ca9a31f13ef2eaa965d2ba0344450131df8cf8b1d166aac6be374551142b980"}
Apr 17 14:21:26.531402 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:26.531385 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77f7fb958d-plbgs"
Apr 17 14:21:26.533492 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:26.533471 2577 generic.go:358] "Generic (PLEG): container finished" podID="e40076ff-ba56-43e8-88a4-8c25998b6668" containerID="56478a1f86f97df509a8c0ffc3aaad3279b0e919bf2bf41ebdf962d518c9b940" exitCode=0
Apr 17 14:21:26.533583 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:26.533499 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25j4z" event={"ID":"e40076ff-ba56-43e8-88a4-8c25998b6668","Type":"ContainerDied","Data":"56478a1f86f97df509a8c0ffc3aaad3279b0e919bf2bf41ebdf962d518c9b940"}
Apr 17 14:21:26.545888 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:26.545843 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77f7fb958d-plbgs" podStartSLOduration=19.119083968 podStartE2EDuration="26.545828485s" podCreationTimestamp="2026-04-17 14:21:00 +0000 UTC" firstStartedPulling="2026-04-17 14:21:18.280942161 +0000 UTC m=+33.416949687" lastFinishedPulling="2026-04-17 14:21:25.70768667 +0000 UTC m=+40.843694204" observedRunningTime="2026-04-17 14:21:26.544753032 +0000 UTC m=+41.680760584" watchObservedRunningTime="2026-04-17 14:21:26.545828485 +0000 UTC m=+41.681836032"
Apr 17 14:21:26.559672 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:26.559583 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b854b576f-dnw4v" podStartSLOduration=19.150382841 podStartE2EDuration="26.559569712s" podCreationTimestamp="2026-04-17 14:21:00 +0000 UTC" firstStartedPulling="2026-04-17 14:21:18.290573192 +0000 UTC m=+33.426580718" lastFinishedPulling="2026-04-17 14:21:25.699760047 +0000 UTC m=+40.835767589" observedRunningTime="2026-04-17 14:21:26.558325336 +0000 UTC m=+41.694332885" watchObservedRunningTime="2026-04-17 14:21:26.559569712 +0000 UTC m=+41.695577259"
Apr 17 14:21:27.538680 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:27.538627 2577 generic.go:358] "Generic (PLEG): container finished" podID="e40076ff-ba56-43e8-88a4-8c25998b6668" containerID="859d1defac934fef7d72b5432d53a7ec6634ca7f50c5fde91d9e7f246a14e498" exitCode=0
Apr 17 14:21:27.539093 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:27.538707 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25j4z" event={"ID":"e40076ff-ba56-43e8-88a4-8c25998b6668","Type":"ContainerDied","Data":"859d1defac934fef7d72b5432d53a7ec6634ca7f50c5fde91d9e7f246a14e498"}
Apr 17 14:21:28.543417 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:28.543380 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" event={"ID":"2a3933bd-a176-4139-9240-d7c0a91457a1","Type":"ContainerStarted","Data":"cf7dfd148c350d03a2e636379339540209c82cefa1b6af23b42cb1b139808912"}
Apr 17 14:21:28.543417 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:28.543422 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" event={"ID":"2a3933bd-a176-4139-9240-d7c0a91457a1","Type":"ContainerStarted","Data":"bcf3a86a0563b5e9dc7f1445d940c5a47f229a55edd6ba4046d1ec657c301d13"}
Apr 17 14:21:28.546557 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:28.546504 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25j4z" event={"ID":"e40076ff-ba56-43e8-88a4-8c25998b6668","Type":"ContainerStarted","Data":"c6088e0f7ea1d8269dde11d7d110911e73627dae6b9a211a465afbfaf4b549e1"}
Apr 17 14:21:28.562783 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:28.562743 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" podStartSLOduration=18.532552347 podStartE2EDuration="28.5627307s" podCreationTimestamp="2026-04-17 14:21:00 +0000 UTC" firstStartedPulling="2026-04-17 14:21:18.308374286 +0000 UTC m=+33.444381812" lastFinishedPulling="2026-04-17 14:21:28.338552626 +0000 UTC m=+43.474560165" observedRunningTime="2026-04-17 14:21:28.56123765 +0000 UTC m=+43.697245198" watchObservedRunningTime="2026-04-17 14:21:28.5627307 +0000 UTC m=+43.698738248"
Apr 17 14:21:28.583411 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:28.583371 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-25j4z" podStartSLOduration=5.523856654 podStartE2EDuration="43.583358609s" podCreationTimestamp="2026-04-17 14:20:45 +0000 UTC" firstStartedPulling="2026-04-17 14:20:47.629290068 +0000 UTC m=+2.765297595" lastFinishedPulling="2026-04-17 14:21:25.688792015 +0000 UTC m=+40.824799550" observedRunningTime="2026-04-17 14:21:28.581415745 +0000 UTC m=+43.717423292" watchObservedRunningTime="2026-04-17 14:21:28.583358609 +0000 UTC m=+43.719366156"
Apr 17 14:21:30.552940 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:30.552858 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-zjwv7" event={"ID":"c7c4ea84-7290-4b62-b947-cbedb375e0f9","Type":"ContainerStarted","Data":"663ebad28fcb5be886c9b6a930b980ca2571244465a109299a6da9c1d88f9a0c"}
Apr 17 14:21:30.568462 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:30.568415 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-zjwv7" podStartSLOduration=4.1896732 podStartE2EDuration="8.568401179s" podCreationTimestamp="2026-04-17 14:21:22 +0000 UTC" firstStartedPulling="2026-04-17 14:21:25.835385328 +0000 UTC m=+40.971392855" lastFinishedPulling="2026-04-17 14:21:30.214113306 +0000 UTC m=+45.350120834" observedRunningTime="2026-04-17 14:21:30.567512909 +0000 UTC m=+45.703520458" watchObservedRunningTime="2026-04-17 14:21:30.568401179 +0000 UTC m=+45.704408726"
Apr 17 14:21:33.558790 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:33.558757 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ch6nv\" (UID: \"4a326186-1f33-45e3-bf03-b51a1846d9da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv"
Apr 17 14:21:33.559177 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:33.558907 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 14:21:33.559177 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:33.558967 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert podName:4a326186-1f33-45e3-bf03-b51a1846d9da nodeName:}" failed. No retries permitted until 2026-04-17 14:21:49.558948613 +0000 UTC m=+64.694956156 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ch6nv" (UID: "4a326186-1f33-45e3-bf03-b51a1846d9da") : secret "networking-console-plugin-cert" not found
Apr 17 14:21:33.659562 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:33.659526 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr"
Apr 17 14:21:33.659562 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:33.659569 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert\") pod \"ingress-canary-dxkfs\" (UID: \"da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e\") " pod="openshift-ingress-canary/ingress-canary-dxkfs"
Apr 17 14:21:33.659764 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:33.659656 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:21:33.659764 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:33.659661 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 14:21:33.659764 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:33.659677 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67c4d4c694-tkblr: secret "image-registry-tls" not found
Apr 17 14:21:33.659764 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:33.659699 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert podName:da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e nodeName:}" failed. No retries permitted until 2026-04-17 14:21:49.659687112 +0000 UTC m=+64.795694638 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert") pod "ingress-canary-dxkfs" (UID: "da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e") : secret "canary-serving-cert" not found
Apr 17 14:21:33.659764 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:33.659719 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls podName:7e1725cb-b462-4434-a5db-26e9c8fe0a6d nodeName:}" failed. No retries permitted until 2026-04-17 14:21:49.659706037 +0000 UTC m=+64.795713563 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls") pod "image-registry-67c4d4c694-tkblr" (UID: "7e1725cb-b462-4434-a5db-26e9c8fe0a6d") : secret "image-registry-tls" not found
Apr 17 14:21:33.659982 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:33.659808 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb"
Apr 17 14:21:33.659982 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:33.659910 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 14:21:33.659982 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:33.659922 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74f7cd55bf-bgvtb: secret "image-registry-tls" not found
Apr 17 14:21:33.659982 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:33.659962 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls podName:57c9c507-2456-4cd3-8dab-9666a43e11af nodeName:}" failed. No retries permitted until 2026-04-17 14:21:49.659953044 +0000 UTC m=+64.795960570 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls") pod "image-registry-74f7cd55bf-bgvtb" (UID: "57c9c507-2456-4cd3-8dab-9666a43e11af") : secret "image-registry-tls" not found
Apr 17 14:21:33.760854 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:33.760824 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls\") pod \"dns-default-wf85s\" (UID: \"059af305-b8f4-4631-aba4-e42fe75f0259\") " pod="openshift-dns/dns-default-wf85s"
Apr 17 14:21:33.761083 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:33.760933 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:21:33.761083 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:33.760992 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls podName:059af305-b8f4-4631-aba4-e42fe75f0259 nodeName:}" failed. No retries permitted until 2026-04-17 14:21:49.760979404 +0000 UTC m=+64.896986930 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls") pod "dns-default-wf85s" (UID: "059af305-b8f4-4631-aba4-e42fe75f0259") : secret "dns-default-metrics-tls" not found
Apr 17 14:21:43.508451 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:43.508412 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zns76"
Apr 17 14:21:49.588904 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:49.588870 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ch6nv\" (UID: \"4a326186-1f33-45e3-bf03-b51a1846d9da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv"
Apr 17 14:21:49.589311 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:49.589026 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 14:21:49.589311 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:49.589099 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert podName:4a326186-1f33-45e3-bf03-b51a1846d9da nodeName:}" failed. No retries permitted until 2026-04-17 14:22:21.589083068 +0000 UTC m=+96.725090594 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ch6nv" (UID: "4a326186-1f33-45e3-bf03-b51a1846d9da") : secret "networking-console-plugin-cert" not found
Apr 17 14:21:49.690313 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:49.690261 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb"
Apr 17 14:21:49.690440 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:49.690325 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr"
Apr 17 14:21:49.690440 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:49.690355 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert\") pod \"ingress-canary-dxkfs\" (UID: \"da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e\") " pod="openshift-ingress-canary/ingress-canary-dxkfs"
Apr 17 14:21:49.690440 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:49.690409 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 14:21:49.690440 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:49.690432 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74f7cd55bf-bgvtb: secret "image-registry-tls" not found
Apr 17 14:21:49.690440 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:49.690439 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:21:49.690601 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:49.690448 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 14:21:49.690601 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:49.690459 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67c4d4c694-tkblr: secret "image-registry-tls" not found
Apr 17 14:21:49.690601 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:49.690492 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert podName:da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e nodeName:}" failed. No retries permitted until 2026-04-17 14:22:21.690477926 +0000 UTC m=+96.826485451 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert") pod "ingress-canary-dxkfs" (UID: "da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e") : secret "canary-serving-cert" not found
Apr 17 14:21:49.690601 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:49.690505 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls podName:57c9c507-2456-4cd3-8dab-9666a43e11af nodeName:}" failed. No retries permitted until 2026-04-17 14:22:21.690498441 +0000 UTC m=+96.826505967 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls") pod "image-registry-74f7cd55bf-bgvtb" (UID: "57c9c507-2456-4cd3-8dab-9666a43e11af") : secret "image-registry-tls" not found
Apr 17 14:21:49.690601 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:49.690514 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls podName:7e1725cb-b462-4434-a5db-26e9c8fe0a6d nodeName:}" failed. No retries permitted until 2026-04-17 14:22:21.690509229 +0000 UTC m=+96.826516755 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls") pod "image-registry-67c4d4c694-tkblr" (UID: "7e1725cb-b462-4434-a5db-26e9c8fe0a6d") : secret "image-registry-tls" not found
Apr 17 14:21:49.791187 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:49.791151 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls\") pod \"dns-default-wf85s\" (UID: \"059af305-b8f4-4631-aba4-e42fe75f0259\") " pod="openshift-dns/dns-default-wf85s"
Apr 17 14:21:49.791333 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:49.791316 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:21:49.791392 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:49.791382 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls podName:059af305-b8f4-4631-aba4-e42fe75f0259 nodeName:}" failed. No retries permitted until 2026-04-17 14:22:21.791360803 +0000 UTC m=+96.927368340 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls") pod "dns-default-wf85s" (UID: "059af305-b8f4-4631-aba4-e42fe75f0259") : secret "dns-default-metrics-tls" not found
Apr 17 14:21:51.203193 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:51.203154 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs\") pod \"network-metrics-daemon-9qc7k\" (UID: \"3042fc33-2fc3-4d3d-a248-3855f7eb3a6a\") " pod="openshift-multus/network-metrics-daemon-9qc7k"
Apr 17 14:21:51.206178 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:51.206158 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 14:21:51.213525 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:51.213506 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 14:21:51.213617 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:21:51.213606 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs podName:3042fc33-2fc3-4d3d-a248-3855f7eb3a6a nodeName:}" failed. No retries permitted until 2026-04-17 14:22:55.213588334 +0000 UTC m=+130.349595880 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs") pod "network-metrics-daemon-9qc7k" (UID: "3042fc33-2fc3-4d3d-a248-3855f7eb3a6a") : secret "metrics-daemon-secret" not found
Apr 17 14:21:51.304562 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:51.304531 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zng67\" (UniqueName: \"kubernetes.io/projected/d5fae135-d20e-469d-bb28-4d7236b5f86a-kube-api-access-zng67\") pod \"network-check-target-xxhjb\" (UID: \"d5fae135-d20e-469d-bb28-4d7236b5f86a\") " pod="openshift-network-diagnostics/network-check-target-xxhjb"
Apr 17 14:21:51.307689 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:51.307671 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 14:21:51.317187 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:51.317170 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 14:21:51.327652 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:51.327630 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zng67\" (UniqueName: \"kubernetes.io/projected/d5fae135-d20e-469d-bb28-4d7236b5f86a-kube-api-access-zng67\") pod \"network-check-target-xxhjb\" (UID: \"d5fae135-d20e-469d-bb28-4d7236b5f86a\") " pod="openshift-network-diagnostics/network-check-target-xxhjb"
Apr 17 14:21:51.492968 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:51.492896 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qbxgh\""
Apr 17 14:21:51.500365 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:51.500335 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xxhjb"
Apr 17 14:21:51.607493 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:51.607466 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xxhjb"]
Apr 17 14:21:51.611024 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:21:51.610998 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5fae135_d20e_469d_bb28_4d7236b5f86a.slice/crio-02a76f667e0937554317951b4d3de6abf64ca185f1bc03340a6c373a4230fcba WatchSource:0}: Error finding container 02a76f667e0937554317951b4d3de6abf64ca185f1bc03340a6c373a4230fcba: Status 404 returned error can't find the container with id 02a76f667e0937554317951b4d3de6abf64ca185f1bc03340a6c373a4230fcba
Apr 17 14:21:52.606871 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:52.606827 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xxhjb" event={"ID":"d5fae135-d20e-469d-bb28-4d7236b5f86a","Type":"ContainerStarted","Data":"02a76f667e0937554317951b4d3de6abf64ca185f1bc03340a6c373a4230fcba"}
Apr 17 14:21:55.615302 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:55.615253 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xxhjb" event={"ID":"d5fae135-d20e-469d-bb28-4d7236b5f86a","Type":"ContainerStarted","Data":"04d7883ff70d2674335b622a3aa40151aaf780f345230673ff4cd54f49a40239"}
Apr 17 14:21:55.615761 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:55.615423 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-xxhjb"
Apr 17 14:21:55.635028 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:21:55.634982 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-xxhjb" podStartSLOduration=67.233508458 podStartE2EDuration="1m10.634969803s" podCreationTimestamp="2026-04-17 14:20:45 +0000 UTC" firstStartedPulling="2026-04-17 14:21:51.613008863 +0000 UTC m=+66.749016390" lastFinishedPulling="2026-04-17 14:21:55.014470194 +0000 UTC m=+70.150477735" observedRunningTime="2026-04-17 14:21:55.634480004 +0000 UTC m=+70.770487549" watchObservedRunningTime="2026-04-17 14:21:55.634969803 +0000 UTC m=+70.770977376"
Apr 17 14:22:21.657352 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:22:21.657307 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ch6nv\" (UID: \"4a326186-1f33-45e3-bf03-b51a1846d9da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv"
Apr 17 14:22:21.657774 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:22:21.657459 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 14:22:21.657774 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:22:21.657537 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert podName:4a326186-1f33-45e3-bf03-b51a1846d9da nodeName:}" failed. No retries permitted until 2026-04-17 14:23:25.657521759 +0000 UTC m=+160.793529285 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ch6nv" (UID: "4a326186-1f33-45e3-bf03-b51a1846d9da") : secret "networking-console-plugin-cert" not found
Apr 17 14:22:21.758197 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:22:21.758160 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb"
Apr 17 14:22:21.758379 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:22:21.758209 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr"
Apr 17 14:22:21.758379 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:22:21.758236 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert\") pod \"ingress-canary-dxkfs\" (UID: \"da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e\") " pod="openshift-ingress-canary/ingress-canary-dxkfs"
Apr 17 14:22:21.758379 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:22:21.758329 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 14:22:21.758379 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:22:21.758347 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74f7cd55bf-bgvtb: secret "image-registry-tls" not found
Apr 17 14:22:21.758379 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:22:21.758362 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 14:22:21.758379 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:22:21.758367 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:22:21.758564 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:22:21.758373 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67c4d4c694-tkblr: secret "image-registry-tls" not found
Apr 17 14:22:21.758564 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:22:21.758401 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls podName:57c9c507-2456-4cd3-8dab-9666a43e11af nodeName:}" failed. No retries permitted until 2026-04-17 14:23:25.758387592 +0000 UTC m=+160.894395119 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls") pod "image-registry-74f7cd55bf-bgvtb" (UID: "57c9c507-2456-4cd3-8dab-9666a43e11af") : secret "image-registry-tls" not found
Apr 17 14:22:21.758564 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:22:21.758435 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert podName:da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e nodeName:}" failed. No retries permitted until 2026-04-17 14:23:25.758423864 +0000 UTC m=+160.894431390 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert") pod "ingress-canary-dxkfs" (UID: "da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e") : secret "canary-serving-cert" not found
Apr 17 14:22:21.758564 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:22:21.758446 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls podName:7e1725cb-b462-4434-a5db-26e9c8fe0a6d nodeName:}" failed. No retries permitted until 2026-04-17 14:23:25.758440034 +0000 UTC m=+160.894447560 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls") pod "image-registry-67c4d4c694-tkblr" (UID: "7e1725cb-b462-4434-a5db-26e9c8fe0a6d") : secret "image-registry-tls" not found
Apr 17 14:22:21.859503 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:22:21.859477 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls\") pod \"dns-default-wf85s\" (UID: \"059af305-b8f4-4631-aba4-e42fe75f0259\") " pod="openshift-dns/dns-default-wf85s"
Apr 17 14:22:21.859642 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:22:21.859614 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:22:21.859695 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:22:21.859685 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls podName:059af305-b8f4-4631-aba4-e42fe75f0259 nodeName:}" failed. No retries permitted until 2026-04-17 14:23:25.859670289 +0000 UTC m=+160.995677816 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls") pod "dns-default-wf85s" (UID: "059af305-b8f4-4631-aba4-e42fe75f0259") : secret "dns-default-metrics-tls" not found
Apr 17 14:22:26.620428 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:22:26.620400 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xxhjb"
Apr 17 14:22:55.237005 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:22:55.236943 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs\") pod \"network-metrics-daemon-9qc7k\" (UID: \"3042fc33-2fc3-4d3d-a248-3855f7eb3a6a\") " pod="openshift-multus/network-metrics-daemon-9qc7k"
Apr 17 14:22:55.237547 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:22:55.237108 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 14:22:55.237547 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:22:55.237191 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs podName:3042fc33-2fc3-4d3d-a248-3855f7eb3a6a nodeName:}" failed. No retries permitted until 2026-04-17 14:24:57.237175606 +0000 UTC m=+252.373183132 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs") pod "network-metrics-daemon-9qc7k" (UID: "3042fc33-2fc3-4d3d-a248-3855f7eb3a6a") : secret "metrics-daemon-secret" not found
Apr 17 14:23:20.766653 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:23:20.766603 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv" podUID="4a326186-1f33-45e3-bf03-b51a1846d9da"
Apr 17 14:23:20.779348 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:23:20.779315 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" podUID="57c9c507-2456-4cd3-8dab-9666a43e11af"
Apr 17 14:23:20.786414 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:23:20.786386 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" podUID="7e1725cb-b462-4434-a5db-26e9c8fe0a6d"
Apr 17 14:23:20.814028 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:20.814006 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb"
Apr 17 14:23:20.814127 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:20.814006 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv"
Apr 17 14:23:20.814177 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:20.814006 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67c4d4c694-tkblr"
Apr 17 14:23:20.853075 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:23:20.853043 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-dxkfs" podUID="da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e"
Apr 17 14:23:20.862193 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:23:20.862170 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-wf85s" podUID="059af305-b8f4-4631-aba4-e42fe75f0259"
Apr 17 14:23:21.816310 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:21.816261 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wf85s"
Apr 17 14:23:21.816659 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:21.816261 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dxkfs"
Apr 17 14:23:22.396376 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:23:22.396338 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-9qc7k" podUID="3042fc33-2fc3-4d3d-a248-3855f7eb3a6a"
Apr 17 14:23:23.463410 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:23.463382 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-gqkmm_565f7614-6003-428c-a0bd-ff0f395baa33/dns-node-resolver/0.log"
Apr 17 14:23:24.466226 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:24.466197 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-d8pjn_26b0d440-2cba-4402-b258-ba4b4ac2f7dd/node-ca/0.log"
Apr 17 14:23:25.677573 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:25.677533 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ch6nv\" (UID: \"4a326186-1f33-45e3-bf03-b51a1846d9da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv"
Apr 17 14:23:25.677945 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:23:25.677671 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 14:23:25.677945 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:23:25.677730 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert podName:4a326186-1f33-45e3-bf03-b51a1846d9da nodeName:}" failed. No retries permitted until 2026-04-17 14:25:27.677715043 +0000 UTC m=+282.813722569 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ch6nv" (UID: "4a326186-1f33-45e3-bf03-b51a1846d9da") : secret "networking-console-plugin-cert" not found
Apr 17 14:23:25.778769 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:25.778732 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls\") pod \"image-registry-67c4d4c694-tkblr\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " pod="openshift-image-registry/image-registry-67c4d4c694-tkblr"
Apr 17 14:23:25.778908 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:25.778777 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert\") pod \"ingress-canary-dxkfs\" (UID: \"da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e\") " pod="openshift-ingress-canary/ingress-canary-dxkfs"
Apr 17 14:23:25.778908 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:25.778836 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls\") pod \"image-registry-74f7cd55bf-bgvtb\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb"
Apr 17 14:23:25.778908 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:23:25.778883 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 14:23:25.778908 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:23:25.778901 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67c4d4c694-tkblr: secret "image-registry-tls" not found
Apr 17 14:23:25.779033 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:23:25.778926 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 14:23:25.779033 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:23:25.778938 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74f7cd55bf-bgvtb: secret "image-registry-tls" not found
Apr 17 14:23:25.779033 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:23:25.778926 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:23:25.779033 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:23:25.778965 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls podName:7e1725cb-b462-4434-a5db-26e9c8fe0a6d nodeName:}" failed. No retries permitted until 2026-04-17 14:25:27.778946957 +0000 UTC m=+282.914954483 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls") pod "image-registry-67c4d4c694-tkblr" (UID: "7e1725cb-b462-4434-a5db-26e9c8fe0a6d") : secret "image-registry-tls" not found
Apr 17 14:23:25.779033 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:23:25.778977 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls podName:57c9c507-2456-4cd3-8dab-9666a43e11af nodeName:}" failed. No retries permitted until 2026-04-17 14:25:27.778971494 +0000 UTC m=+282.914979059 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls") pod "image-registry-74f7cd55bf-bgvtb" (UID: "57c9c507-2456-4cd3-8dab-9666a43e11af") : secret "image-registry-tls" not found
Apr 17 14:23:25.779033 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:23:25.778988 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert podName:da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e nodeName:}" failed. No retries permitted until 2026-04-17 14:25:27.778982172 +0000 UTC m=+282.914989698 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert") pod "ingress-canary-dxkfs" (UID: "da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e") : secret "canary-serving-cert" not found
Apr 17 14:23:25.879400 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:25.879375 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls\") pod \"dns-default-wf85s\" (UID: \"059af305-b8f4-4631-aba4-e42fe75f0259\") " pod="openshift-dns/dns-default-wf85s"
Apr 17 14:23:25.879564 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:23:25.879504 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:23:25.879564 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:23:25.879558 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls podName:059af305-b8f4-4631-aba4-e42fe75f0259 nodeName:}" failed. No retries permitted until 2026-04-17 14:25:27.879543646 +0000 UTC m=+283.015551172 (durationBeforeRetry 2m2s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls") pod "dns-default-wf85s" (UID: "059af305-b8f4-4631-aba4-e42fe75f0259") : secret "dns-default-metrics-tls" not found Apr 17 14:23:26.530031 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:26.529965 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77f7fb958d-plbgs" podUID="3619d63b-c5a8-490c-89ac-40affc59fe8b" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.12:8000/readyz\": dial tcp 10.132.0.12:8000: connect: connection refused" Apr 17 14:23:26.828500 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:26.828472 2577 generic.go:358] "Generic (PLEG): container finished" podID="3619d63b-c5a8-490c-89ac-40affc59fe8b" containerID="02e451cb0d96ee061118918246e55cda63379c5b115019ec419de02a1e79cc3e" exitCode=1 Apr 17 14:23:26.828924 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:26.828540 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77f7fb958d-plbgs" event={"ID":"3619d63b-c5a8-490c-89ac-40affc59fe8b","Type":"ContainerDied","Data":"02e451cb0d96ee061118918246e55cda63379c5b115019ec419de02a1e79cc3e"} Apr 17 14:23:26.828924 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:26.828850 2577 scope.go:117] "RemoveContainer" containerID="02e451cb0d96ee061118918246e55cda63379c5b115019ec419de02a1e79cc3e" Apr 17 14:23:26.829968 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:26.829937 2577 generic.go:358] "Generic (PLEG): container finished" podID="1eeecdb3-0f9a-46a9-ad05-83febe93c64d" containerID="4ca9a31f13ef2eaa965d2ba0344450131df8cf8b1d166aac6be374551142b980" exitCode=255 Apr 17 14:23:26.830057 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:26.829967 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b854b576f-dnw4v" event={"ID":"1eeecdb3-0f9a-46a9-ad05-83febe93c64d","Type":"ContainerDied","Data":"4ca9a31f13ef2eaa965d2ba0344450131df8cf8b1d166aac6be374551142b980"} Apr 17 14:23:26.830264 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:26.830248 2577 scope.go:117] "RemoveContainer" containerID="4ca9a31f13ef2eaa965d2ba0344450131df8cf8b1d166aac6be374551142b980" Apr 17 14:23:27.834196 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:27.834102 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77f7fb958d-plbgs" event={"ID":"3619d63b-c5a8-490c-89ac-40affc59fe8b","Type":"ContainerStarted","Data":"ccb41df61cbf38190a78e049749d37ce4b2e7a273ccb2b12b50aa66edb839edb"} Apr 17 14:23:27.834662 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:27.834402 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77f7fb958d-plbgs" Apr 17 14:23:27.835052 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:27.835035 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77f7fb958d-plbgs" Apr 17 14:23:27.835689 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:27.835673 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b854b576f-dnw4v" 
event={"ID":"1eeecdb3-0f9a-46a9-ad05-83febe93c64d","Type":"ContainerStarted","Data":"d5e4db36b75d6ff1a7f93a44cd2435c0271ec1877e121531727f8d303f275995"} Apr 17 14:23:35.369565 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:35.369534 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qc7k" Apr 17 14:23:48.196494 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.196454 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-74f7cd55bf-bgvtb"] Apr 17 14:23:48.196905 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:23:48.196693 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" podUID="57c9c507-2456-4cd3-8dab-9666a43e11af" Apr 17 14:23:48.317709 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.317673 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-56xpp"] Apr 17 14:23:48.321109 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.321086 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-56xpp" Apr 17 14:23:48.323826 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.323800 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 14:23:48.323943 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.323924 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-j9ndf\"" Apr 17 14:23:48.324013 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.323948 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 14:23:48.324013 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.323948 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 14:23:48.325230 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.325196 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 14:23:48.331257 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.331239 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-56xpp"] Apr 17 14:23:48.455811 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.455726 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4b966141-1693-476e-a47e-c6acdd7edec5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-56xpp\" (UID: \"4b966141-1693-476e-a47e-c6acdd7edec5\") " pod="openshift-insights/insights-runtime-extractor-56xpp" Apr 17 14:23:48.455811 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.455759 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4b966141-1693-476e-a47e-c6acdd7edec5-data-volume\") pod \"insights-runtime-extractor-56xpp\" (UID: \"4b966141-1693-476e-a47e-c6acdd7edec5\") " pod="openshift-insights/insights-runtime-extractor-56xpp" Apr 17 14:23:48.455811 ip-10-0-132-119 
kubenswrapper[2577]: I0417 14:23:48.455785 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4b966141-1693-476e-a47e-c6acdd7edec5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-56xpp\" (UID: \"4b966141-1693-476e-a47e-c6acdd7edec5\") " pod="openshift-insights/insights-runtime-extractor-56xpp" Apr 17 14:23:48.455811 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.455801 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gldkg\" (UniqueName: \"kubernetes.io/projected/4b966141-1693-476e-a47e-c6acdd7edec5-kube-api-access-gldkg\") pod \"insights-runtime-extractor-56xpp\" (UID: \"4b966141-1693-476e-a47e-c6acdd7edec5\") " pod="openshift-insights/insights-runtime-extractor-56xpp" Apr 17 14:23:48.456065 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.455886 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4b966141-1693-476e-a47e-c6acdd7edec5-crio-socket\") pod \"insights-runtime-extractor-56xpp\" (UID: \"4b966141-1693-476e-a47e-c6acdd7edec5\") " pod="openshift-insights/insights-runtime-extractor-56xpp" Apr 17 14:23:48.557516 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.557471 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4b966141-1693-476e-a47e-c6acdd7edec5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-56xpp\" (UID: \"4b966141-1693-476e-a47e-c6acdd7edec5\") " pod="openshift-insights/insights-runtime-extractor-56xpp" Apr 17 14:23:48.557714 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.557532 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4b966141-1693-476e-a47e-c6acdd7edec5-data-volume\") pod \"insights-runtime-extractor-56xpp\" (UID: \"4b966141-1693-476e-a47e-c6acdd7edec5\") " pod="openshift-insights/insights-runtime-extractor-56xpp" Apr 17 14:23:48.557714 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.557570 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4b966141-1693-476e-a47e-c6acdd7edec5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-56xpp\" (UID: \"4b966141-1693-476e-a47e-c6acdd7edec5\") " pod="openshift-insights/insights-runtime-extractor-56xpp" Apr 17 14:23:48.557714 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.557604 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gldkg\" (UniqueName: \"kubernetes.io/projected/4b966141-1693-476e-a47e-c6acdd7edec5-kube-api-access-gldkg\") pod \"insights-runtime-extractor-56xpp\" (UID: \"4b966141-1693-476e-a47e-c6acdd7edec5\") " pod="openshift-insights/insights-runtime-extractor-56xpp" Apr 17 14:23:48.557714 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.557682 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4b966141-1693-476e-a47e-c6acdd7edec5-crio-socket\") pod \"insights-runtime-extractor-56xpp\" (UID: \"4b966141-1693-476e-a47e-c6acdd7edec5\") " pod="openshift-insights/insights-runtime-extractor-56xpp" Apr 17 14:23:48.557906 ip-10-0-132-119 kubenswrapper[2577]: 
I0417 14:23:48.557789 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4b966141-1693-476e-a47e-c6acdd7edec5-crio-socket\") pod \"insights-runtime-extractor-56xpp\" (UID: \"4b966141-1693-476e-a47e-c6acdd7edec5\") " pod="openshift-insights/insights-runtime-extractor-56xpp" Apr 17 14:23:48.558439 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.558414 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4b966141-1693-476e-a47e-c6acdd7edec5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-56xpp\" (UID: \"4b966141-1693-476e-a47e-c6acdd7edec5\") " pod="openshift-insights/insights-runtime-extractor-56xpp" Apr 17 14:23:48.558675 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.558658 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4b966141-1693-476e-a47e-c6acdd7edec5-data-volume\") pod \"insights-runtime-extractor-56xpp\" (UID: \"4b966141-1693-476e-a47e-c6acdd7edec5\") " pod="openshift-insights/insights-runtime-extractor-56xpp" Apr 17 14:23:48.561492 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.561467 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4b966141-1693-476e-a47e-c6acdd7edec5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-56xpp\" (UID: \"4b966141-1693-476e-a47e-c6acdd7edec5\") " pod="openshift-insights/insights-runtime-extractor-56xpp" Apr 17 14:23:48.568237 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.568219 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gldkg\" (UniqueName: \"kubernetes.io/projected/4b966141-1693-476e-a47e-c6acdd7edec5-kube-api-access-gldkg\") pod \"insights-runtime-extractor-56xpp\" (UID: \"4b966141-1693-476e-a47e-c6acdd7edec5\") " pod="openshift-insights/insights-runtime-extractor-56xpp" Apr 17 14:23:48.629468 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.629446 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-56xpp" Apr 17 14:23:48.740860 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.740792 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-56xpp"] Apr 17 14:23:48.743503 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:23:48.743473 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b966141_1693_476e_a47e_c6acdd7edec5.slice/crio-a6c2553d04102c236d02b38509ed6dbe16da60a63d145bae9b7fed0eed1e7377 WatchSource:0}: Error finding container a6c2553d04102c236d02b38509ed6dbe16da60a63d145bae9b7fed0eed1e7377: Status 404 returned error can't find the container with id a6c2553d04102c236d02b38509ed6dbe16da60a63d145bae9b7fed0eed1e7377 Apr 17 14:23:48.884473 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.884438 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-56xpp" event={"ID":"4b966141-1693-476e-a47e-c6acdd7edec5","Type":"ContainerStarted","Data":"548213b7065151ce2186316a048fb3bb9b3bed2ae1fc28f8dc6b748ba272915d"} Apr 17 14:23:48.884473 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.884477 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-56xpp" event={"ID":"4b966141-1693-476e-a47e-c6acdd7edec5","Type":"ContainerStarted","Data":"a6c2553d04102c236d02b38509ed6dbe16da60a63d145bae9b7fed0eed1e7377"} Apr 17 14:23:48.884660 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.884454 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:23:48.888237 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.888218 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:23:48.960685 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.960651 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57c9c507-2456-4cd3-8dab-9666a43e11af-ca-trust-extracted\") pod \"57c9c507-2456-4cd3-8dab-9666a43e11af\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " Apr 17 14:23:48.960821 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.960694 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/57c9c507-2456-4cd3-8dab-9666a43e11af-image-registry-private-configuration\") pod \"57c9c507-2456-4cd3-8dab-9666a43e11af\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " Apr 17 14:23:48.960821 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.960724 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57c9c507-2456-4cd3-8dab-9666a43e11af-installation-pull-secrets\") pod \"57c9c507-2456-4cd3-8dab-9666a43e11af\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " Apr 17 14:23:48.960821 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.960750 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57c9c507-2456-4cd3-8dab-9666a43e11af-trusted-ca\") pod \"57c9c507-2456-4cd3-8dab-9666a43e11af\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " Apr 17 14:23:48.960821 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.960767 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-certificates\") pod \"57c9c507-2456-4cd3-8dab-9666a43e11af\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " Apr 17 14:23:48.960821 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.960811 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rks5l\" (UniqueName: \"kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-kube-api-access-rks5l\") pod \"57c9c507-2456-4cd3-8dab-9666a43e11af\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " Apr 17 14:23:48.961037 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.960835 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-bound-sa-token\") pod \"57c9c507-2456-4cd3-8dab-9666a43e11af\" (UID: \"57c9c507-2456-4cd3-8dab-9666a43e11af\") " Apr 17 14:23:48.961037 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.960913 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57c9c507-2456-4cd3-8dab-9666a43e11af-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "57c9c507-2456-4cd3-8dab-9666a43e11af" (UID: "57c9c507-2456-4cd3-8dab-9666a43e11af"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:23:48.961233 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.961201 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/57c9c507-2456-4cd3-8dab-9666a43e11af-ca-trust-extracted\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:23:48.961330 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.961298 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "57c9c507-2456-4cd3-8dab-9666a43e11af" (UID: "57c9c507-2456-4cd3-8dab-9666a43e11af"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:23:48.961330 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.961319 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57c9c507-2456-4cd3-8dab-9666a43e11af-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "57c9c507-2456-4cd3-8dab-9666a43e11af" (UID: "57c9c507-2456-4cd3-8dab-9666a43e11af"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:23:48.963040 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.963013 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "57c9c507-2456-4cd3-8dab-9666a43e11af" (UID: "57c9c507-2456-4cd3-8dab-9666a43e11af"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:23:48.963138 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.963043 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57c9c507-2456-4cd3-8dab-9666a43e11af-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "57c9c507-2456-4cd3-8dab-9666a43e11af" (UID: "57c9c507-2456-4cd3-8dab-9666a43e11af"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:23:48.963138 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.963090 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57c9c507-2456-4cd3-8dab-9666a43e11af-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "57c9c507-2456-4cd3-8dab-9666a43e11af" (UID: "57c9c507-2456-4cd3-8dab-9666a43e11af"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:23:48.963138 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:48.963083 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-kube-api-access-rks5l" (OuterVolumeSpecName: "kube-api-access-rks5l") pod "57c9c507-2456-4cd3-8dab-9666a43e11af" (UID: "57c9c507-2456-4cd3-8dab-9666a43e11af"). InnerVolumeSpecName "kube-api-access-rks5l". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:23:49.062144 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:49.062081 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rks5l\" (UniqueName: \"kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-kube-api-access-rks5l\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:23:49.062144 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:49.062105 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-bound-sa-token\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:23:49.062144 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:49.062115 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/57c9c507-2456-4cd3-8dab-9666a43e11af-image-registry-private-configuration\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:23:49.062144 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:49.062125 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/57c9c507-2456-4cd3-8dab-9666a43e11af-installation-pull-secrets\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:23:49.062144 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:49.062141 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57c9c507-2456-4cd3-8dab-9666a43e11af-trusted-ca\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:23:49.062438 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:49.062150 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-certificates\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:23:49.888075 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:49.888040 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-56xpp" event={"ID":"4b966141-1693-476e-a47e-c6acdd7edec5","Type":"ContainerStarted","Data":"b7552b50593813d01ae8b9456bb89a1967a4eae422855799c5cb8271045aeb02"} Apr 17 14:23:49.888075 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:49.888074 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-74f7cd55bf-bgvtb" Apr 17 14:23:49.917933 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:49.917908 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-74f7cd55bf-bgvtb"] Apr 17 14:23:49.921596 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:49.921574 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-74f7cd55bf-bgvtb"] Apr 17 14:23:50.070477 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:50.070443 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/57c9c507-2456-4cd3-8dab-9666a43e11af-registry-tls\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:23:51.370986 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:51.370962 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57c9c507-2456-4cd3-8dab-9666a43e11af" path="/var/lib/kubelet/pods/57c9c507-2456-4cd3-8dab-9666a43e11af/volumes" Apr 17 14:23:51.894979 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:51.894950 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-56xpp" event={"ID":"4b966141-1693-476e-a47e-c6acdd7edec5","Type":"ContainerStarted","Data":"424c24362a81814d36765a6273f81c14691b6e9d7f4c8b8f1fc031bbeb19c40f"} Apr 17 14:23:51.911843 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:23:51.911804 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-56xpp" podStartSLOduration=1.651074327 podStartE2EDuration="3.911790779s" podCreationTimestamp="2026-04-17 14:23:48 +0000 UTC" firstStartedPulling="2026-04-17 14:23:48.795157563 +0000 UTC m=+183.931165089" lastFinishedPulling="2026-04-17 14:23:51.055874012 +0000 UTC m=+186.191881541" observedRunningTime="2026-04-17 14:23:51.911545944 +0000 UTC m=+187.047553492" watchObservedRunningTime="2026-04-17 14:23:51.911790779 +0000 UTC m=+187.047798325" Apr 17 14:24:01.503685 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.503655 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ppx4t"] Apr 17 14:24:01.506601 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.506585 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.510902 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.510874 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 14:24:01.512187 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.512157 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 14:24:01.512187 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.512177 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 14:24:01.512453 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.512193 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 14:24:01.512866 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.512659 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-c4k9f\"" Apr 17 14:24:01.512866 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.512793 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 14:24:01.512951 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.512912 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 14:24:01.656580 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.656548 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7312e194-131a-4247-8dbf-6ca7a8f6fa14-metrics-client-ca\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.656733 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.656588 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7312e194-131a-4247-8dbf-6ca7a8f6fa14-node-exporter-wtmp\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.656733 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.656613 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7312e194-131a-4247-8dbf-6ca7a8f6fa14-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.656733 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.656682 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgbhr\" (UniqueName: \"kubernetes.io/projected/7312e194-131a-4247-8dbf-6ca7a8f6fa14-kube-api-access-jgbhr\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.656835 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.656743 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/7312e194-131a-4247-8dbf-6ca7a8f6fa14-sys\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.656835 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.656769 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7312e194-131a-4247-8dbf-6ca7a8f6fa14-root\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.656835 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.656788 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7312e194-131a-4247-8dbf-6ca7a8f6fa14-node-exporter-tls\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.656921 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.656852 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7312e194-131a-4247-8dbf-6ca7a8f6fa14-node-exporter-textfile\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.656921 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.656871 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7312e194-131a-4247-8dbf-6ca7a8f6fa14-node-exporter-accelerators-collector-config\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.757764 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.757678 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7312e194-131a-4247-8dbf-6ca7a8f6fa14-node-exporter-textfile\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.757764 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.757716 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7312e194-131a-4247-8dbf-6ca7a8f6fa14-node-exporter-accelerators-collector-config\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.757924 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.757780 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7312e194-131a-4247-8dbf-6ca7a8f6fa14-metrics-client-ca\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.757924 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.757802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7312e194-131a-4247-8dbf-6ca7a8f6fa14-node-exporter-wtmp\") pod \"node-exporter-ppx4t\" (UID: 
\"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.757924 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.757821 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7312e194-131a-4247-8dbf-6ca7a8f6fa14-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.757924 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.757849 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jgbhr\" (UniqueName: \"kubernetes.io/projected/7312e194-131a-4247-8dbf-6ca7a8f6fa14-kube-api-access-jgbhr\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.758114 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.757996 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7312e194-131a-4247-8dbf-6ca7a8f6fa14-node-exporter-wtmp\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.758114 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.758003 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7312e194-131a-4247-8dbf-6ca7a8f6fa14-sys\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.758114 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.758036 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7312e194-131a-4247-8dbf-6ca7a8f6fa14-node-exporter-textfile\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.758114 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.758057 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7312e194-131a-4247-8dbf-6ca7a8f6fa14-sys\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.758114 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.758080 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7312e194-131a-4247-8dbf-6ca7a8f6fa14-root\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.758114 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.758102 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7312e194-131a-4247-8dbf-6ca7a8f6fa14-node-exporter-tls\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.758391 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.758150 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7312e194-131a-4247-8dbf-6ca7a8f6fa14-root\") pod 
\"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.758430 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.758387 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7312e194-131a-4247-8dbf-6ca7a8f6fa14-node-exporter-accelerators-collector-config\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.758430 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.758407 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7312e194-131a-4247-8dbf-6ca7a8f6fa14-metrics-client-ca\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.760216 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.760196 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7312e194-131a-4247-8dbf-6ca7a8f6fa14-node-exporter-tls\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.760361 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.760233 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7312e194-131a-4247-8dbf-6ca7a8f6fa14-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.765587 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.765569 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgbhr\" (UniqueName: \"kubernetes.io/projected/7312e194-131a-4247-8dbf-6ca7a8f6fa14-kube-api-access-jgbhr\") pod \"node-exporter-ppx4t\" (UID: \"7312e194-131a-4247-8dbf-6ca7a8f6fa14\") " pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.815306 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.815267 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-ppx4t" Apr 17 14:24:01.823100 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:24:01.823074 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7312e194_131a_4247_8dbf_6ca7a8f6fa14.slice/crio-46f7b8da4c6aa6e1e9fe0c8f60086a22faef429807c8fa3fea7e8e73fc726ec4 WatchSource:0}: Error finding container 46f7b8da4c6aa6e1e9fe0c8f60086a22faef429807c8fa3fea7e8e73fc726ec4: Status 404 returned error can't find the container with id 46f7b8da4c6aa6e1e9fe0c8f60086a22faef429807c8fa3fea7e8e73fc726ec4 Apr 17 14:24:01.920837 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:01.920786 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ppx4t" event={"ID":"7312e194-131a-4247-8dbf-6ca7a8f6fa14","Type":"ContainerStarted","Data":"46f7b8da4c6aa6e1e9fe0c8f60086a22faef429807c8fa3fea7e8e73fc726ec4"} Apr 17 14:24:02.924898 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:02.924809 2577 generic.go:358] "Generic (PLEG): container finished" podID="7312e194-131a-4247-8dbf-6ca7a8f6fa14" containerID="9f82c0d2b33bf5e37afd8e04eddcc7b0692261e97bf641de6e05b096e659bc1d" exitCode=0 Apr 17 14:24:02.925356 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:02.924898 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ppx4t" event={"ID":"7312e194-131a-4247-8dbf-6ca7a8f6fa14","Type":"ContainerDied","Data":"9f82c0d2b33bf5e37afd8e04eddcc7b0692261e97bf641de6e05b096e659bc1d"} Apr 17 14:24:03.928772 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:03.928738 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ppx4t" event={"ID":"7312e194-131a-4247-8dbf-6ca7a8f6fa14","Type":"ContainerStarted","Data":"1785576aaa5f5ff388b62bf09ddafdaec02de1a91cf92d5a0052d6dc85137e7d"} Apr 17 14:24:03.928772 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:03.928774 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ppx4t" event={"ID":"7312e194-131a-4247-8dbf-6ca7a8f6fa14","Type":"ContainerStarted","Data":"f8c6af1328871b8057941681657fe9c5e7565c730cb1c2eb2e4925f0844f1979"} Apr 17 14:24:03.949122 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:03.949074 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ppx4t" podStartSLOduration=2.180482312 podStartE2EDuration="2.949058325s" podCreationTimestamp="2026-04-17 14:24:01 +0000 UTC" firstStartedPulling="2026-04-17 14:24:01.824979994 +0000 UTC m=+196.960987521" lastFinishedPulling="2026-04-17 14:24:02.593556005 +0000 UTC m=+197.729563534" observedRunningTime="2026-04-17 14:24:03.947363636 +0000 UTC m=+199.083371184" watchObservedRunningTime="2026-04-17 14:24:03.949058325 +0000 UTC m=+199.085065873" Apr 17 14:24:18.131884 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:18.131835 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" podUID="2a3933bd-a176-4139-9240-d7c0a91457a1" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 14:24:19.985532 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:19.985494 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-67c4d4c694-tkblr"] Apr 17 14:24:19.985880 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:24:19.985683 2577 
pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" podUID="7e1725cb-b462-4434-a5db-26e9c8fe0a6d" Apr 17 14:24:20.970757 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:20.970719 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:24:20.974727 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:20.974707 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67c4d4c694-tkblr" Apr 17 14:24:21.010499 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:21.010478 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-image-registry-private-configuration\") pod \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " Apr 17 14:24:21.010818 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:21.010510 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-bound-sa-token\") pod \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " Apr 17 14:24:21.010818 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:21.010544 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cf8j\" (UniqueName: \"kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-kube-api-access-6cf8j\") pod \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " Apr 17 14:24:21.010818 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:21.010577 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-certificates\") pod \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " Apr 17 14:24:21.010818 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:21.010630 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-installation-pull-secrets\") pod \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " Apr 17 14:24:21.010818 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:21.010661 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-ca-trust-extracted\") pod \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " Apr 17 14:24:21.010818 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:21.010701 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-trusted-ca\") pod \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\" (UID: \"7e1725cb-b462-4434-a5db-26e9c8fe0a6d\") " Apr 17 14:24:21.011106 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:21.010938 2577 operation_generator.go:781] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7e1725cb-b462-4434-a5db-26e9c8fe0a6d" (UID: "7e1725cb-b462-4434-a5db-26e9c8fe0a6d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 14:24:21.011106 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:21.010956 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7e1725cb-b462-4434-a5db-26e9c8fe0a6d" (UID: "7e1725cb-b462-4434-a5db-26e9c8fe0a6d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:24:21.011215 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:21.011113 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7e1725cb-b462-4434-a5db-26e9c8fe0a6d" (UID: "7e1725cb-b462-4434-a5db-26e9c8fe0a6d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:24:21.012826 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:21.012799 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7e1725cb-b462-4434-a5db-26e9c8fe0a6d" (UID: "7e1725cb-b462-4434-a5db-26e9c8fe0a6d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:24:21.012826 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:21.012813 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7e1725cb-b462-4434-a5db-26e9c8fe0a6d" (UID: "7e1725cb-b462-4434-a5db-26e9c8fe0a6d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:24:21.012939 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:21.012868 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "7e1725cb-b462-4434-a5db-26e9c8fe0a6d" (UID: "7e1725cb-b462-4434-a5db-26e9c8fe0a6d"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:24:21.012939 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:21.012901 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-kube-api-access-6cf8j" (OuterVolumeSpecName: "kube-api-access-6cf8j") pod "7e1725cb-b462-4434-a5db-26e9c8fe0a6d" (UID: "7e1725cb-b462-4434-a5db-26e9c8fe0a6d"). InnerVolumeSpecName "kube-api-access-6cf8j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:24:21.111251 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:21.111216 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-trusted-ca\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\""
Apr 17 14:24:21.111251 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:21.111251 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-image-registry-private-configuration\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\""
Apr 17 14:24:21.111251 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:21.111262 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-bound-sa-token\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\""
Apr 17 14:24:21.111478 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:21.111291 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6cf8j\" (UniqueName: \"kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-kube-api-access-6cf8j\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\""
Apr 17 14:24:21.111478 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:21.111304 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-certificates\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\""
Apr 17 14:24:21.111478 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:21.111316 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-installation-pull-secrets\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\""
Apr 17 14:24:21.111478 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:21.111324 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-ca-trust-extracted\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\""
Apr 17 14:24:21.973190 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:21.973160 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67c4d4c694-tkblr"
Apr 17 14:24:22.009128 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:22.009102 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-67c4d4c694-tkblr"]
Apr 17 14:24:22.012529 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:22.012498 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-67c4d4c694-tkblr"]
Apr 17 14:24:22.118428 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:22.118395 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7e1725cb-b462-4434-a5db-26e9c8fe0a6d-registry-tls\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\""
Apr 17 14:24:23.371555 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:23.371523 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e1725cb-b462-4434-a5db-26e9c8fe0a6d" path="/var/lib/kubelet/pods/7e1725cb-b462-4434-a5db-26e9c8fe0a6d/volumes"
Apr 17 14:24:28.131803 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:28.131760 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" podUID="2a3933bd-a176-4139-9240-d7c0a91457a1" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 14:24:38.131247 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:38.131206 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" podUID="2a3933bd-a176-4139-9240-d7c0a91457a1" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 14:24:38.131639 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:38.131303 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl"
Apr 17 14:24:38.131762 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:38.131732 2577 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"cf7dfd148c350d03a2e636379339540209c82cefa1b6af23b42cb1b139808912"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 17 14:24:38.131807 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:38.131793 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" podUID="2a3933bd-a176-4139-9240-d7c0a91457a1" containerName="service-proxy" containerID="cri-o://cf7dfd148c350d03a2e636379339540209c82cefa1b6af23b42cb1b139808912" gracePeriod=30
Apr 17 14:24:39.018563 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:39.018528 2577 generic.go:358] "Generic (PLEG): container finished" podID="2a3933bd-a176-4139-9240-d7c0a91457a1" containerID="cf7dfd148c350d03a2e636379339540209c82cefa1b6af23b42cb1b139808912" exitCode=2
Apr 17 14:24:39.018727 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:39.018591 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" event={"ID":"2a3933bd-a176-4139-9240-d7c0a91457a1","Type":"ContainerDied","Data":"cf7dfd148c350d03a2e636379339540209c82cefa1b6af23b42cb1b139808912"}
Apr 17 14:24:39.018727 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:39.018625 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f845cc5fd-kb5zl" event={"ID":"2a3933bd-a176-4139-9240-d7c0a91457a1","Type":"ContainerStarted","Data":"7206cc367ae714b819ed6482799f67ce227d62adb8e671f884187cf9a8de4e60"}
Apr 17 14:24:43.533400 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:43.533372 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-gqkmm_565f7614-6003-428c-a0bd-ff0f395baa33/dns-node-resolver/0.log"
Apr 17 14:24:57.298169 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:57.298123 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs\") pod \"network-metrics-daemon-9qc7k\" (UID: \"3042fc33-2fc3-4d3d-a248-3855f7eb3a6a\") " pod="openshift-multus/network-metrics-daemon-9qc7k"
Apr 17 14:24:57.300384 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:57.300364 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3042fc33-2fc3-4d3d-a248-3855f7eb3a6a-metrics-certs\") pod \"network-metrics-daemon-9qc7k\" (UID: \"3042fc33-2fc3-4d3d-a248-3855f7eb3a6a\") " pod="openshift-multus/network-metrics-daemon-9qc7k"
Apr 17 14:24:57.573334 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:57.573311 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hxvwz\""
Apr 17 14:24:57.581634 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:57.581620 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qc7k"
Apr 17 14:24:57.697782 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:57.697755 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9qc7k"]
Apr 17 14:24:57.700579 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:24:57.700547 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3042fc33_2fc3_4d3d_a248_3855f7eb3a6a.slice/crio-4f7a699b34ac712a2b713a0122df200cbbb6cdd00358708d5efdbbbdde216a1c WatchSource:0}: Error finding container 4f7a699b34ac712a2b713a0122df200cbbb6cdd00358708d5efdbbbdde216a1c: Status 404 returned error can't find the container with id 4f7a699b34ac712a2b713a0122df200cbbb6cdd00358708d5efdbbbdde216a1c
Apr 17 14:24:58.069735 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:58.069649 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9qc7k" event={"ID":"3042fc33-2fc3-4d3d-a248-3855f7eb3a6a","Type":"ContainerStarted","Data":"4f7a699b34ac712a2b713a0122df200cbbb6cdd00358708d5efdbbbdde216a1c"}
Apr 17 14:24:59.073167 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:24:59.073139 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9qc7k" event={"ID":"3042fc33-2fc3-4d3d-a248-3855f7eb3a6a","Type":"ContainerStarted","Data":"165c4d84b384c233a7b12613bc626016098fb13efe52ccfd8d2fa59306df7015"}
Apr 17 14:25:00.077378 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:00.077342 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9qc7k" event={"ID":"3042fc33-2fc3-4d3d-a248-3855f7eb3a6a","Type":"ContainerStarted","Data":"0e733dfb0f8893d7f794b0fd7f27d207d711dda2f6004e0ff0ef6ee1ea817be9"}
Apr 17 14:25:00.096406 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:00.096362 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-9qc7k" podStartSLOduration=253.923473372 podStartE2EDuration="4m15.096347591s" podCreationTimestamp="2026-04-17 14:20:45 +0000 UTC" firstStartedPulling="2026-04-17 14:24:57.702526279 +0000 UTC m=+252.838533812" lastFinishedPulling="2026-04-17 14:24:58.875400502 +0000 UTC m=+254.011408031" observedRunningTime="2026-04-17 14:25:00.095187897 +0000 UTC m=+255.231195445" watchObservedRunningTime="2026-04-17 14:25:00.096347591 +0000 UTC m=+255.232355139"
Apr 17 14:25:23.815530 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:25:23.815478 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv" podUID="4a326186-1f33-45e3-bf03-b51a1846d9da"
Apr 17 14:25:24.133631 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:24.133604 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv"
Apr 17 14:25:24.817363 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:25:24.817316 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-dxkfs" podUID="da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e"
Apr 17 14:25:24.817726 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:25:24.817363 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-wf85s" podUID="059af305-b8f4-4631-aba4-e42fe75f0259"
Apr 17 14:25:25.135703 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:25.135673 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wf85s"
Apr 17 14:25:25.135883 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:25.135673 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dxkfs"
Apr 17 14:25:27.730016 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:27.729987 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ch6nv\" (UID: \"4a326186-1f33-45e3-bf03-b51a1846d9da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv"
Apr 17 14:25:27.732517 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:27.732483 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4a326186-1f33-45e3-bf03-b51a1846d9da-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ch6nv\" (UID: \"4a326186-1f33-45e3-bf03-b51a1846d9da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv"
Apr 17 14:25:27.737062 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:27.737042 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-h8hrw\""
Apr 17 14:25:27.744380 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:27.744366 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv"
Apr 17 14:25:27.830825 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:27.830789 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert\") pod \"ingress-canary-dxkfs\" (UID: \"da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e\") " pod="openshift-ingress-canary/ingress-canary-dxkfs"
Apr 17 14:25:27.833320 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:27.833297 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e-cert\") pod \"ingress-canary-dxkfs\" (UID: \"da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e\") " pod="openshift-ingress-canary/ingress-canary-dxkfs"
Apr 17 14:25:27.839869 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:27.839844 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-w9wr2\""
Apr 17 14:25:27.847084 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:27.847059 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dxkfs"
Apr 17 14:25:27.858758 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:27.858736 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv"]
Apr 17 14:25:27.861325 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:25:27.861301 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a326186_1f33_45e3_bf03_b51a1846d9da.slice/crio-ea344df3bc7926876ec57dc665257c2f3e9d687b032eca64cbc09f30c5621846 WatchSource:0}: Error finding container ea344df3bc7926876ec57dc665257c2f3e9d687b032eca64cbc09f30c5621846: Status 404 returned error can't find the container with id ea344df3bc7926876ec57dc665257c2f3e9d687b032eca64cbc09f30c5621846
Apr 17 14:25:27.932256 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:27.932225 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls\") pod \"dns-default-wf85s\" (UID: \"059af305-b8f4-4631-aba4-e42fe75f0259\") " pod="openshift-dns/dns-default-wf85s"
Apr 17 14:25:27.934627 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:27.934602 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/059af305-b8f4-4631-aba4-e42fe75f0259-metrics-tls\") pod \"dns-default-wf85s\" (UID: \"059af305-b8f4-4631-aba4-e42fe75f0259\") " pod="openshift-dns/dns-default-wf85s"
Apr 17 14:25:27.957784 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:27.957759 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dxkfs"]
Apr 17 14:25:27.960418 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:25:27.960392 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda67d3ea_dc78_4d70_8ed3_1bcb7edf4d9e.slice/crio-d36b0869fc9e5941290282e0569fe8cbc540681722dd1f279a0b1a24fe34c478 WatchSource:0}: Error finding container d36b0869fc9e5941290282e0569fe8cbc540681722dd1f279a0b1a24fe34c478: Status 404 returned error can't find the container with id d36b0869fc9e5941290282e0569fe8cbc540681722dd1f279a0b1a24fe34c478
Apr 17 14:25:28.140128 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:28.140104 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rkhth\""
Apr 17 14:25:28.147112 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:28.147084 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wf85s"
Apr 17 14:25:28.147962 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:28.147855 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dxkfs" event={"ID":"da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e","Type":"ContainerStarted","Data":"d36b0869fc9e5941290282e0569fe8cbc540681722dd1f279a0b1a24fe34c478"}
Apr 17 14:25:28.148971 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:28.148946 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv" event={"ID":"4a326186-1f33-45e3-bf03-b51a1846d9da","Type":"ContainerStarted","Data":"ea344df3bc7926876ec57dc665257c2f3e9d687b032eca64cbc09f30c5621846"}
Apr 17 14:25:28.261610 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:28.261579 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wf85s"]
Apr 17 14:25:28.264349 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:25:28.264312 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod059af305_b8f4_4631_aba4_e42fe75f0259.slice/crio-8edfc219b60bb645cf7aa0887d6d2d223eb13859af4d9f42b2eb4ee1b92dd206 WatchSource:0}: Error finding container 8edfc219b60bb645cf7aa0887d6d2d223eb13859af4d9f42b2eb4ee1b92dd206: Status 404 returned error can't find the container with id 8edfc219b60bb645cf7aa0887d6d2d223eb13859af4d9f42b2eb4ee1b92dd206
Apr 17 14:25:29.154719 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:29.154682 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv" event={"ID":"4a326186-1f33-45e3-bf03-b51a1846d9da","Type":"ContainerStarted","Data":"20322f9d47f6f6c3919ee9b36892d4381ab478b0ff1ed873ae77ca0b0a5d8b53"}
Apr 17 14:25:29.156030 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:29.155998 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wf85s" event={"ID":"059af305-b8f4-4631-aba4-e42fe75f0259","Type":"ContainerStarted","Data":"8edfc219b60bb645cf7aa0887d6d2d223eb13859af4d9f42b2eb4ee1b92dd206"}
Apr 17 14:25:29.173576 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:29.173526 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ch6nv" podStartSLOduration=261.215342493 podStartE2EDuration="4m22.173509605s" podCreationTimestamp="2026-04-17 14:21:07 +0000 UTC" firstStartedPulling="2026-04-17 14:25:27.863590457 +0000 UTC m=+282.999597988" lastFinishedPulling="2026-04-17 14:25:28.821757571 +0000 UTC m=+283.957765100" observedRunningTime="2026-04-17 14:25:29.171207167 +0000 UTC m=+284.307214727" watchObservedRunningTime="2026-04-17 14:25:29.173509605 +0000 UTC m=+284.309517155"
Apr 17 14:25:30.160733 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:30.160704 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dxkfs" event={"ID":"da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e","Type":"ContainerStarted","Data":"d70db4d24e9c6529bf4d47fb05fd1c2a1fe0da73f33cc8ed4f0f56e29f682055"}
Apr 17 14:25:30.178365 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:30.178215 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dxkfs" podStartSLOduration=251.048438399 podStartE2EDuration="4m13.178195825s" podCreationTimestamp="2026-04-17 14:21:17 +0000 UTC" firstStartedPulling="2026-04-17 14:25:27.96236381 +0000 UTC m=+283.098371336" lastFinishedPulling="2026-04-17 14:25:30.092121223 +0000 UTC m=+285.228128762" observedRunningTime="2026-04-17 14:25:30.177866559 +0000 UTC m=+285.313874106" watchObservedRunningTime="2026-04-17 14:25:30.178195825 +0000 UTC m=+285.314203376"
Apr 17 14:25:31.165030 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:31.164995 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wf85s" event={"ID":"059af305-b8f4-4631-aba4-e42fe75f0259","Type":"ContainerStarted","Data":"536157884b35d2a01dc378501b731ed3fc8f81ca29113f88b0f3e086d2694662"}
Apr 17 14:25:31.165030 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:31.165033 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wf85s" event={"ID":"059af305-b8f4-4631-aba4-e42fe75f0259","Type":"ContainerStarted","Data":"22b875cef81a7269f8b351c2c89406a2ad72337731a256840fc66e7357e0f528"}
Apr 17 14:25:31.183052 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:31.183010 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wf85s" podStartSLOduration=252.355269981 podStartE2EDuration="4m14.182997216s" podCreationTimestamp="2026-04-17 14:21:17 +0000 UTC" firstStartedPulling="2026-04-17 14:25:28.266143421 +0000 UTC m=+283.402150947" lastFinishedPulling="2026-04-17 14:25:30.093870652 +0000 UTC m=+285.229878182" observedRunningTime="2026-04-17 14:25:31.181557094 +0000 UTC m=+286.317564652" watchObservedRunningTime="2026-04-17 14:25:31.182997216 +0000 UTC m=+286.319004763"
Apr 17 14:25:32.167155 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:32.167123 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-wf85s"
Apr 17 14:25:42.171367 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:42.171335 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wf85s"
Apr 17 14:25:45.281292 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:25:45.281251 2577 kubelet.go:1628] "Image garbage collection succeeded"
Apr 17 14:27:37.271996 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:37.271952 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx"]
Apr 17 14:27:37.274772 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:37.274754 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx"
Apr 17 14:27:37.277534 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:37.277509 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 14:27:37.277656 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:37.277620 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 14:27:37.278953 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:37.278939 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-dzmq6\""
Apr 17 14:27:37.284794 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:37.284772 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx"]
Apr 17 14:27:37.428129 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:37.428090 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b0b6f18-4414-44f7-a36d-629f45d1a710-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx\" (UID: \"1b0b6f18-4414-44f7-a36d-629f45d1a710\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx"
Apr 17 14:27:37.428129 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:37.428130 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b0b6f18-4414-44f7-a36d-629f45d1a710-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx\" (UID: \"1b0b6f18-4414-44f7-a36d-629f45d1a710\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx"
Apr 17 14:27:37.428362 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:37.428166 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4x7z\" (UniqueName: \"kubernetes.io/projected/1b0b6f18-4414-44f7-a36d-629f45d1a710-kube-api-access-d4x7z\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx\" (UID: \"1b0b6f18-4414-44f7-a36d-629f45d1a710\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx"
Apr 17 14:27:37.529105 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:37.529014 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b0b6f18-4414-44f7-a36d-629f45d1a710-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx\" (UID: \"1b0b6f18-4414-44f7-a36d-629f45d1a710\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx"
Apr 17 14:27:37.529105 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:37.529051 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b0b6f18-4414-44f7-a36d-629f45d1a710-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx\" (UID: \"1b0b6f18-4414-44f7-a36d-629f45d1a710\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx"
Apr 17 14:27:37.529105 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:37.529088 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4x7z\" (UniqueName: \"kubernetes.io/projected/1b0b6f18-4414-44f7-a36d-629f45d1a710-kube-api-access-d4x7z\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx\" (UID: \"1b0b6f18-4414-44f7-a36d-629f45d1a710\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx"
Apr 17 14:27:37.529921 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:37.529898 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b0b6f18-4414-44f7-a36d-629f45d1a710-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx\" (UID: \"1b0b6f18-4414-44f7-a36d-629f45d1a710\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx"
Apr 17 14:27:37.530011 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:37.529933 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b0b6f18-4414-44f7-a36d-629f45d1a710-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx\" (UID: \"1b0b6f18-4414-44f7-a36d-629f45d1a710\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx"
Apr 17 14:27:37.538006 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:37.537981 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4x7z\" (UniqueName: \"kubernetes.io/projected/1b0b6f18-4414-44f7-a36d-629f45d1a710-kube-api-access-d4x7z\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx\" (UID: \"1b0b6f18-4414-44f7-a36d-629f45d1a710\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx"
Apr 17 14:27:37.583003 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:37.582979 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx"
Apr 17 14:27:37.695866 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:37.695843 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx"]
Apr 17 14:27:37.697958 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:27:37.697918 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b0b6f18_4414_44f7_a36d_629f45d1a710.slice/crio-a2bf5498871eef438d12a007aed97ec9b7f004990e42d41a53c1c1cf963e3c57 WatchSource:0}: Error finding container a2bf5498871eef438d12a007aed97ec9b7f004990e42d41a53c1c1cf963e3c57: Status 404 returned error can't find the container with id a2bf5498871eef438d12a007aed97ec9b7f004990e42d41a53c1c1cf963e3c57
Apr 17 14:27:37.699740 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:37.699721 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 14:27:38.484383 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:38.484179 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx" event={"ID":"1b0b6f18-4414-44f7-a36d-629f45d1a710","Type":"ContainerStarted","Data":"a2bf5498871eef438d12a007aed97ec9b7f004990e42d41a53c1c1cf963e3c57"}
Apr 17 14:27:44.500753 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:44.500722 2577 generic.go:358] "Generic (PLEG): container finished" podID="1b0b6f18-4414-44f7-a36d-629f45d1a710" containerID="ac33a51c5d2abce2746cd804339056b34111669c64aff176ecd380e5a29cc1c3" exitCode=0
Apr 17 14:27:44.501105 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:44.500806 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx" event={"ID":"1b0b6f18-4414-44f7-a36d-629f45d1a710","Type":"ContainerDied","Data":"ac33a51c5d2abce2746cd804339056b34111669c64aff176ecd380e5a29cc1c3"}
Apr 17 14:27:46.506989 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:46.506958 2577 generic.go:358] "Generic (PLEG): container finished" podID="1b0b6f18-4414-44f7-a36d-629f45d1a710" containerID="7971e7bd10df915e03c79dcc53daaa82b2243c1061268a64d53d67c338abbd87" exitCode=0
Apr 17 14:27:46.507380 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:46.507015 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx" event={"ID":"1b0b6f18-4414-44f7-a36d-629f45d1a710","Type":"ContainerDied","Data":"7971e7bd10df915e03c79dcc53daaa82b2243c1061268a64d53d67c338abbd87"}
Apr 17 14:27:55.531631 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:55.531594 2577 generic.go:358] "Generic (PLEG): container finished" podID="1b0b6f18-4414-44f7-a36d-629f45d1a710" containerID="8e99d35c4fe13870bcb92ccd395a3cb98abb873db0e229dcff94317c85f96ab2" exitCode=0
Apr 17 14:27:55.532018 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:55.531646 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx" event={"ID":"1b0b6f18-4414-44f7-a36d-629f45d1a710","Type":"ContainerDied","Data":"8e99d35c4fe13870bcb92ccd395a3cb98abb873db0e229dcff94317c85f96ab2"}
Apr 17 14:27:56.651969 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:56.651943 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx"
Apr 17 14:27:56.775598 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:56.775569 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4x7z\" (UniqueName: \"kubernetes.io/projected/1b0b6f18-4414-44f7-a36d-629f45d1a710-kube-api-access-d4x7z\") pod \"1b0b6f18-4414-44f7-a36d-629f45d1a710\" (UID: \"1b0b6f18-4414-44f7-a36d-629f45d1a710\") "
Apr 17 14:27:56.775760 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:56.775617 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b0b6f18-4414-44f7-a36d-629f45d1a710-bundle\") pod \"1b0b6f18-4414-44f7-a36d-629f45d1a710\" (UID: \"1b0b6f18-4414-44f7-a36d-629f45d1a710\") "
Apr 17 14:27:56.775760 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:56.775653 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b0b6f18-4414-44f7-a36d-629f45d1a710-util\") pod \"1b0b6f18-4414-44f7-a36d-629f45d1a710\" (UID: \"1b0b6f18-4414-44f7-a36d-629f45d1a710\") "
Apr 17 14:27:56.776219 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:56.776193 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b0b6f18-4414-44f7-a36d-629f45d1a710-bundle" (OuterVolumeSpecName: "bundle") pod "1b0b6f18-4414-44f7-a36d-629f45d1a710" (UID: "1b0b6f18-4414-44f7-a36d-629f45d1a710"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 14:27:56.777616 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:56.777594 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b0b6f18-4414-44f7-a36d-629f45d1a710-kube-api-access-d4x7z" (OuterVolumeSpecName: "kube-api-access-d4x7z") pod "1b0b6f18-4414-44f7-a36d-629f45d1a710" (UID: "1b0b6f18-4414-44f7-a36d-629f45d1a710"). InnerVolumeSpecName "kube-api-access-d4x7z". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:27:56.781776 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:56.781736 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b0b6f18-4414-44f7-a36d-629f45d1a710-util" (OuterVolumeSpecName: "util") pod "1b0b6f18-4414-44f7-a36d-629f45d1a710" (UID: "1b0b6f18-4414-44f7-a36d-629f45d1a710"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 14:27:56.876099 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:56.876065 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d4x7z\" (UniqueName: \"kubernetes.io/projected/1b0b6f18-4414-44f7-a36d-629f45d1a710-kube-api-access-d4x7z\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\""
Apr 17 14:27:56.876099 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:56.876098 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b0b6f18-4414-44f7-a36d-629f45d1a710-bundle\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\""
Apr 17 14:27:56.876297 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:56.876112 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b0b6f18-4414-44f7-a36d-629f45d1a710-util\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\""
Apr 17 14:27:57.538118 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:57.538051 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx" event={"ID":"1b0b6f18-4414-44f7-a36d-629f45d1a710","Type":"ContainerDied","Data":"a2bf5498871eef438d12a007aed97ec9b7f004990e42d41a53c1c1cf963e3c57"}
Apr 17 14:27:57.538118 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:57.538083 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2bf5498871eef438d12a007aed97ec9b7f004990e42d41a53c1c1cf963e3c57"
Apr 17 14:27:57.538118 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:27:57.538085 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e597vgx"
Apr 17 14:28:11.946180 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:11.946142 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8"]
Apr 17 14:28:11.946668 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:11.946376 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b0b6f18-4414-44f7-a36d-629f45d1a710" containerName="util"
Apr 17 14:28:11.946668 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:11.946387 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b0b6f18-4414-44f7-a36d-629f45d1a710" containerName="util"
Apr 17 14:28:11.946668 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:11.946396 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b0b6f18-4414-44f7-a36d-629f45d1a710" containerName="pull"
Apr 17 14:28:11.946668 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:11.946401 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b0b6f18-4414-44f7-a36d-629f45d1a710" containerName="pull"
Apr 17 14:28:11.946668 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:11.946417 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b0b6f18-4414-44f7-a36d-629f45d1a710" containerName="extract"
Apr 17 14:28:11.946668 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:11.946423 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b0b6f18-4414-44f7-a36d-629f45d1a710" containerName="extract"
Apr 17 14:28:11.946668 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:11.946460 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b0b6f18-4414-44f7-a36d-629f45d1a710" containerName="extract"
Apr 17 14:28:11.952536 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:11.952514 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8"
Apr 17 14:28:11.956174 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:11.956145 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-dzmq6\""
Apr 17 14:28:11.956174 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:11.956147 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 14:28:11.956393 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:11.956189 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 14:28:11.956393 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:11.956185 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8"]
Apr 17 14:28:12.081412 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:12.081376 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxks5\" (UniqueName: \"kubernetes.io/projected/a2dfbcd6-fd52-4e7f-8658-aad4321d37e4-kube-api-access-gxks5\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8\" (UID: \"a2dfbcd6-fd52-4e7f-8658-aad4321d37e4\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8"
Apr 17 14:28:12.081586 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:12.081423 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2dfbcd6-fd52-4e7f-8658-aad4321d37e4-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8\" (UID: \"a2dfbcd6-fd52-4e7f-8658-aad4321d37e4\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8"
Apr 17 14:28:12.081586 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:12.081510 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a2dfbcd6-fd52-4e7f-8658-aad4321d37e4-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8\" (UID: \"a2dfbcd6-fd52-4e7f-8658-aad4321d37e4\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8"
Apr 17 14:28:12.182674 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:12.182646 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a2dfbcd6-fd52-4e7f-8658-aad4321d37e4-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8\" (UID: \"a2dfbcd6-fd52-4e7f-8658-aad4321d37e4\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8"
Apr 17 14:28:12.182819 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:12.182693 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxks5\" (UniqueName: \"kubernetes.io/projected/a2dfbcd6-fd52-4e7f-8658-aad4321d37e4-kube-api-access-gxks5\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8\" (UID: \"a2dfbcd6-fd52-4e7f-8658-aad4321d37e4\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8"
Apr 17 14:28:12.182819 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:12.182719 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2dfbcd6-fd52-4e7f-8658-aad4321d37e4-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8\" (UID: \"a2dfbcd6-fd52-4e7f-8658-aad4321d37e4\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8"
Apr 17 14:28:12.183012 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:12.182992 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a2dfbcd6-fd52-4e7f-8658-aad4321d37e4-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8\" (UID: \"a2dfbcd6-fd52-4e7f-8658-aad4321d37e4\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8"
Apr 17 14:28:12.183069 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:12.183047 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2dfbcd6-fd52-4e7f-8658-aad4321d37e4-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8\" (UID: \"a2dfbcd6-fd52-4e7f-8658-aad4321d37e4\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8"
Apr 17 14:28:12.191038 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:12.191022 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxks5\" (UniqueName: \"kubernetes.io/projected/a2dfbcd6-fd52-4e7f-8658-aad4321d37e4-kube-api-access-gxks5\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8\" (UID: \"a2dfbcd6-fd52-4e7f-8658-aad4321d37e4\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8"
Apr 17 14:28:12.261930 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:12.261867 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8"
Apr 17 14:28:12.375936 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:12.375904 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8"]
Apr 17 14:28:12.379350 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:28:12.379324 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2dfbcd6_fd52_4e7f_8658_aad4321d37e4.slice/crio-586ddcdcda48f8d47cb1a54f39f4dc0358e1153cd23d522cdf0df60370e47407 WatchSource:0}: Error finding container 586ddcdcda48f8d47cb1a54f39f4dc0358e1153cd23d522cdf0df60370e47407: Status 404 returned error can't find the container with id 586ddcdcda48f8d47cb1a54f39f4dc0358e1153cd23d522cdf0df60370e47407
Apr 17 14:28:12.576787 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:12.576750 2577 generic.go:358] "Generic (PLEG): container finished" podID="a2dfbcd6-fd52-4e7f-8658-aad4321d37e4" containerID="5452950382fb8bc1677e8158f9f63d2c98bc2c7e37ee1bd469c919e0831b6c35" exitCode=0
Apr 17 14:28:12.576925 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:12.576819 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8" event={"ID":"a2dfbcd6-fd52-4e7f-8658-aad4321d37e4","Type":"ContainerDied","Data":"5452950382fb8bc1677e8158f9f63d2c98bc2c7e37ee1bd469c919e0831b6c35"}
Apr 17 14:28:12.576925 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:12.576847 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8" event={"ID":"a2dfbcd6-fd52-4e7f-8658-aad4321d37e4","Type":"ContainerStarted","Data":"586ddcdcda48f8d47cb1a54f39f4dc0358e1153cd23d522cdf0df60370e47407"}
Apr 17 14:28:19.596136 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:19.596102 2577 generic.go:358] "Generic (PLEG): container finished" podID="a2dfbcd6-fd52-4e7f-8658-aad4321d37e4" containerID="06cca52cf62b2f13a1d2c56c02b50f6fdbb5d78e64f387b644769f4d16252e6f" exitCode=0
Apr 17 14:28:19.596486 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:19.596148 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8" event={"ID":"a2dfbcd6-fd52-4e7f-8658-aad4321d37e4","Type":"ContainerDied","Data":"06cca52cf62b2f13a1d2c56c02b50f6fdbb5d78e64f387b644769f4d16252e6f"}
Apr 17 14:28:20.600031 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:20.599984 2577 generic.go:358] "Generic (PLEG): container finished" podID="a2dfbcd6-fd52-4e7f-8658-aad4321d37e4" containerID="100a5f3997408b6f6552219b70ea2878dbb7e932b7e1c1f72ef97cebaf9ef40f" exitCode=0
Apr 17 14:28:20.600402 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:20.600077 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8" event={"ID":"a2dfbcd6-fd52-4e7f-8658-aad4321d37e4","Type":"ContainerDied","Data":"100a5f3997408b6f6552219b70ea2878dbb7e932b7e1c1f72ef97cebaf9ef40f"}
Apr 17 14:28:21.714387 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:21.714366 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8"
Apr 17 14:28:21.742684 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:21.742657 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2dfbcd6-fd52-4e7f-8658-aad4321d37e4-bundle\") pod \"a2dfbcd6-fd52-4e7f-8658-aad4321d37e4\" (UID: \"a2dfbcd6-fd52-4e7f-8658-aad4321d37e4\") "
Apr 17 14:28:21.742830 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:21.742713 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxks5\" (UniqueName: \"kubernetes.io/projected/a2dfbcd6-fd52-4e7f-8658-aad4321d37e4-kube-api-access-gxks5\") pod \"a2dfbcd6-fd52-4e7f-8658-aad4321d37e4\" (UID: \"a2dfbcd6-fd52-4e7f-8658-aad4321d37e4\") "
Apr 17 14:28:21.742830 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:21.742755 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a2dfbcd6-fd52-4e7f-8658-aad4321d37e4-util\") pod \"a2dfbcd6-fd52-4e7f-8658-aad4321d37e4\" (UID: \"a2dfbcd6-fd52-4e7f-8658-aad4321d37e4\") "
Apr 17 14:28:21.743096 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:21.743067 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2dfbcd6-fd52-4e7f-8658-aad4321d37e4-bundle" (OuterVolumeSpecName: "bundle") pod "a2dfbcd6-fd52-4e7f-8658-aad4321d37e4" (UID: "a2dfbcd6-fd52-4e7f-8658-aad4321d37e4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 14:28:21.744724 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:21.744698 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2dfbcd6-fd52-4e7f-8658-aad4321d37e4-kube-api-access-gxks5" (OuterVolumeSpecName: "kube-api-access-gxks5") pod "a2dfbcd6-fd52-4e7f-8658-aad4321d37e4" (UID: "a2dfbcd6-fd52-4e7f-8658-aad4321d37e4"). InnerVolumeSpecName "kube-api-access-gxks5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:28:21.747194 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:21.747172 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2dfbcd6-fd52-4e7f-8658-aad4321d37e4-util" (OuterVolumeSpecName: "util") pod "a2dfbcd6-fd52-4e7f-8658-aad4321d37e4" (UID: "a2dfbcd6-fd52-4e7f-8658-aad4321d37e4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 14:28:21.843404 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:21.843373 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a2dfbcd6-fd52-4e7f-8658-aad4321d37e4-util\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\""
Apr 17 14:28:21.843404 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:21.843397 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a2dfbcd6-fd52-4e7f-8658-aad4321d37e4-bundle\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\""
Apr 17 14:28:21.843404 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:21.843406 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gxks5\" (UniqueName: \"kubernetes.io/projected/a2dfbcd6-fd52-4e7f-8658-aad4321d37e4-kube-api-access-gxks5\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\""
Apr 17 14:28:22.356213 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:22.356182 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-qprvh"]
Apr 17 14:28:22.356440 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:22.356427 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2dfbcd6-fd52-4e7f-8658-aad4321d37e4" containerName="extract"
Apr 17 14:28:22.356493 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:22.356442 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2dfbcd6-fd52-4e7f-8658-aad4321d37e4" containerName="extract"
Apr 17 14:28:22.356493 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:22.356452 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2dfbcd6-fd52-4e7f-8658-aad4321d37e4" containerName="pull"
Apr 17 14:28:22.356493 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:22.356458 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2dfbcd6-fd52-4e7f-8658-aad4321d37e4" containerName="pull"
Apr 17 14:28:22.356493 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:22.356465 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2dfbcd6-fd52-4e7f-8658-aad4321d37e4" containerName="util"
Apr 17 14:28:22.356493 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:22.356474 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2dfbcd6-fd52-4e7f-8658-aad4321d37e4" containerName="util"
Apr 17 14:28:22.356637 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:22.356516 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a2dfbcd6-fd52-4e7f-8658-aad4321d37e4" containerName="extract"
Apr 17 14:28:22.363497 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:22.363469 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-qprvh"
Apr 17 14:28:22.366187 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:22.366163 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-qprvh"]
Apr 17 14:28:22.366333 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:22.366267 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 17 14:28:22.366448 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:22.366425 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 17 14:28:22.366569 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:22.366553 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-2sqnq\""
Apr 17 14:28:22.447319 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:22.447263 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d5dc26a-c24f-42e2-a807-6e0aa525e448-bound-sa-token\") pod \"cert-manager-759f64656b-qprvh\" (UID: \"3d5dc26a-c24f-42e2-a807-6e0aa525e448\") " pod="cert-manager/cert-manager-759f64656b-qprvh"
Apr 17 14:28:22.447319 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:22.447316 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47v42\" (UniqueName: \"kubernetes.io/projected/3d5dc26a-c24f-42e2-a807-6e0aa525e448-kube-api-access-47v42\") pod \"cert-manager-759f64656b-qprvh\" (UID: \"3d5dc26a-c24f-42e2-a807-6e0aa525e448\") " pod="cert-manager/cert-manager-759f64656b-qprvh"
Apr 17 14:28:22.547673 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:22.547641 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d5dc26a-c24f-42e2-a807-6e0aa525e448-bound-sa-token\") pod \"cert-manager-759f64656b-qprvh\" (UID: \"3d5dc26a-c24f-42e2-a807-6e0aa525e448\") " pod="cert-manager/cert-manager-759f64656b-qprvh"
Apr 17 14:28:22.547673 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:22.547673 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47v42\" (UniqueName: \"kubernetes.io/projected/3d5dc26a-c24f-42e2-a807-6e0aa525e448-kube-api-access-47v42\") pod \"cert-manager-759f64656b-qprvh\" (UID: \"3d5dc26a-c24f-42e2-a807-6e0aa525e448\") " pod="cert-manager/cert-manager-759f64656b-qprvh"
Apr 17 14:28:22.556574 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:22.556543 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d5dc26a-c24f-42e2-a807-6e0aa525e448-bound-sa-token\") pod \"cert-manager-759f64656b-qprvh\" (UID: \"3d5dc26a-c24f-42e2-a807-6e0aa525e448\") " pod="cert-manager/cert-manager-759f64656b-qprvh"
Apr 17 14:28:22.556672 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:22.556660 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47v42\" (UniqueName: \"kubernetes.io/projected/3d5dc26a-c24f-42e2-a807-6e0aa525e448-kube-api-access-47v42\") pod \"cert-manager-759f64656b-qprvh\" (UID: \"3d5dc26a-c24f-42e2-a807-6e0aa525e448\") " pod="cert-manager/cert-manager-759f64656b-qprvh"
Apr 17 14:28:22.606464 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:22.606403 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8" event={"ID":"a2dfbcd6-fd52-4e7f-8658-aad4321d37e4","Type":"ContainerDied","Data":"586ddcdcda48f8d47cb1a54f39f4dc0358e1153cd23d522cdf0df60370e47407"}
Apr 17 14:28:22.606464 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:22.606436 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="586ddcdcda48f8d47cb1a54f39f4dc0358e1153cd23d522cdf0df60370e47407"
Apr 17 14:28:22.606464 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:22.606416 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ft9wm8"
Apr 17 14:28:22.672927 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:22.672896 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-qprvh"
Apr 17 14:28:22.784638 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:22.784615 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-qprvh"]
Apr 17 14:28:22.786810 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:28:22.786774 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d5dc26a_c24f_42e2_a807_6e0aa525e448.slice/crio-480a62816fe25fc1d37a9c8fbaf49ccd56e92f45c34e2c4fe35427680b3fc302 WatchSource:0}: Error finding container 480a62816fe25fc1d37a9c8fbaf49ccd56e92f45c34e2c4fe35427680b3fc302: Status 404 returned error can't find the container with id 480a62816fe25fc1d37a9c8fbaf49ccd56e92f45c34e2c4fe35427680b3fc302
Apr 17 14:28:23.610743 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:23.610711 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-qprvh" event={"ID":"3d5dc26a-c24f-42e2-a807-6e0aa525e448","Type":"ContainerStarted","Data":"480a62816fe25fc1d37a9c8fbaf49ccd56e92f45c34e2c4fe35427680b3fc302"}
Apr 17 14:28:25.621965 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:25.621924 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-qprvh" event={"ID":"3d5dc26a-c24f-42e2-a807-6e0aa525e448","Type":"ContainerStarted","Data":"fc91397334781c216c5e7172af8bcb7fac652d8323451280b22db4d94efc2932"}
Apr 17 14:28:25.638901 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:25.638806 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-qprvh" podStartSLOduration=1.053725598 podStartE2EDuration="3.638793533s" podCreationTimestamp="2026-04-17 14:28:22 +0000 UTC" firstStartedPulling="2026-04-17 14:28:22.788717364 +0000 UTC m=+457.924724890" lastFinishedPulling="2026-04-17 14:28:25.373785299 +0000 UTC m=+460.509792825" observedRunningTime="2026-04-17 14:28:25.637606788 +0000 UTC m=+460.773614336" watchObservedRunningTime="2026-04-17 14:28:25.638793533 +0000 UTC m=+460.774801080"
Apr 17 14:28:31.272854 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:31.272820 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg"]
Apr 17 14:28:31.276372 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:31.276354 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg"
Apr 17 14:28:31.279742 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:31.279723 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 14:28:31.280925 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:31.280907 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 14:28:31.280925 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:31.280916 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-dzmq6\""
Apr 17 14:28:31.284175 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:31.284154 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg"]
Apr 17 14:28:31.306511 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:31.306488 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf6s4\" (UniqueName: \"kubernetes.io/projected/cc64020b-62f4-43dd-981a-bad35648c6a8-kube-api-access-tf6s4\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg\" (UID: \"cc64020b-62f4-43dd-981a-bad35648c6a8\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg"
Apr 17 14:28:31.306630 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:31.306544 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc64020b-62f4-43dd-981a-bad35648c6a8-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg\" (UID: \"cc64020b-62f4-43dd-981a-bad35648c6a8\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg"
Apr 17 14:28:31.306630 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:31.306611 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cc64020b-62f4-43dd-981a-bad35648c6a8-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg\" (UID: \"cc64020b-62f4-43dd-981a-bad35648c6a8\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg"
Apr 17 14:28:31.406954 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:31.406920 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc64020b-62f4-43dd-981a-bad35648c6a8-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg\" (UID: \"cc64020b-62f4-43dd-981a-bad35648c6a8\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg"
Apr 17 14:28:31.406954 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:31.406959 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cc64020b-62f4-43dd-981a-bad35648c6a8-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg\" (UID: \"cc64020b-62f4-43dd-981a-bad35648c6a8\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg"
Apr 17 14:28:31.407137 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:31.407000 2577 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"kube-api-access-tf6s4\" (UniqueName: \"kubernetes.io/projected/cc64020b-62f4-43dd-981a-bad35648c6a8-kube-api-access-tf6s4\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg\" (UID: \"cc64020b-62f4-43dd-981a-bad35648c6a8\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg" Apr 17 14:28:31.407254 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:31.407236 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc64020b-62f4-43dd-981a-bad35648c6a8-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg\" (UID: \"cc64020b-62f4-43dd-981a-bad35648c6a8\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg" Apr 17 14:28:31.407366 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:31.407350 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cc64020b-62f4-43dd-981a-bad35648c6a8-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg\" (UID: \"cc64020b-62f4-43dd-981a-bad35648c6a8\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg" Apr 17 14:28:31.415345 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:31.415325 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf6s4\" (UniqueName: \"kubernetes.io/projected/cc64020b-62f4-43dd-981a-bad35648c6a8-kube-api-access-tf6s4\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg\" (UID: \"cc64020b-62f4-43dd-981a-bad35648c6a8\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg" Apr 17 14:28:31.585373 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:31.585345 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg" Apr 17 14:28:31.697721 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:31.697695 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg"] Apr 17 14:28:31.700229 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:28:31.700199 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc64020b_62f4_43dd_981a_bad35648c6a8.slice/crio-12035d0be15b93bdb5ae785c3e1f3013c28e0d8a5b1d062212aa8798a6c6e1c6 WatchSource:0}: Error finding container 12035d0be15b93bdb5ae785c3e1f3013c28e0d8a5b1d062212aa8798a6c6e1c6: Status 404 returned error can't find the container with id 12035d0be15b93bdb5ae785c3e1f3013c28e0d8a5b1d062212aa8798a6c6e1c6 Apr 17 14:28:32.641862 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:32.641828 2577 generic.go:358] "Generic (PLEG): container finished" podID="cc64020b-62f4-43dd-981a-bad35648c6a8" containerID="48ebe5db336b9e71a4268c0133e6ab57b459541b9e5e7cceddc1c64f35ff7fea" exitCode=0 Apr 17 14:28:32.641862 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:32.641867 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg" event={"ID":"cc64020b-62f4-43dd-981a-bad35648c6a8","Type":"ContainerDied","Data":"48ebe5db336b9e71a4268c0133e6ab57b459541b9e5e7cceddc1c64f35ff7fea"} Apr 17 14:28:32.642367 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:32.641889 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg" event={"ID":"cc64020b-62f4-43dd-981a-bad35648c6a8","Type":"ContainerStarted","Data":"12035d0be15b93bdb5ae785c3e1f3013c28e0d8a5b1d062212aa8798a6c6e1c6"} Apr 17 14:28:33.646064 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:33.646033 2577 generic.go:358] "Generic (PLEG): container finished" podID="cc64020b-62f4-43dd-981a-bad35648c6a8" containerID="3b33e42961f558d9648dad0717b9bde854946f427b4938c643646e6366a75a38" exitCode=0 Apr 17 14:28:33.646476 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:33.646117 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg" event={"ID":"cc64020b-62f4-43dd-981a-bad35648c6a8","Type":"ContainerDied","Data":"3b33e42961f558d9648dad0717b9bde854946f427b4938c643646e6366a75a38"} Apr 17 14:28:34.650312 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:34.650258 2577 generic.go:358] "Generic (PLEG): container finished" podID="cc64020b-62f4-43dd-981a-bad35648c6a8" containerID="9b39816811ebeb107078f5aad00f8b2c9083a7f76a7bbb90284498fa369cdda5" exitCode=0 Apr 17 14:28:34.650689 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:34.650345 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg" event={"ID":"cc64020b-62f4-43dd-981a-bad35648c6a8","Type":"ContainerDied","Data":"9b39816811ebeb107078f5aad00f8b2c9083a7f76a7bbb90284498fa369cdda5"} Apr 17 14:28:35.764734 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:35.764711 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg" Apr 17 14:28:35.838850 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:35.838810 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc64020b-62f4-43dd-981a-bad35648c6a8-bundle\") pod \"cc64020b-62f4-43dd-981a-bad35648c6a8\" (UID: \"cc64020b-62f4-43dd-981a-bad35648c6a8\") " Apr 17 14:28:35.839021 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:35.838901 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tf6s4\" (UniqueName: \"kubernetes.io/projected/cc64020b-62f4-43dd-981a-bad35648c6a8-kube-api-access-tf6s4\") pod \"cc64020b-62f4-43dd-981a-bad35648c6a8\" (UID: \"cc64020b-62f4-43dd-981a-bad35648c6a8\") " Apr 17 14:28:35.839021 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:35.838946 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cc64020b-62f4-43dd-981a-bad35648c6a8-util\") pod \"cc64020b-62f4-43dd-981a-bad35648c6a8\" (UID: \"cc64020b-62f4-43dd-981a-bad35648c6a8\") " Apr 17 14:28:35.839520 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:35.839494 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc64020b-62f4-43dd-981a-bad35648c6a8-bundle" (OuterVolumeSpecName: "bundle") pod "cc64020b-62f4-43dd-981a-bad35648c6a8" (UID: "cc64020b-62f4-43dd-981a-bad35648c6a8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:28:35.840836 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:35.840816 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc64020b-62f4-43dd-981a-bad35648c6a8-kube-api-access-tf6s4" (OuterVolumeSpecName: "kube-api-access-tf6s4") pod "cc64020b-62f4-43dd-981a-bad35648c6a8" (UID: "cc64020b-62f4-43dd-981a-bad35648c6a8"). InnerVolumeSpecName "kube-api-access-tf6s4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:28:35.844173 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:35.844134 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc64020b-62f4-43dd-981a-bad35648c6a8-util" (OuterVolumeSpecName: "util") pod "cc64020b-62f4-43dd-981a-bad35648c6a8" (UID: "cc64020b-62f4-43dd-981a-bad35648c6a8"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:28:35.940373 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:35.940298 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cc64020b-62f4-43dd-981a-bad35648c6a8-bundle\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:28:35.940373 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:35.940335 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tf6s4\" (UniqueName: \"kubernetes.io/projected/cc64020b-62f4-43dd-981a-bad35648c6a8-kube-api-access-tf6s4\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:28:35.940373 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:35.940346 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cc64020b-62f4-43dd-981a-bad35648c6a8-util\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:28:36.656315 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:36.656260 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg" event={"ID":"cc64020b-62f4-43dd-981a-bad35648c6a8","Type":"ContainerDied","Data":"12035d0be15b93bdb5ae785c3e1f3013c28e0d8a5b1d062212aa8798a6c6e1c6"} Apr 17 14:28:36.656315 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:36.656313 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12035d0be15b93bdb5ae785c3e1f3013c28e0d8a5b1d062212aa8798a6c6e1c6" Apr 17 14:28:36.656315 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:36.656294 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5hfrdg" Apr 17 14:28:48.010025 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.009991 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-78vxn"] Apr 17 14:28:48.010508 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.010243 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc64020b-62f4-43dd-981a-bad35648c6a8" containerName="util" Apr 17 14:28:48.010508 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.010258 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc64020b-62f4-43dd-981a-bad35648c6a8" containerName="util" Apr 17 14:28:48.010508 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.010283 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc64020b-62f4-43dd-981a-bad35648c6a8" containerName="pull" Apr 17 14:28:48.010508 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.010289 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc64020b-62f4-43dd-981a-bad35648c6a8" containerName="pull" Apr 17 14:28:48.010508 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.010304 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc64020b-62f4-43dd-981a-bad35648c6a8" containerName="extract" Apr 17 14:28:48.010508 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.010311 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc64020b-62f4-43dd-981a-bad35648c6a8" containerName="extract" Apr 17 14:28:48.010508 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.010367 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="cc64020b-62f4-43dd-981a-bad35648c6a8" containerName="extract" 
Apr 17 14:28:48.014362 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.014338 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-78vxn" Apr 17 14:28:48.017166 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.017141 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 14:28:48.017316 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.017183 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 14:28:48.017316 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.017266 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 14:28:48.017473 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.017459 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 14:28:48.017540 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.017478 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-268bl\"" Apr 17 14:28:48.028139 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.028121 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-78vxn"] Apr 17 14:28:48.132190 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.132159 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f10dd800-e357-419d-9b22-147b74e0bc47-webhook-cert\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-78vxn\" (UID: \"f10dd800-e357-419d-9b22-147b74e0bc47\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-78vxn" Apr 17 14:28:48.132361 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.132200 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f10dd800-e357-419d-9b22-147b74e0bc47-apiservice-cert\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-78vxn\" (UID: \"f10dd800-e357-419d-9b22-147b74e0bc47\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-78vxn" Apr 17 14:28:48.132361 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.132259 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktkqx\" (UniqueName: \"kubernetes.io/projected/f10dd800-e357-419d-9b22-147b74e0bc47-kube-api-access-ktkqx\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-78vxn\" (UID: \"f10dd800-e357-419d-9b22-147b74e0bc47\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-78vxn" Apr 17 14:28:48.232754 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.232721 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktkqx\" (UniqueName: \"kubernetes.io/projected/f10dd800-e357-419d-9b22-147b74e0bc47-kube-api-access-ktkqx\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-78vxn\" (UID: \"f10dd800-e357-419d-9b22-147b74e0bc47\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-78vxn" Apr 17 14:28:48.232904 ip-10-0-132-119 kubenswrapper[2577]: 
I0417 14:28:48.232763 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f10dd800-e357-419d-9b22-147b74e0bc47-webhook-cert\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-78vxn\" (UID: \"f10dd800-e357-419d-9b22-147b74e0bc47\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-78vxn" Apr 17 14:28:48.232904 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.232789 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f10dd800-e357-419d-9b22-147b74e0bc47-apiservice-cert\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-78vxn\" (UID: \"f10dd800-e357-419d-9b22-147b74e0bc47\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-78vxn" Apr 17 14:28:48.235099 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.235074 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f10dd800-e357-419d-9b22-147b74e0bc47-apiservice-cert\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-78vxn\" (UID: \"f10dd800-e357-419d-9b22-147b74e0bc47\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-78vxn" Apr 17 14:28:48.235231 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.235209 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f10dd800-e357-419d-9b22-147b74e0bc47-webhook-cert\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-78vxn\" (UID: \"f10dd800-e357-419d-9b22-147b74e0bc47\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-78vxn" Apr 17 14:28:48.247974 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.247951 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktkqx\" (UniqueName: \"kubernetes.io/projected/f10dd800-e357-419d-9b22-147b74e0bc47-kube-api-access-ktkqx\") pod \"opendatahub-operator-controller-manager-58c8f88b6d-78vxn\" (UID: \"f10dd800-e357-419d-9b22-147b74e0bc47\") " pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-78vxn" Apr 17 14:28:48.293261 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.293201 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq"] Apr 17 14:28:48.296216 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.296202 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq" Apr 17 14:28:48.299063 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.299036 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-dzmq6\"" Apr 17 14:28:48.299193 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.299081 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 14:28:48.299245 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.299231 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 14:28:48.304978 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.304948 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq"] Apr 17 14:28:48.323867 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.323848 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-78vxn" Apr 17 14:28:48.333725 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.333664 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h4ll\" (UniqueName: \"kubernetes.io/projected/2e29063c-a313-4005-9d85-2ac364bca5bc-kube-api-access-6h4ll\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq\" (UID: \"2e29063c-a313-4005-9d85-2ac364bca5bc\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq" Apr 17 14:28:48.333831 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.333759 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e29063c-a313-4005-9d85-2ac364bca5bc-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq\" (UID: \"2e29063c-a313-4005-9d85-2ac364bca5bc\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq" Apr 17 14:28:48.333899 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.333864 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e29063c-a313-4005-9d85-2ac364bca5bc-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq\" (UID: \"2e29063c-a313-4005-9d85-2ac364bca5bc\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq" Apr 17 14:28:48.434792 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.434759 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e29063c-a313-4005-9d85-2ac364bca5bc-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq\" (UID: \"2e29063c-a313-4005-9d85-2ac364bca5bc\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq" Apr 17 14:28:48.434941 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.434811 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6h4ll\" (UniqueName: \"kubernetes.io/projected/2e29063c-a313-4005-9d85-2ac364bca5bc-kube-api-access-6h4ll\") pod 
\"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq\" (UID: \"2e29063c-a313-4005-9d85-2ac364bca5bc\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq" Apr 17 14:28:48.434941 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.434844 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e29063c-a313-4005-9d85-2ac364bca5bc-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq\" (UID: \"2e29063c-a313-4005-9d85-2ac364bca5bc\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq" Apr 17 14:28:48.435138 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.435117 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e29063c-a313-4005-9d85-2ac364bca5bc-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq\" (UID: \"2e29063c-a313-4005-9d85-2ac364bca5bc\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq" Apr 17 14:28:48.435196 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.435149 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e29063c-a313-4005-9d85-2ac364bca5bc-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq\" (UID: \"2e29063c-a313-4005-9d85-2ac364bca5bc\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq" Apr 17 14:28:48.446105 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.446072 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-78vxn"] Apr 17 14:28:48.448542 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.448522 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h4ll\" (UniqueName: \"kubernetes.io/projected/2e29063c-a313-4005-9d85-2ac364bca5bc-kube-api-access-6h4ll\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq\" (UID: \"2e29063c-a313-4005-9d85-2ac364bca5bc\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq" Apr 17 14:28:48.450008 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:28:48.449985 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf10dd800_e357_419d_9b22_147b74e0bc47.slice/crio-8f01ca59cbad7206a0b51b19d82bb5304868f6c337dd00b8d5875c642dd7806c WatchSource:0}: Error finding container 8f01ca59cbad7206a0b51b19d82bb5304868f6c337dd00b8d5875c642dd7806c: Status 404 returned error can't find the container with id 8f01ca59cbad7206a0b51b19d82bb5304868f6c337dd00b8d5875c642dd7806c Apr 17 14:28:48.605305 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.605262 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq" Apr 17 14:28:48.692310 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.692260 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-78vxn" event={"ID":"f10dd800-e357-419d-9b22-147b74e0bc47","Type":"ContainerStarted","Data":"8f01ca59cbad7206a0b51b19d82bb5304868f6c337dd00b8d5875c642dd7806c"} Apr 17 14:28:48.721516 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:48.721494 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq"] Apr 17 14:28:48.723552 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:28:48.723531 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e29063c_a313_4005_9d85_2ac364bca5bc.slice/crio-67b04ca8cc05e4693836a8a14253f0e2a57f9d2e1e8853a4f6dc611936794d94 WatchSource:0}: Error finding container 67b04ca8cc05e4693836a8a14253f0e2a57f9d2e1e8853a4f6dc611936794d94: Status 404 returned error can't find the container with id 67b04ca8cc05e4693836a8a14253f0e2a57f9d2e1e8853a4f6dc611936794d94 Apr 17 14:28:49.697074 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:49.697034 2577 generic.go:358] "Generic (PLEG): container finished" podID="2e29063c-a313-4005-9d85-2ac364bca5bc" containerID="a3c8cabee2981d84f19ade067778727e927121d2c29ab13b55c1c9f5ba6231a7" exitCode=0 Apr 17 14:28:49.697527 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:49.697110 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq" event={"ID":"2e29063c-a313-4005-9d85-2ac364bca5bc","Type":"ContainerDied","Data":"a3c8cabee2981d84f19ade067778727e927121d2c29ab13b55c1c9f5ba6231a7"} Apr 17 14:28:49.697527 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:49.697139 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq" event={"ID":"2e29063c-a313-4005-9d85-2ac364bca5bc","Type":"ContainerStarted","Data":"67b04ca8cc05e4693836a8a14253f0e2a57f9d2e1e8853a4f6dc611936794d94"} Apr 17 14:28:49.734432 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:49.734367 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5b89f4cf56-s7c45"] Apr 17 14:28:49.741328 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:49.741298 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-s7c45" Apr 17 14:28:49.744913 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:49.744881 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 14:28:49.744913 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:49.744903 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 17 14:28:49.745105 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:49.745010 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 17 14:28:49.745154 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:49.745145 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 17 14:28:49.745243 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:49.745226 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-pml4c\"" Apr 17 14:28:49.745333 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:49.745235 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5b89f4cf56-s7c45"] Apr 17 14:28:49.745333 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:49.745241 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 14:28:49.847795 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:49.847765 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbb47d3a-1bc6-4625-8bca-9418a2f18d10-cert\") pod \"lws-controller-manager-5b89f4cf56-s7c45\" (UID: \"dbb47d3a-1bc6-4625-8bca-9418a2f18d10\") " pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-s7c45" Apr 17 14:28:49.847972 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:49.847843 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/dbb47d3a-1bc6-4625-8bca-9418a2f18d10-metrics-cert\") pod \"lws-controller-manager-5b89f4cf56-s7c45\" (UID: \"dbb47d3a-1bc6-4625-8bca-9418a2f18d10\") " pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-s7c45" Apr 17 14:28:49.847972 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:49.847877 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69lzp\" (UniqueName: \"kubernetes.io/projected/dbb47d3a-1bc6-4625-8bca-9418a2f18d10-kube-api-access-69lzp\") pod \"lws-controller-manager-5b89f4cf56-s7c45\" (UID: \"dbb47d3a-1bc6-4625-8bca-9418a2f18d10\") " pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-s7c45" Apr 17 14:28:49.847972 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:49.847914 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/dbb47d3a-1bc6-4625-8bca-9418a2f18d10-manager-config\") pod \"lws-controller-manager-5b89f4cf56-s7c45\" (UID: \"dbb47d3a-1bc6-4625-8bca-9418a2f18d10\") " pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-s7c45" Apr 17 14:28:49.949194 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:49.949106 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/dbb47d3a-1bc6-4625-8bca-9418a2f18d10-metrics-cert\") pod \"lws-controller-manager-5b89f4cf56-s7c45\" (UID: \"dbb47d3a-1bc6-4625-8bca-9418a2f18d10\") " pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-s7c45" Apr 17 14:28:49.949194 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:49.949169 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69lzp\" (UniqueName: \"kubernetes.io/projected/dbb47d3a-1bc6-4625-8bca-9418a2f18d10-kube-api-access-69lzp\") pod \"lws-controller-manager-5b89f4cf56-s7c45\" (UID: \"dbb47d3a-1bc6-4625-8bca-9418a2f18d10\") " pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-s7c45" Apr 17 14:28:49.949432 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:49.949208 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/dbb47d3a-1bc6-4625-8bca-9418a2f18d10-manager-config\") pod \"lws-controller-manager-5b89f4cf56-s7c45\" (UID: \"dbb47d3a-1bc6-4625-8bca-9418a2f18d10\") " pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-s7c45" Apr 17 14:28:49.949432 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:49.949249 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbb47d3a-1bc6-4625-8bca-9418a2f18d10-cert\") pod \"lws-controller-manager-5b89f4cf56-s7c45\" (UID: \"dbb47d3a-1bc6-4625-8bca-9418a2f18d10\") " pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-s7c45" Apr 17 14:28:49.950000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:49.949969 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/dbb47d3a-1bc6-4625-8bca-9418a2f18d10-manager-config\") pod \"lws-controller-manager-5b89f4cf56-s7c45\" (UID: \"dbb47d3a-1bc6-4625-8bca-9418a2f18d10\") " pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-s7c45" Apr 17 14:28:49.952000 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:49.951976 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dbb47d3a-1bc6-4625-8bca-9418a2f18d10-cert\") pod \"lws-controller-manager-5b89f4cf56-s7c45\" (UID: \"dbb47d3a-1bc6-4625-8bca-9418a2f18d10\") " pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-s7c45" Apr 17 14:28:49.952318 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:49.952291 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/dbb47d3a-1bc6-4625-8bca-9418a2f18d10-metrics-cert\") pod \"lws-controller-manager-5b89f4cf56-s7c45\" (UID: \"dbb47d3a-1bc6-4625-8bca-9418a2f18d10\") " pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-s7c45" Apr 17 14:28:49.960228 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:49.960207 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69lzp\" (UniqueName: \"kubernetes.io/projected/dbb47d3a-1bc6-4625-8bca-9418a2f18d10-kube-api-access-69lzp\") pod \"lws-controller-manager-5b89f4cf56-s7c45\" (UID: \"dbb47d3a-1bc6-4625-8bca-9418a2f18d10\") " pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-s7c45" Apr 17 14:28:50.053028 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:50.052988 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-s7c45" Apr 17 14:28:51.031500 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:51.031365 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5b89f4cf56-s7c45"] Apr 17 14:28:51.086054 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:28:51.086021 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbb47d3a_1bc6_4625_8bca_9418a2f18d10.slice/crio-7c657407eeca8b507f52dab59118d8913e1c07949541d0a97972a255987d60f0 WatchSource:0}: Error finding container 7c657407eeca8b507f52dab59118d8913e1c07949541d0a97972a255987d60f0: Status 404 returned error can't find the container with id 7c657407eeca8b507f52dab59118d8913e1c07949541d0a97972a255987d60f0 Apr 17 14:28:51.705042 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:51.705012 2577 generic.go:358] "Generic (PLEG): container finished" podID="2e29063c-a313-4005-9d85-2ac364bca5bc" containerID="fee31c471c185b4b1f14054b412d6b4b526f02fe1750a0b7f8ca5a19f09d5772" exitCode=0 Apr 17 14:28:51.705183 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:51.705098 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq" event={"ID":"2e29063c-a313-4005-9d85-2ac364bca5bc","Type":"ContainerDied","Data":"fee31c471c185b4b1f14054b412d6b4b526f02fe1750a0b7f8ca5a19f09d5772"} Apr 17 14:28:51.706581 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:51.706562 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-78vxn" event={"ID":"f10dd800-e357-419d-9b22-147b74e0bc47","Type":"ContainerStarted","Data":"7f396aba815d2a6240007ba2146f7439ce730961ea25fe1b62e5d9bbed63da79"} Apr 17 14:28:51.706708 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:51.706698 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-78vxn" Apr 17 14:28:51.707575 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:51.707556 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-s7c45" event={"ID":"dbb47d3a-1bc6-4625-8bca-9418a2f18d10","Type":"ContainerStarted","Data":"7c657407eeca8b507f52dab59118d8913e1c07949541d0a97972a255987d60f0"} Apr 17 14:28:51.744153 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:51.744096 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-78vxn" podStartSLOduration=2.282188928 podStartE2EDuration="4.744078631s" podCreationTimestamp="2026-04-17 14:28:47 +0000 UTC" firstStartedPulling="2026-04-17 14:28:48.451541196 +0000 UTC m=+483.587548723" lastFinishedPulling="2026-04-17 14:28:50.913430896 +0000 UTC m=+486.049438426" observedRunningTime="2026-04-17 14:28:51.74272184 +0000 UTC m=+486.878729426" watchObservedRunningTime="2026-04-17 14:28:51.744078631 +0000 UTC m=+486.880086260" Apr 17 14:28:52.712584 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:52.712548 2577 generic.go:358] "Generic (PLEG): container finished" podID="2e29063c-a313-4005-9d85-2ac364bca5bc" containerID="b452b8a83f2fb9d85a6c576d3ab5c6e2d6f9e0e33c84079102b2a56ed9ad7937" exitCode=0 Apr 17 14:28:52.713040 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:52.712631 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq" event={"ID":"2e29063c-a313-4005-9d85-2ac364bca5bc","Type":"ContainerDied","Data":"b452b8a83f2fb9d85a6c576d3ab5c6e2d6f9e0e33c84079102b2a56ed9ad7937"} Apr 17 14:28:53.834037 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:53.834010 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq" Apr 17 14:28:53.880816 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:53.880787 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e29063c-a313-4005-9d85-2ac364bca5bc-bundle\") pod \"2e29063c-a313-4005-9d85-2ac364bca5bc\" (UID: \"2e29063c-a313-4005-9d85-2ac364bca5bc\") " Apr 17 14:28:53.880816 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:53.880819 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e29063c-a313-4005-9d85-2ac364bca5bc-util\") pod \"2e29063c-a313-4005-9d85-2ac364bca5bc\" (UID: \"2e29063c-a313-4005-9d85-2ac364bca5bc\") " Apr 17 14:28:53.881043 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:53.880863 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6h4ll\" (UniqueName: \"kubernetes.io/projected/2e29063c-a313-4005-9d85-2ac364bca5bc-kube-api-access-6h4ll\") pod \"2e29063c-a313-4005-9d85-2ac364bca5bc\" (UID: \"2e29063c-a313-4005-9d85-2ac364bca5bc\") " Apr 17 14:28:53.881737 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:53.881709 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e29063c-a313-4005-9d85-2ac364bca5bc-bundle" (OuterVolumeSpecName: "bundle") pod "2e29063c-a313-4005-9d85-2ac364bca5bc" (UID: "2e29063c-a313-4005-9d85-2ac364bca5bc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:28:53.882872 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:53.882849 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e29063c-a313-4005-9d85-2ac364bca5bc-kube-api-access-6h4ll" (OuterVolumeSpecName: "kube-api-access-6h4ll") pod "2e29063c-a313-4005-9d85-2ac364bca5bc" (UID: "2e29063c-a313-4005-9d85-2ac364bca5bc"). InnerVolumeSpecName "kube-api-access-6h4ll". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:28:53.886222 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:53.886193 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e29063c-a313-4005-9d85-2ac364bca5bc-util" (OuterVolumeSpecName: "util") pod "2e29063c-a313-4005-9d85-2ac364bca5bc" (UID: "2e29063c-a313-4005-9d85-2ac364bca5bc"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:28:53.981624 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:53.981560 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6h4ll\" (UniqueName: \"kubernetes.io/projected/2e29063c-a313-4005-9d85-2ac364bca5bc-kube-api-access-6h4ll\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:28:53.981624 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:53.981585 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e29063c-a313-4005-9d85-2ac364bca5bc-bundle\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:28:53.981624 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:53.981594 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e29063c-a313-4005-9d85-2ac364bca5bc-util\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:28:54.721106 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:54.721068 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq" event={"ID":"2e29063c-a313-4005-9d85-2ac364bca5bc","Type":"ContainerDied","Data":"67b04ca8cc05e4693836a8a14253f0e2a57f9d2e1e8853a4f6dc611936794d94"} Apr 17 14:28:54.721106 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:54.721092 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9g82nq" Apr 17 14:28:54.721106 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:54.721110 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67b04ca8cc05e4693836a8a14253f0e2a57f9d2e1e8853a4f6dc611936794d94" Apr 17 14:28:55.725245 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:55.725162 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-s7c45" event={"ID":"dbb47d3a-1bc6-4625-8bca-9418a2f18d10","Type":"ContainerStarted","Data":"6842849c94053f4447b0313e26805da0b0a3b8ada3382119a56a495887157065"} Apr 17 14:28:55.725245 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:55.725233 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-s7c45" Apr 17 14:28:55.742893 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:28:55.742841 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-s7c45" podStartSLOduration=2.428558497 podStartE2EDuration="6.742827704s" podCreationTimestamp="2026-04-17 14:28:49 +0000 UTC" firstStartedPulling="2026-04-17 14:28:51.087934126 +0000 UTC m=+486.223941656" lastFinishedPulling="2026-04-17 14:28:55.402203337 +0000 UTC m=+490.538210863" observedRunningTime="2026-04-17 14:28:55.740717325 +0000 UTC m=+490.876724884" watchObservedRunningTime="2026-04-17 14:28:55.742827704 +0000 UTC m=+490.878835297" Apr 17 14:29:02.714779 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:02.714749 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-58c8f88b6d-78vxn" Apr 17 14:29:06.731252 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:06.731221 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5b89f4cf56-s7c45" Apr 
17 14:29:07.082336 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.082308 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9"] Apr 17 14:29:07.082581 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.082564 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e29063c-a313-4005-9d85-2ac364bca5bc" containerName="util" Apr 17 14:29:07.082627 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.082581 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e29063c-a313-4005-9d85-2ac364bca5bc" containerName="util" Apr 17 14:29:07.082627 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.082590 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e29063c-a313-4005-9d85-2ac364bca5bc" containerName="pull" Apr 17 14:29:07.082627 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.082595 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e29063c-a313-4005-9d85-2ac364bca5bc" containerName="pull" Apr 17 14:29:07.082627 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.082614 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e29063c-a313-4005-9d85-2ac364bca5bc" containerName="extract" Apr 17 14:29:07.082627 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.082619 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e29063c-a313-4005-9d85-2ac364bca5bc" containerName="extract" Apr 17 14:29:07.082779 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.082659 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e29063c-a313-4005-9d85-2ac364bca5bc" containerName="extract" Apr 17 14:29:07.085634 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.085619 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9" Apr 17 14:29:07.088457 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.088421 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 14:29:07.088457 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.088436 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 14:29:07.088457 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.088429 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-dzmq6\"" Apr 17 14:29:07.093046 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.093021 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9"] Apr 17 14:29:07.185209 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.185168 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6m8l\" (UniqueName: \"kubernetes.io/projected/c23fa698-713b-4b8c-bd2c-f098563837eb-kube-api-access-w6m8l\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9\" (UID: \"c23fa698-713b-4b8c-bd2c-f098563837eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9" Apr 17 14:29:07.185209 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.185207 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c23fa698-713b-4b8c-bd2c-f098563837eb-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9\" (UID: \"c23fa698-713b-4b8c-bd2c-f098563837eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9" Apr 17 14:29:07.185485 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.185288 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c23fa698-713b-4b8c-bd2c-f098563837eb-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9\" (UID: \"c23fa698-713b-4b8c-bd2c-f098563837eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9" Apr 17 14:29:07.286641 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.286608 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w6m8l\" (UniqueName: \"kubernetes.io/projected/c23fa698-713b-4b8c-bd2c-f098563837eb-kube-api-access-w6m8l\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9\" (UID: \"c23fa698-713b-4b8c-bd2c-f098563837eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9" Apr 17 14:29:07.286641 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.286644 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c23fa698-713b-4b8c-bd2c-f098563837eb-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9\" (UID: \"c23fa698-713b-4b8c-bd2c-f098563837eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9" Apr 17 14:29:07.286836 ip-10-0-132-119 kubenswrapper[2577]: I0417 
14:29:07.286680 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c23fa698-713b-4b8c-bd2c-f098563837eb-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9\" (UID: \"c23fa698-713b-4b8c-bd2c-f098563837eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9" Apr 17 14:29:07.287010 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.286995 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c23fa698-713b-4b8c-bd2c-f098563837eb-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9\" (UID: \"c23fa698-713b-4b8c-bd2c-f098563837eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9" Apr 17 14:29:07.287048 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.287017 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c23fa698-713b-4b8c-bd2c-f098563837eb-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9\" (UID: \"c23fa698-713b-4b8c-bd2c-f098563837eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9" Apr 17 14:29:07.294704 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.294681 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6m8l\" (UniqueName: \"kubernetes.io/projected/c23fa698-713b-4b8c-bd2c-f098563837eb-kube-api-access-w6m8l\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9\" (UID: \"c23fa698-713b-4b8c-bd2c-f098563837eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9" Apr 17 14:29:07.395959 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.395887 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9" Apr 17 14:29:07.512118 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.512090 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9"] Apr 17 14:29:07.514808 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:29:07.514782 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc23fa698_713b_4b8c_bd2c_f098563837eb.slice/crio-421946e6b49b4d95e959f45bf0866f3711f3389014504446a4a40565d21328a2 WatchSource:0}: Error finding container 421946e6b49b4d95e959f45bf0866f3711f3389014504446a4a40565d21328a2: Status 404 returned error can't find the container with id 421946e6b49b4d95e959f45bf0866f3711f3389014504446a4a40565d21328a2 Apr 17 14:29:07.767852 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.767760 2577 generic.go:358] "Generic (PLEG): container finished" podID="c23fa698-713b-4b8c-bd2c-f098563837eb" containerID="bbf215f83f6427fd8078c73c657845ae3199d601324d730cfc2cc50e2e4c6b8d" exitCode=0 Apr 17 14:29:07.768309 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.767849 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9" event={"ID":"c23fa698-713b-4b8c-bd2c-f098563837eb","Type":"ContainerDied","Data":"bbf215f83f6427fd8078c73c657845ae3199d601324d730cfc2cc50e2e4c6b8d"} Apr 17 14:29:07.768309 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:07.767882 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9" event={"ID":"c23fa698-713b-4b8c-bd2c-f098563837eb","Type":"ContainerStarted","Data":"421946e6b49b4d95e959f45bf0866f3711f3389014504446a4a40565d21328a2"} Apr 17 14:29:08.772889 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:08.772858 2577 generic.go:358] "Generic (PLEG): container finished" podID="c23fa698-713b-4b8c-bd2c-f098563837eb" containerID="bb403a97b728b5107437d87d52a3aaf24ecf367efac6b1a3c9c0d5b0a005e757" exitCode=0 Apr 17 14:29:08.772889 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:08.772883 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9" event={"ID":"c23fa698-713b-4b8c-bd2c-f098563837eb","Type":"ContainerDied","Data":"bb403a97b728b5107437d87d52a3aaf24ecf367efac6b1a3c9c0d5b0a005e757"} Apr 17 14:29:09.778046 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:09.778012 2577 generic.go:358] "Generic (PLEG): container finished" podID="c23fa698-713b-4b8c-bd2c-f098563837eb" containerID="c3e5db0b557beea521b11360a564e8e10b05c59a6b9c6a4c4aceba8aaaf3d800" exitCode=0 Apr 17 14:29:09.778418 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:09.778083 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9" event={"ID":"c23fa698-713b-4b8c-bd2c-f098563837eb","Type":"ContainerDied","Data":"c3e5db0b557beea521b11360a564e8e10b05c59a6b9c6a4c4aceba8aaaf3d800"} Apr 17 14:29:10.897886 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:10.897864 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9" Apr 17 14:29:11.015301 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:11.015257 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c23fa698-713b-4b8c-bd2c-f098563837eb-bundle\") pod \"c23fa698-713b-4b8c-bd2c-f098563837eb\" (UID: \"c23fa698-713b-4b8c-bd2c-f098563837eb\") " Apr 17 14:29:11.015478 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:11.015313 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6m8l\" (UniqueName: \"kubernetes.io/projected/c23fa698-713b-4b8c-bd2c-f098563837eb-kube-api-access-w6m8l\") pod \"c23fa698-713b-4b8c-bd2c-f098563837eb\" (UID: \"c23fa698-713b-4b8c-bd2c-f098563837eb\") " Apr 17 14:29:11.015478 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:11.015352 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c23fa698-713b-4b8c-bd2c-f098563837eb-util\") pod \"c23fa698-713b-4b8c-bd2c-f098563837eb\" (UID: \"c23fa698-713b-4b8c-bd2c-f098563837eb\") " Apr 17 14:29:11.016134 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:11.016094 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c23fa698-713b-4b8c-bd2c-f098563837eb-bundle" (OuterVolumeSpecName: "bundle") pod "c23fa698-713b-4b8c-bd2c-f098563837eb" (UID: "c23fa698-713b-4b8c-bd2c-f098563837eb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:29:11.017491 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:11.017453 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c23fa698-713b-4b8c-bd2c-f098563837eb-kube-api-access-w6m8l" (OuterVolumeSpecName: "kube-api-access-w6m8l") pod "c23fa698-713b-4b8c-bd2c-f098563837eb" (UID: "c23fa698-713b-4b8c-bd2c-f098563837eb"). InnerVolumeSpecName "kube-api-access-w6m8l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:29:11.022693 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:11.022665 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c23fa698-713b-4b8c-bd2c-f098563837eb-util" (OuterVolumeSpecName: "util") pod "c23fa698-713b-4b8c-bd2c-f098563837eb" (UID: "c23fa698-713b-4b8c-bd2c-f098563837eb"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:29:11.115918 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:11.115882 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c23fa698-713b-4b8c-bd2c-f098563837eb-bundle\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:29:11.115918 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:11.115908 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w6m8l\" (UniqueName: \"kubernetes.io/projected/c23fa698-713b-4b8c-bd2c-f098563837eb-kube-api-access-w6m8l\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:29:11.115918 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:11.115918 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c23fa698-713b-4b8c-bd2c-f098563837eb-util\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:29:11.786163 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:11.786074 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9" Apr 17 14:29:11.786163 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:11.786080 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835s9zr9" event={"ID":"c23fa698-713b-4b8c-bd2c-f098563837eb","Type":"ContainerDied","Data":"421946e6b49b4d95e959f45bf0866f3711f3389014504446a4a40565d21328a2"} Apr 17 14:29:11.786163 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:11.786107 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="421946e6b49b4d95e959f45bf0866f3711f3389014504446a4a40565d21328a2" Apr 17 14:29:16.547664 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:16.547623 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg"] Apr 17 14:29:16.548019 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:16.547945 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c23fa698-713b-4b8c-bd2c-f098563837eb" containerName="extract" Apr 17 14:29:16.548019 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:16.547958 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c23fa698-713b-4b8c-bd2c-f098563837eb" containerName="extract" Apr 17 14:29:16.548019 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:16.547984 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c23fa698-713b-4b8c-bd2c-f098563837eb" containerName="pull" Apr 17 14:29:16.548019 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:16.547992 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c23fa698-713b-4b8c-bd2c-f098563837eb" containerName="pull" Apr 17 14:29:16.548019 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:16.548003 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c23fa698-713b-4b8c-bd2c-f098563837eb" containerName="util" Apr 17 14:29:16.548019 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:16.548010 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c23fa698-713b-4b8c-bd2c-f098563837eb" containerName="util" Apr 17 14:29:16.548193 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:16.548095 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c23fa698-713b-4b8c-bd2c-f098563837eb" 
containerName="extract" Apr 17 14:29:16.552467 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:16.552448 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg" Apr 17 14:29:16.558318 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:16.558294 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 14:29:16.559504 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:16.559484 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 14:29:16.559623 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:16.559486 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-dzmq6\"" Apr 17 14:29:16.578330 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:16.578302 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg"] Apr 17 14:29:16.661796 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:16.661767 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/471bebf2-eaee-4a25-8e44-558ac2c4c2d4-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg\" (UID: \"471bebf2-eaee-4a25-8e44-558ac2c4c2d4\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg" Apr 17 14:29:16.661942 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:16.661801 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/471bebf2-eaee-4a25-8e44-558ac2c4c2d4-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg\" (UID: \"471bebf2-eaee-4a25-8e44-558ac2c4c2d4\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg" Apr 17 14:29:16.661942 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:16.661824 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r6qz\" (UniqueName: \"kubernetes.io/projected/471bebf2-eaee-4a25-8e44-558ac2c4c2d4-kube-api-access-2r6qz\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg\" (UID: \"471bebf2-eaee-4a25-8e44-558ac2c4c2d4\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg" Apr 17 14:29:16.762953 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:16.762920 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/471bebf2-eaee-4a25-8e44-558ac2c4c2d4-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg\" (UID: \"471bebf2-eaee-4a25-8e44-558ac2c4c2d4\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg" Apr 17 14:29:16.763100 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:16.762961 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2r6qz\" (UniqueName: \"kubernetes.io/projected/471bebf2-eaee-4a25-8e44-558ac2c4c2d4-kube-api-access-2r6qz\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg\" (UID: \"471bebf2-eaee-4a25-8e44-558ac2c4c2d4\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg" Apr 17 14:29:16.763100 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:16.763011 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/471bebf2-eaee-4a25-8e44-558ac2c4c2d4-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg\" (UID: \"471bebf2-eaee-4a25-8e44-558ac2c4c2d4\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg" Apr 17 14:29:16.763308 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:16.763291 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/471bebf2-eaee-4a25-8e44-558ac2c4c2d4-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg\" (UID: \"471bebf2-eaee-4a25-8e44-558ac2c4c2d4\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg" Apr 17 14:29:16.763343 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:16.763331 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/471bebf2-eaee-4a25-8e44-558ac2c4c2d4-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg\" (UID: \"471bebf2-eaee-4a25-8e44-558ac2c4c2d4\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg" Apr 17 14:29:16.772777 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:16.772746 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r6qz\" (UniqueName: \"kubernetes.io/projected/471bebf2-eaee-4a25-8e44-558ac2c4c2d4-kube-api-access-2r6qz\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg\" (UID: \"471bebf2-eaee-4a25-8e44-558ac2c4c2d4\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg" Apr 17 14:29:16.861672 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:16.861642 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg" Apr 17 14:29:16.991878 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:16.991854 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg"] Apr 17 14:29:16.993349 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:29:16.993321 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod471bebf2_eaee_4a25_8e44_558ac2c4c2d4.slice/crio-573aa3b2e74ca2775310caee0001e008bfe5864bdfd0fa8fed7d3fd42cbccc4d WatchSource:0}: Error finding container 573aa3b2e74ca2775310caee0001e008bfe5864bdfd0fa8fed7d3fd42cbccc4d: Status 404 returned error can't find the container with id 573aa3b2e74ca2775310caee0001e008bfe5864bdfd0fa8fed7d3fd42cbccc4d Apr 17 14:29:17.806668 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:17.806635 2577 generic.go:358] "Generic (PLEG): container finished" podID="471bebf2-eaee-4a25-8e44-558ac2c4c2d4" containerID="4c0311316f4d7243258e03aa8dbc9bc31b948ae8148f1e4ee271b6962b285ca4" exitCode=0 Apr 17 14:29:17.806979 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:17.806698 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg" event={"ID":"471bebf2-eaee-4a25-8e44-558ac2c4c2d4","Type":"ContainerDied","Data":"4c0311316f4d7243258e03aa8dbc9bc31b948ae8148f1e4ee271b6962b285ca4"} Apr 17 14:29:17.806979 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:17.806720 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg" event={"ID":"471bebf2-eaee-4a25-8e44-558ac2c4c2d4","Type":"ContainerStarted","Data":"573aa3b2e74ca2775310caee0001e008bfe5864bdfd0fa8fed7d3fd42cbccc4d"} Apr 17 14:29:19.815254 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:19.815213 2577 generic.go:358] "Generic (PLEG): container finished" podID="471bebf2-eaee-4a25-8e44-558ac2c4c2d4" containerID="bcd4121fd4b98a311462435ec6a678b611cc9850e4d8257662803d5086c7947f" exitCode=0 Apr 17 14:29:19.815715 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:19.815298 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg" event={"ID":"471bebf2-eaee-4a25-8e44-558ac2c4c2d4","Type":"ContainerDied","Data":"bcd4121fd4b98a311462435ec6a678b611cc9850e4d8257662803d5086c7947f"} Apr 17 14:29:20.820947 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:20.820912 2577 generic.go:358] "Generic (PLEG): container finished" podID="471bebf2-eaee-4a25-8e44-558ac2c4c2d4" containerID="aba61f73be1811e22dd8262520dfa591acaa583c312a59ee183c7613ceaf5201" exitCode=0 Apr 17 14:29:20.821318 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:20.820970 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg" event={"ID":"471bebf2-eaee-4a25-8e44-558ac2c4c2d4","Type":"ContainerDied","Data":"aba61f73be1811e22dd8262520dfa591acaa583c312a59ee183c7613ceaf5201"} Apr 17 14:29:21.940619 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:21.940598 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg" Apr 17 14:29:22.101290 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:22.101198 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/471bebf2-eaee-4a25-8e44-558ac2c4c2d4-bundle\") pod \"471bebf2-eaee-4a25-8e44-558ac2c4c2d4\" (UID: \"471bebf2-eaee-4a25-8e44-558ac2c4c2d4\") " Apr 17 14:29:22.101290 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:22.101242 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/471bebf2-eaee-4a25-8e44-558ac2c4c2d4-util\") pod \"471bebf2-eaee-4a25-8e44-558ac2c4c2d4\" (UID: \"471bebf2-eaee-4a25-8e44-558ac2c4c2d4\") " Apr 17 14:29:22.101457 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:22.101301 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r6qz\" (UniqueName: \"kubernetes.io/projected/471bebf2-eaee-4a25-8e44-558ac2c4c2d4-kube-api-access-2r6qz\") pod \"471bebf2-eaee-4a25-8e44-558ac2c4c2d4\" (UID: \"471bebf2-eaee-4a25-8e44-558ac2c4c2d4\") " Apr 17 14:29:22.102130 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:22.102101 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/471bebf2-eaee-4a25-8e44-558ac2c4c2d4-bundle" (OuterVolumeSpecName: "bundle") pod "471bebf2-eaee-4a25-8e44-558ac2c4c2d4" (UID: "471bebf2-eaee-4a25-8e44-558ac2c4c2d4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:29:22.103249 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:22.103227 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/471bebf2-eaee-4a25-8e44-558ac2c4c2d4-kube-api-access-2r6qz" (OuterVolumeSpecName: "kube-api-access-2r6qz") pod "471bebf2-eaee-4a25-8e44-558ac2c4c2d4" (UID: "471bebf2-eaee-4a25-8e44-558ac2c4c2d4"). InnerVolumeSpecName "kube-api-access-2r6qz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:29:22.106402 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:22.106380 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/471bebf2-eaee-4a25-8e44-558ac2c4c2d4-util" (OuterVolumeSpecName: "util") pod "471bebf2-eaee-4a25-8e44-558ac2c4c2d4" (UID: "471bebf2-eaee-4a25-8e44-558ac2c4c2d4"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:29:22.201750 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:22.201722 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/471bebf2-eaee-4a25-8e44-558ac2c4c2d4-bundle\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:29:22.201750 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:22.201746 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/471bebf2-eaee-4a25-8e44-558ac2c4c2d4-util\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:29:22.201750 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:22.201755 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2r6qz\" (UniqueName: \"kubernetes.io/projected/471bebf2-eaee-4a25-8e44-558ac2c4c2d4-kube-api-access-2r6qz\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:29:22.829798 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:22.829758 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg" event={"ID":"471bebf2-eaee-4a25-8e44-558ac2c4c2d4","Type":"ContainerDied","Data":"573aa3b2e74ca2775310caee0001e008bfe5864bdfd0fa8fed7d3fd42cbccc4d"} Apr 17 14:29:22.829798 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:22.829777 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c22r6cg" Apr 17 14:29:22.829798 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:22.829792 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="573aa3b2e74ca2775310caee0001e008bfe5864bdfd0fa8fed7d3fd42cbccc4d" Apr 17 14:29:32.549122 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.549053 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw"] Apr 17 14:29:32.549562 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.549342 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="471bebf2-eaee-4a25-8e44-558ac2c4c2d4" containerName="extract" Apr 17 14:29:32.549562 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.549354 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="471bebf2-eaee-4a25-8e44-558ac2c4c2d4" containerName="extract" Apr 17 14:29:32.549562 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.549367 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="471bebf2-eaee-4a25-8e44-558ac2c4c2d4" containerName="util" Apr 17 14:29:32.549562 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.549372 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="471bebf2-eaee-4a25-8e44-558ac2c4c2d4" containerName="util" Apr 17 14:29:32.549562 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.549382 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="471bebf2-eaee-4a25-8e44-558ac2c4c2d4" containerName="pull" Apr 17 14:29:32.549562 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.549387 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="471bebf2-eaee-4a25-8e44-558ac2c4c2d4" containerName="pull" Apr 17 14:29:32.549562 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.549424 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="471bebf2-eaee-4a25-8e44-558ac2c4c2d4" 
containerName="extract" Apr 17 14:29:32.553157 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.553133 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.556108 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.556085 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 14:29:32.556302 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.556267 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-dwxcc\"" Apr 17 14:29:32.556368 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.556338 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 14:29:32.556425 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.556346 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 17 14:29:32.565535 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.565514 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw"] Apr 17 14:29:32.674319 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.674287 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/01124663-3743-45ae-a6fa-311bde303bce-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.674475 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.674353 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/01124663-3743-45ae-a6fa-311bde303bce-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.674475 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.674413 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/01124663-3743-45ae-a6fa-311bde303bce-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.674475 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.674454 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwbbr\" (UniqueName: \"kubernetes.io/projected/01124663-3743-45ae-a6fa-311bde303bce-kube-api-access-lwbbr\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.674591 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.674478 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/01124663-3743-45ae-a6fa-311bde303bce-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.674591 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.674501 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/01124663-3743-45ae-a6fa-311bde303bce-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.674591 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.674532 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/01124663-3743-45ae-a6fa-311bde303bce-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.674591 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.674557 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/01124663-3743-45ae-a6fa-311bde303bce-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.674591 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.674573 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/01124663-3743-45ae-a6fa-311bde303bce-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.775767 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.775737 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/01124663-3743-45ae-a6fa-311bde303bce-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.775920 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.775779 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwbbr\" (UniqueName: \"kubernetes.io/projected/01124663-3743-45ae-a6fa-311bde303bce-kube-api-access-lwbbr\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.775920 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.775904 2577 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/01124663-3743-45ae-a6fa-311bde303bce-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.776024 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.775957 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/01124663-3743-45ae-a6fa-311bde303bce-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.776024 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.775989 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/01124663-3743-45ae-a6fa-311bde303bce-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.776121 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.776026 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/01124663-3743-45ae-a6fa-311bde303bce-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.776121 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.776063 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/01124663-3743-45ae-a6fa-311bde303bce-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.776121 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.776107 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/01124663-3743-45ae-a6fa-311bde303bce-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.776265 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.776117 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/01124663-3743-45ae-a6fa-311bde303bce-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.776347 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.776330 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/01124663-3743-45ae-a6fa-311bde303bce-istio-data\") pod 
\"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.776402 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.776369 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/01124663-3743-45ae-a6fa-311bde303bce-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.776462 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.776449 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/01124663-3743-45ae-a6fa-311bde303bce-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.776659 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.776634 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/01124663-3743-45ae-a6fa-311bde303bce-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.776798 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.776662 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/01124663-3743-45ae-a6fa-311bde303bce-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.778472 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.778449 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/01124663-3743-45ae-a6fa-311bde303bce-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.778472 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.778468 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/01124663-3743-45ae-a6fa-311bde303bce-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.786797 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.786770 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwbbr\" (UniqueName: \"kubernetes.io/projected/01124663-3743-45ae-a6fa-311bde303bce-kube-api-access-lwbbr\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.786899 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.786884 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/01124663-3743-45ae-a6fa-311bde303bce-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw\" (UID: \"01124663-3743-45ae-a6fa-311bde303bce\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.862684 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.862665 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:32.992072 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:32.992047 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw"] Apr 17 14:29:32.994069 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:29:32.994042 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01124663_3743_45ae_a6fa_311bde303bce.slice/crio-12171fd84fd5183dd4d2f30cfb4b27c822578fc4941be308f4437e7bc1e9176f WatchSource:0}: Error finding container 12171fd84fd5183dd4d2f30cfb4b27c822578fc4941be308f4437e7bc1e9176f: Status 404 returned error can't find the container with id 12171fd84fd5183dd4d2f30cfb4b27c822578fc4941be308f4437e7bc1e9176f Apr 17 14:29:33.870056 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:33.870013 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" event={"ID":"01124663-3743-45ae-a6fa-311bde303bce","Type":"ContainerStarted","Data":"12171fd84fd5183dd4d2f30cfb4b27c822578fc4941be308f4437e7bc1e9176f"} Apr 17 14:29:35.767916 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:35.767871 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 14:29:35.768138 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:35.767954 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 14:29:35.768138 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:35.767982 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 14:29:35.878860 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:35.878824 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" event={"ID":"01124663-3743-45ae-a6fa-311bde303bce","Type":"ContainerStarted","Data":"802c3a12aaecf642124a211e2ae833c81e17f9b88388f828a04d28a6d06da36c"} Apr 17 14:29:35.899404 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:35.899353 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" podStartSLOduration=1.127561448 podStartE2EDuration="3.899340056s" podCreationTimestamp="2026-04-17 14:29:32 +0000 UTC" firstStartedPulling="2026-04-17 14:29:32.995807691 
+0000 UTC m=+528.131815218" lastFinishedPulling="2026-04-17 14:29:35.767586289 +0000 UTC m=+530.903593826" observedRunningTime="2026-04-17 14:29:35.897119311 +0000 UTC m=+531.033126861" watchObservedRunningTime="2026-04-17 14:29:35.899340056 +0000 UTC m=+531.035347603" Apr 17 14:29:36.863517 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:36.863482 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:36.867982 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:36.867957 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:36.883408 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:36.883383 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:29:36.883992 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:29:36.883973 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw" Apr 17 14:30:05.928520 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:05.928487 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-pfc54"] Apr 17 14:30:05.937658 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:05.937634 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-pfc54" Apr 17 14:30:05.940656 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:05.940629 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 14:30:05.942095 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:05.942073 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 14:30:05.942494 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:05.942099 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-d49tf\"" Apr 17 14:30:05.945206 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:05.944997 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-pfc54"] Apr 17 14:30:06.007205 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:06.007181 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qpxx\" (UniqueName: \"kubernetes.io/projected/389cf8ed-834a-450c-a2c1-bda542e4cbe1-kube-api-access-8qpxx\") pod \"kuadrant-operator-catalog-pfc54\" (UID: \"389cf8ed-834a-450c-a2c1-bda542e4cbe1\") " pod="kuadrant-system/kuadrant-operator-catalog-pfc54" Apr 17 14:30:06.108439 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:06.108403 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qpxx\" (UniqueName: \"kubernetes.io/projected/389cf8ed-834a-450c-a2c1-bda542e4cbe1-kube-api-access-8qpxx\") pod \"kuadrant-operator-catalog-pfc54\" (UID: \"389cf8ed-834a-450c-a2c1-bda542e4cbe1\") " pod="kuadrant-system/kuadrant-operator-catalog-pfc54" Apr 17 14:30:06.116166 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:06.116140 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8qpxx\" (UniqueName: \"kubernetes.io/projected/389cf8ed-834a-450c-a2c1-bda542e4cbe1-kube-api-access-8qpxx\") pod \"kuadrant-operator-catalog-pfc54\" (UID: \"389cf8ed-834a-450c-a2c1-bda542e4cbe1\") " pod="kuadrant-system/kuadrant-operator-catalog-pfc54" Apr 17 14:30:06.248241 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:06.248185 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-pfc54" Apr 17 14:30:06.295361 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:06.295328 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-pfc54"] Apr 17 14:30:06.365291 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:06.365245 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-pfc54"] Apr 17 14:30:06.367040 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:30:06.366994 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod389cf8ed_834a_450c_a2c1_bda542e4cbe1.slice/crio-f1f8f9962fe1846bd4bb38103f85688f1afc0de4ed437d3a413f1eadf7a5c8fa WatchSource:0}: Error finding container f1f8f9962fe1846bd4bb38103f85688f1afc0de4ed437d3a413f1eadf7a5c8fa: Status 404 returned error can't find the container with id f1f8f9962fe1846bd4bb38103f85688f1afc0de4ed437d3a413f1eadf7a5c8fa Apr 17 14:30:06.502793 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:06.502729 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-rcxsj"] Apr 17 14:30:06.505398 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:06.505379 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-rcxsj" Apr 17 14:30:06.511619 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:06.511594 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-rcxsj"] Apr 17 14:30:06.612047 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:06.612018 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7ttn\" (UniqueName: \"kubernetes.io/projected/c7c6cb74-41c3-441f-9f18-37a646366315-kube-api-access-k7ttn\") pod \"kuadrant-operator-catalog-rcxsj\" (UID: \"c7c6cb74-41c3-441f-9f18-37a646366315\") " pod="kuadrant-system/kuadrant-operator-catalog-rcxsj" Apr 17 14:30:06.712757 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:06.712728 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7ttn\" (UniqueName: \"kubernetes.io/projected/c7c6cb74-41c3-441f-9f18-37a646366315-kube-api-access-k7ttn\") pod \"kuadrant-operator-catalog-rcxsj\" (UID: \"c7c6cb74-41c3-441f-9f18-37a646366315\") " pod="kuadrant-system/kuadrant-operator-catalog-rcxsj" Apr 17 14:30:06.720366 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:06.720346 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7ttn\" (UniqueName: \"kubernetes.io/projected/c7c6cb74-41c3-441f-9f18-37a646366315-kube-api-access-k7ttn\") pod \"kuadrant-operator-catalog-rcxsj\" (UID: \"c7c6cb74-41c3-441f-9f18-37a646366315\") " pod="kuadrant-system/kuadrant-operator-catalog-rcxsj" Apr 17 14:30:06.814638 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:06.814571 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-rcxsj" Apr 17 14:30:06.926613 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:06.926591 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-rcxsj"] Apr 17 14:30:06.928022 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:30:06.927997 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7c6cb74_41c3_441f_9f18_37a646366315.slice/crio-a4734fa9e5dc568ff3c9ebbd33382fdb069d3c09b7d0bd740687ef33f487e267 WatchSource:0}: Error finding container a4734fa9e5dc568ff3c9ebbd33382fdb069d3c09b7d0bd740687ef33f487e267: Status 404 returned error can't find the container with id a4734fa9e5dc568ff3c9ebbd33382fdb069d3c09b7d0bd740687ef33f487e267 Apr 17 14:30:06.980665 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:06.980639 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-pfc54" event={"ID":"389cf8ed-834a-450c-a2c1-bda542e4cbe1","Type":"ContainerStarted","Data":"f1f8f9962fe1846bd4bb38103f85688f1afc0de4ed437d3a413f1eadf7a5c8fa"} Apr 17 14:30:06.984081 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:06.984057 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-rcxsj" event={"ID":"c7c6cb74-41c3-441f-9f18-37a646366315","Type":"ContainerStarted","Data":"a4734fa9e5dc568ff3c9ebbd33382fdb069d3c09b7d0bd740687ef33f487e267"} Apr 17 14:30:08.993883 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:08.993791 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-rcxsj" event={"ID":"c7c6cb74-41c3-441f-9f18-37a646366315","Type":"ContainerStarted","Data":"863fa60642fe89bb2ae61743f853c21f515e0dd5520b4b62ca140542ba2caf8b"} Apr 17 14:30:08.995034 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:08.995009 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-pfc54" event={"ID":"389cf8ed-834a-450c-a2c1-bda542e4cbe1","Type":"ContainerStarted","Data":"2670357a2012f7975fc800403c8fd4b8d44ddba5d15315429507ba45c1301fad"} Apr 17 14:30:08.995167 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:08.995117 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-pfc54" podUID="389cf8ed-834a-450c-a2c1-bda542e4cbe1" containerName="registry-server" containerID="cri-o://2670357a2012f7975fc800403c8fd4b8d44ddba5d15315429507ba45c1301fad" gracePeriod=2 Apr 17 14:30:09.008587 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:09.008548 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-rcxsj" podStartSLOduration=1.3410223000000001 podStartE2EDuration="3.008535441s" podCreationTimestamp="2026-04-17 14:30:06 +0000 UTC" firstStartedPulling="2026-04-17 14:30:06.929369066 +0000 UTC m=+562.065376592" lastFinishedPulling="2026-04-17 14:30:08.596882206 +0000 UTC m=+563.732889733" observedRunningTime="2026-04-17 14:30:09.007860788 +0000 UTC m=+564.143868336" watchObservedRunningTime="2026-04-17 14:30:09.008535441 +0000 UTC m=+564.144542988" Apr 17 14:30:09.023701 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:09.023664 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-pfc54" podStartSLOduration=1.7910985560000001 podStartE2EDuration="4.023651413s" 
podCreationTimestamp="2026-04-17 14:30:05 +0000 UTC" firstStartedPulling="2026-04-17 14:30:06.368496712 +0000 UTC m=+561.504504254" lastFinishedPulling="2026-04-17 14:30:08.601049585 +0000 UTC m=+563.737057111" observedRunningTime="2026-04-17 14:30:09.021974729 +0000 UTC m=+564.157982276" watchObservedRunningTime="2026-04-17 14:30:09.023651413 +0000 UTC m=+564.159659002" Apr 17 14:30:09.225371 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:09.225349 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-pfc54" Apr 17 14:30:09.335322 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:09.335295 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qpxx\" (UniqueName: \"kubernetes.io/projected/389cf8ed-834a-450c-a2c1-bda542e4cbe1-kube-api-access-8qpxx\") pod \"389cf8ed-834a-450c-a2c1-bda542e4cbe1\" (UID: \"389cf8ed-834a-450c-a2c1-bda542e4cbe1\") " Apr 17 14:30:09.337332 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:09.337303 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/389cf8ed-834a-450c-a2c1-bda542e4cbe1-kube-api-access-8qpxx" (OuterVolumeSpecName: "kube-api-access-8qpxx") pod "389cf8ed-834a-450c-a2c1-bda542e4cbe1" (UID: "389cf8ed-834a-450c-a2c1-bda542e4cbe1"). InnerVolumeSpecName "kube-api-access-8qpxx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:30:09.436247 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:09.436213 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8qpxx\" (UniqueName: \"kubernetes.io/projected/389cf8ed-834a-450c-a2c1-bda542e4cbe1-kube-api-access-8qpxx\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:30:09.999199 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:09.999120 2577 generic.go:358] "Generic (PLEG): container finished" podID="389cf8ed-834a-450c-a2c1-bda542e4cbe1" containerID="2670357a2012f7975fc800403c8fd4b8d44ddba5d15315429507ba45c1301fad" exitCode=0 Apr 17 14:30:09.999199 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:09.999174 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-pfc54" Apr 17 14:30:09.999638 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:09.999201 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-pfc54" event={"ID":"389cf8ed-834a-450c-a2c1-bda542e4cbe1","Type":"ContainerDied","Data":"2670357a2012f7975fc800403c8fd4b8d44ddba5d15315429507ba45c1301fad"} Apr 17 14:30:09.999638 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:09.999236 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-pfc54" event={"ID":"389cf8ed-834a-450c-a2c1-bda542e4cbe1","Type":"ContainerDied","Data":"f1f8f9962fe1846bd4bb38103f85688f1afc0de4ed437d3a413f1eadf7a5c8fa"} Apr 17 14:30:09.999638 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:09.999253 2577 scope.go:117] "RemoveContainer" containerID="2670357a2012f7975fc800403c8fd4b8d44ddba5d15315429507ba45c1301fad" Apr 17 14:30:10.007553 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:10.007538 2577 scope.go:117] "RemoveContainer" containerID="2670357a2012f7975fc800403c8fd4b8d44ddba5d15315429507ba45c1301fad" Apr 17 14:30:10.007791 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:30:10.007772 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2670357a2012f7975fc800403c8fd4b8d44ddba5d15315429507ba45c1301fad\": container with ID starting with 2670357a2012f7975fc800403c8fd4b8d44ddba5d15315429507ba45c1301fad not found: ID does not exist" containerID="2670357a2012f7975fc800403c8fd4b8d44ddba5d15315429507ba45c1301fad" Apr 17 14:30:10.007841 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:10.007799 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2670357a2012f7975fc800403c8fd4b8d44ddba5d15315429507ba45c1301fad"} err="failed to get container status \"2670357a2012f7975fc800403c8fd4b8d44ddba5d15315429507ba45c1301fad\": rpc error: code = NotFound desc = could not find container \"2670357a2012f7975fc800403c8fd4b8d44ddba5d15315429507ba45c1301fad\": container with ID starting with 2670357a2012f7975fc800403c8fd4b8d44ddba5d15315429507ba45c1301fad not found: ID does not exist" Apr 17 14:30:10.015345 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:10.015322 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-pfc54"] Apr 17 14:30:10.019266 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:10.019246 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-pfc54"] Apr 17 14:30:11.372131 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:11.372100 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="389cf8ed-834a-450c-a2c1-bda542e4cbe1" path="/var/lib/kubelet/pods/389cf8ed-834a-450c-a2c1-bda542e4cbe1/volumes" Apr 17 14:30:16.815521 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:16.815483 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-rcxsj" Apr 17 14:30:16.815946 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:16.815628 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-rcxsj" Apr 17 14:30:16.836479 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:16.836458 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-rcxsj" Apr 17 
14:30:17.041285 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:17.041242 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-rcxsj" Apr 17 14:30:21.333079 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.333045 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn"] Apr 17 14:30:21.333433 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.333334 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="389cf8ed-834a-450c-a2c1-bda542e4cbe1" containerName="registry-server" Apr 17 14:30:21.333433 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.333345 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="389cf8ed-834a-450c-a2c1-bda542e4cbe1" containerName="registry-server" Apr 17 14:30:21.333433 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.333386 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="389cf8ed-834a-450c-a2c1-bda542e4cbe1" containerName="registry-server" Apr 17 14:30:21.336203 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.336185 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn" Apr 17 14:30:21.338844 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.338825 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-ngzg4\"" Apr 17 14:30:21.343414 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.343394 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn"] Apr 17 14:30:21.419786 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.419757 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de2a72e9-9dc9-42da-92da-6ec4e6d31130-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn\" (UID: \"de2a72e9-9dc9-42da-92da-6ec4e6d31130\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn" Apr 17 14:30:21.419890 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.419788 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2m5m\" (UniqueName: \"kubernetes.io/projected/de2a72e9-9dc9-42da-92da-6ec4e6d31130-kube-api-access-v2m5m\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn\" (UID: \"de2a72e9-9dc9-42da-92da-6ec4e6d31130\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn" Apr 17 14:30:21.419952 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.419935 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/de2a72e9-9dc9-42da-92da-6ec4e6d31130-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn\" (UID: \"de2a72e9-9dc9-42da-92da-6ec4e6d31130\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn" Apr 17 14:30:21.521301 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.521261 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de2a72e9-9dc9-42da-92da-6ec4e6d31130-util\") pod 
\"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn\" (UID: \"de2a72e9-9dc9-42da-92da-6ec4e6d31130\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn" Apr 17 14:30:21.521412 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.521324 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2m5m\" (UniqueName: \"kubernetes.io/projected/de2a72e9-9dc9-42da-92da-6ec4e6d31130-kube-api-access-v2m5m\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn\" (UID: \"de2a72e9-9dc9-42da-92da-6ec4e6d31130\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn" Apr 17 14:30:21.521412 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.521358 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/de2a72e9-9dc9-42da-92da-6ec4e6d31130-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn\" (UID: \"de2a72e9-9dc9-42da-92da-6ec4e6d31130\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn" Apr 17 14:30:21.521655 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.521634 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de2a72e9-9dc9-42da-92da-6ec4e6d31130-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn\" (UID: \"de2a72e9-9dc9-42da-92da-6ec4e6d31130\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn" Apr 17 14:30:21.521696 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.521638 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/de2a72e9-9dc9-42da-92da-6ec4e6d31130-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn\" (UID: \"de2a72e9-9dc9-42da-92da-6ec4e6d31130\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn" Apr 17 14:30:21.529299 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.529227 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2m5m\" (UniqueName: \"kubernetes.io/projected/de2a72e9-9dc9-42da-92da-6ec4e6d31130-kube-api-access-v2m5m\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn\" (UID: \"de2a72e9-9dc9-42da-92da-6ec4e6d31130\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn" Apr 17 14:30:21.646059 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.646034 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn" Apr 17 14:30:21.737970 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.737941 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr"] Apr 17 14:30:21.741406 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.741384 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr" Apr 17 14:30:21.749589 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.749557 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr"] Apr 17 14:30:21.763492 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.763468 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn"] Apr 17 14:30:21.766226 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:30:21.766204 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde2a72e9_9dc9_42da_92da_6ec4e6d31130.slice/crio-db362e603750133edeaf43e02e876ddf0bcade4890f580871fe40c3a4c40f465 WatchSource:0}: Error finding container db362e603750133edeaf43e02e876ddf0bcade4890f580871fe40c3a4c40f465: Status 404 returned error can't find the container with id db362e603750133edeaf43e02e876ddf0bcade4890f580871fe40c3a4c40f465 Apr 17 14:30:21.823681 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.823659 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1eb3f50d-9f0b-481f-aaf2-d8f3758f9255-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr\" (UID: \"1eb3f50d-9f0b-481f-aaf2-d8f3758f9255\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr" Apr 17 14:30:21.823784 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.823690 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq4qk\" (UniqueName: \"kubernetes.io/projected/1eb3f50d-9f0b-481f-aaf2-d8f3758f9255-kube-api-access-lq4qk\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr\" (UID: \"1eb3f50d-9f0b-481f-aaf2-d8f3758f9255\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr" Apr 17 14:30:21.823823 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.823802 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1eb3f50d-9f0b-481f-aaf2-d8f3758f9255-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr\" (UID: \"1eb3f50d-9f0b-481f-aaf2-d8f3758f9255\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr" Apr 17 14:30:21.925045 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.925018 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1eb3f50d-9f0b-481f-aaf2-d8f3758f9255-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr\" (UID: \"1eb3f50d-9f0b-481f-aaf2-d8f3758f9255\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr" Apr 17 14:30:21.925161 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.925055 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lq4qk\" (UniqueName: \"kubernetes.io/projected/1eb3f50d-9f0b-481f-aaf2-d8f3758f9255-kube-api-access-lq4qk\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr\" (UID: \"1eb3f50d-9f0b-481f-aaf2-d8f3758f9255\") " 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr" Apr 17 14:30:21.925161 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.925140 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1eb3f50d-9f0b-481f-aaf2-d8f3758f9255-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr\" (UID: \"1eb3f50d-9f0b-481f-aaf2-d8f3758f9255\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr" Apr 17 14:30:21.925371 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.925350 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1eb3f50d-9f0b-481f-aaf2-d8f3758f9255-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr\" (UID: \"1eb3f50d-9f0b-481f-aaf2-d8f3758f9255\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr" Apr 17 14:30:21.925480 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.925461 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1eb3f50d-9f0b-481f-aaf2-d8f3758f9255-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr\" (UID: \"1eb3f50d-9f0b-481f-aaf2-d8f3758f9255\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr" Apr 17 14:30:21.932782 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:21.932760 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq4qk\" (UniqueName: \"kubernetes.io/projected/1eb3f50d-9f0b-481f-aaf2-d8f3758f9255-kube-api-access-lq4qk\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr\" (UID: \"1eb3f50d-9f0b-481f-aaf2-d8f3758f9255\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr" Apr 17 14:30:22.040714 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:22.040691 2577 generic.go:358] "Generic (PLEG): container finished" podID="de2a72e9-9dc9-42da-92da-6ec4e6d31130" containerID="9a223832f00112fa6d1fa14e7f9d75df5fe64632b790940459b027f2a78fcd55" exitCode=0 Apr 17 14:30:22.040860 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:22.040780 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn" event={"ID":"de2a72e9-9dc9-42da-92da-6ec4e6d31130","Type":"ContainerDied","Data":"9a223832f00112fa6d1fa14e7f9d75df5fe64632b790940459b027f2a78fcd55"} Apr 17 14:30:22.040860 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:22.040810 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn" event={"ID":"de2a72e9-9dc9-42da-92da-6ec4e6d31130","Type":"ContainerStarted","Data":"db362e603750133edeaf43e02e876ddf0bcade4890f580871fe40c3a4c40f465"} Apr 17 14:30:22.051669 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:22.051649 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr" Apr 17 14:30:22.162061 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:22.161986 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr"] Apr 17 14:30:22.164053 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:30:22.164027 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1eb3f50d_9f0b_481f_aaf2_d8f3758f9255.slice/crio-027da84bcc52ee10ea00845ec00a118b425a4216e3cc201af7bb2f0c6423f99a WatchSource:0}: Error finding container 027da84bcc52ee10ea00845ec00a118b425a4216e3cc201af7bb2f0c6423f99a: Status 404 returned error can't find the container with id 027da84bcc52ee10ea00845ec00a118b425a4216e3cc201af7bb2f0c6423f99a Apr 17 14:30:22.331736 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:22.331706 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2"] Apr 17 14:30:22.333696 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:22.333679 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2" Apr 17 14:30:22.342444 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:22.342422 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2"] Apr 17 14:30:22.428752 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:22.428718 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lggcs\" (UniqueName: \"kubernetes.io/projected/b5c19c02-bef5-4ade-b289-4baa372d80b6-kube-api-access-lggcs\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2\" (UID: \"b5c19c02-bef5-4ade-b289-4baa372d80b6\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2" Apr 17 14:30:22.428885 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:22.428757 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5c19c02-bef5-4ade-b289-4baa372d80b6-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2\" (UID: \"b5c19c02-bef5-4ade-b289-4baa372d80b6\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2" Apr 17 14:30:22.428885 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:22.428775 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5c19c02-bef5-4ade-b289-4baa372d80b6-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2\" (UID: \"b5c19c02-bef5-4ade-b289-4baa372d80b6\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2" Apr 17 14:30:22.529852 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:22.529829 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lggcs\" (UniqueName: \"kubernetes.io/projected/b5c19c02-bef5-4ade-b289-4baa372d80b6-kube-api-access-lggcs\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2\" (UID: \"b5c19c02-bef5-4ade-b289-4baa372d80b6\") " 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2" Apr 17 14:30:22.529974 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:22.529860 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5c19c02-bef5-4ade-b289-4baa372d80b6-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2\" (UID: \"b5c19c02-bef5-4ade-b289-4baa372d80b6\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2" Apr 17 14:30:22.529974 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:22.529878 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5c19c02-bef5-4ade-b289-4baa372d80b6-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2\" (UID: \"b5c19c02-bef5-4ade-b289-4baa372d80b6\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2" Apr 17 14:30:22.530199 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:22.530183 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5c19c02-bef5-4ade-b289-4baa372d80b6-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2\" (UID: \"b5c19c02-bef5-4ade-b289-4baa372d80b6\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2" Apr 17 14:30:22.530306 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:22.530228 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5c19c02-bef5-4ade-b289-4baa372d80b6-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2\" (UID: \"b5c19c02-bef5-4ade-b289-4baa372d80b6\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2" Apr 17 14:30:22.538031 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:22.538011 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lggcs\" (UniqueName: \"kubernetes.io/projected/b5c19c02-bef5-4ade-b289-4baa372d80b6-kube-api-access-lggcs\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2\" (UID: \"b5c19c02-bef5-4ade-b289-4baa372d80b6\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2" Apr 17 14:30:22.643892 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:22.643825 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2" Apr 17 14:30:22.762643 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:22.762618 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2"] Apr 17 14:30:22.806944 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:30:22.806908 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5c19c02_bef5_4ade_b289_4baa372d80b6.slice/crio-08d03a3e6fdf286507832703224072c179a5c603eb982db3439b100f5fd7a7cb WatchSource:0}: Error finding container 08d03a3e6fdf286507832703224072c179a5c603eb982db3439b100f5fd7a7cb: Status 404 returned error can't find the container with id 08d03a3e6fdf286507832703224072c179a5c603eb982db3439b100f5fd7a7cb Apr 17 14:30:22.940102 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:22.940075 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t"] Apr 17 14:30:22.942178 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:22.942159 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t" Apr 17 14:30:22.950137 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:22.950117 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t"] Apr 17 14:30:23.033468 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:23.033446 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t\" (UID: \"8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t" Apr 17 14:30:23.033565 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:23.033479 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zhpd\" (UniqueName: \"kubernetes.io/projected/8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9-kube-api-access-2zhpd\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t\" (UID: \"8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t" Apr 17 14:30:23.033630 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:23.033609 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t\" (UID: \"8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t" Apr 17 14:30:23.046053 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:23.046029 2577 generic.go:358] "Generic (PLEG): container finished" podID="de2a72e9-9dc9-42da-92da-6ec4e6d31130" containerID="03931740b7f714989c0bffab243fee589192b02b5128ce9c0dad4ba82c5cc000" exitCode=0 Apr 17 14:30:23.046141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:23.046096 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn" event={"ID":"de2a72e9-9dc9-42da-92da-6ec4e6d31130","Type":"ContainerDied","Data":"03931740b7f714989c0bffab243fee589192b02b5128ce9c0dad4ba82c5cc000"} Apr 17 14:30:23.047323 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:23.047301 2577 generic.go:358] "Generic (PLEG): container finished" podID="b5c19c02-bef5-4ade-b289-4baa372d80b6" containerID="231e87a0352e024609c2d9e6554b45e1d1b1aafe070a06e97befe66cabac9577" exitCode=0 Apr 17 14:30:23.047407 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:23.047373 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2" event={"ID":"b5c19c02-bef5-4ade-b289-4baa372d80b6","Type":"ContainerDied","Data":"231e87a0352e024609c2d9e6554b45e1d1b1aafe070a06e97befe66cabac9577"} Apr 17 14:30:23.047407 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:23.047398 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2" event={"ID":"b5c19c02-bef5-4ade-b289-4baa372d80b6","Type":"ContainerStarted","Data":"08d03a3e6fdf286507832703224072c179a5c603eb982db3439b100f5fd7a7cb"} Apr 17 14:30:23.048691 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:23.048674 2577 generic.go:358] "Generic (PLEG): container finished" podID="1eb3f50d-9f0b-481f-aaf2-d8f3758f9255" containerID="8fd63017cd7101e239e41ca3f2fe2812b7d8ffaf5d7428348898b3a1e2d3a631" exitCode=0 Apr 17 14:30:23.048776 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:23.048740 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr" event={"ID":"1eb3f50d-9f0b-481f-aaf2-d8f3758f9255","Type":"ContainerDied","Data":"8fd63017cd7101e239e41ca3f2fe2812b7d8ffaf5d7428348898b3a1e2d3a631"} Apr 17 14:30:23.048776 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:23.048755 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr" event={"ID":"1eb3f50d-9f0b-481f-aaf2-d8f3758f9255","Type":"ContainerStarted","Data":"027da84bcc52ee10ea00845ec00a118b425a4216e3cc201af7bb2f0c6423f99a"} Apr 17 14:30:23.134552 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:23.134509 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t\" (UID: \"8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t" Apr 17 14:30:23.134747 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:23.134580 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t\" (UID: \"8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t" Apr 17 14:30:23.134747 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:23.134632 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zhpd\" (UniqueName: \"kubernetes.io/projected/8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9-kube-api-access-2zhpd\") pod 
\"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t\" (UID: \"8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t" Apr 17 14:30:23.135073 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:23.135051 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t\" (UID: \"8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t" Apr 17 14:30:23.135301 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:23.135260 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t\" (UID: \"8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t" Apr 17 14:30:23.143760 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:23.143737 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zhpd\" (UniqueName: \"kubernetes.io/projected/8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9-kube-api-access-2zhpd\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t\" (UID: \"8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t" Apr 17 14:30:23.297067 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:23.297044 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t" Apr 17 14:30:23.413183 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:23.413157 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t"] Apr 17 14:30:23.414657 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:30:23.414628 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8301aa03_f89c_4ea3_9cac_3e6b4e7a3ba9.slice/crio-5572fee8bd68e33fdb543a12ce96c4e34b22e7af3b84c27fd3e929e7d9f94c19 WatchSource:0}: Error finding container 5572fee8bd68e33fdb543a12ce96c4e34b22e7af3b84c27fd3e929e7d9f94c19: Status 404 returned error can't find the container with id 5572fee8bd68e33fdb543a12ce96c4e34b22e7af3b84c27fd3e929e7d9f94c19 Apr 17 14:30:24.055054 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:24.054967 2577 generic.go:358] "Generic (PLEG): container finished" podID="de2a72e9-9dc9-42da-92da-6ec4e6d31130" containerID="2d06bea33fa12b64064bd0c3fb91eb36c6adc6f52131b192ab99c301fc9744d3" exitCode=0 Apr 17 14:30:24.055054 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:24.055034 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn" event={"ID":"de2a72e9-9dc9-42da-92da-6ec4e6d31130","Type":"ContainerDied","Data":"2d06bea33fa12b64064bd0c3fb91eb36c6adc6f52131b192ab99c301fc9744d3"} Apr 17 14:30:24.056451 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:24.056424 2577 generic.go:358] "Generic (PLEG): container finished" podID="8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9" containerID="1e9eedeb14f5421a25e02c72477a685260be02e9c75d35c7b46ad4bbc13360c7" 
Apr 17 14:30:24.056451 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:24.056424 2577 generic.go:358] "Generic (PLEG): container finished" podID="8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9" containerID="1e9eedeb14f5421a25e02c72477a685260be02e9c75d35c7b46ad4bbc13360c7" exitCode=0
Apr 17 14:30:24.056587 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:24.056504 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t" event={"ID":"8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9","Type":"ContainerDied","Data":"1e9eedeb14f5421a25e02c72477a685260be02e9c75d35c7b46ad4bbc13360c7"}
Apr 17 14:30:24.056587 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:24.056528 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t" event={"ID":"8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9","Type":"ContainerStarted","Data":"5572fee8bd68e33fdb543a12ce96c4e34b22e7af3b84c27fd3e929e7d9f94c19"}
Apr 17 14:30:24.061944 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:24.061921 2577 generic.go:358] "Generic (PLEG): container finished" podID="b5c19c02-bef5-4ade-b289-4baa372d80b6" containerID="57f5bb62faa5864211b6829abd07b15cc988d20396876f985409c4e3be4485bb" exitCode=0
Apr 17 14:30:24.062065 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:24.061986 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2" event={"ID":"b5c19c02-bef5-4ade-b289-4baa372d80b6","Type":"ContainerDied","Data":"57f5bb62faa5864211b6829abd07b15cc988d20396876f985409c4e3be4485bb"}
Apr 17 14:30:25.066346 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:25.066309 2577 generic.go:358] "Generic (PLEG): container finished" podID="8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9" containerID="39fcbe827e825008c9f6f54f2c138d3ed110437b96bbb216d027706569ed57ab" exitCode=0
Apr 17 14:30:25.066766 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:25.066394 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t" event={"ID":"8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9","Type":"ContainerDied","Data":"39fcbe827e825008c9f6f54f2c138d3ed110437b96bbb216d027706569ed57ab"}
Apr 17 14:30:25.068246 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:25.068227 2577 generic.go:358] "Generic (PLEG): container finished" podID="b5c19c02-bef5-4ade-b289-4baa372d80b6" containerID="d8cd3e0aae7522091cc5353bd5d0e8a820dc0f6f7675c70ba1bbb6f03eb820f1" exitCode=0
Apr 17 14:30:25.068347 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:25.068307 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2" event={"ID":"b5c19c02-bef5-4ade-b289-4baa372d80b6","Type":"ContainerDied","Data":"d8cd3e0aae7522091cc5353bd5d0e8a820dc0f6f7675c70ba1bbb6f03eb820f1"}
Apr 17 14:30:25.069843 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:25.069824 2577 generic.go:358] "Generic (PLEG): container finished" podID="1eb3f50d-9f0b-481f-aaf2-d8f3758f9255" containerID="74623acde238d5f625abfb0694e46184f05c39098523d8382fafd837269bbe72" exitCode=0
Apr 17 14:30:25.069944 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:25.069897 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr" event={"ID":"1eb3f50d-9f0b-481f-aaf2-d8f3758f9255","Type":"ContainerDied","Data":"74623acde238d5f625abfb0694e46184f05c39098523d8382fafd837269bbe72"}
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn" Apr 17 14:30:25.255251 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:25.255226 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/de2a72e9-9dc9-42da-92da-6ec4e6d31130-bundle\") pod \"de2a72e9-9dc9-42da-92da-6ec4e6d31130\" (UID: \"de2a72e9-9dc9-42da-92da-6ec4e6d31130\") " Apr 17 14:30:25.255376 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:25.255320 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2m5m\" (UniqueName: \"kubernetes.io/projected/de2a72e9-9dc9-42da-92da-6ec4e6d31130-kube-api-access-v2m5m\") pod \"de2a72e9-9dc9-42da-92da-6ec4e6d31130\" (UID: \"de2a72e9-9dc9-42da-92da-6ec4e6d31130\") " Apr 17 14:30:25.255376 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:25.255366 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de2a72e9-9dc9-42da-92da-6ec4e6d31130-util\") pod \"de2a72e9-9dc9-42da-92da-6ec4e6d31130\" (UID: \"de2a72e9-9dc9-42da-92da-6ec4e6d31130\") " Apr 17 14:30:25.255863 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:25.255828 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de2a72e9-9dc9-42da-92da-6ec4e6d31130-bundle" (OuterVolumeSpecName: "bundle") pod "de2a72e9-9dc9-42da-92da-6ec4e6d31130" (UID: "de2a72e9-9dc9-42da-92da-6ec4e6d31130"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:30:25.257217 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:25.257193 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de2a72e9-9dc9-42da-92da-6ec4e6d31130-kube-api-access-v2m5m" (OuterVolumeSpecName: "kube-api-access-v2m5m") pod "de2a72e9-9dc9-42da-92da-6ec4e6d31130" (UID: "de2a72e9-9dc9-42da-92da-6ec4e6d31130"). InnerVolumeSpecName "kube-api-access-v2m5m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:30:25.260755 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:25.260735 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de2a72e9-9dc9-42da-92da-6ec4e6d31130-util" (OuterVolumeSpecName: "util") pod "de2a72e9-9dc9-42da-92da-6ec4e6d31130" (UID: "de2a72e9-9dc9-42da-92da-6ec4e6d31130"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:30:25.356263 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:25.356228 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de2a72e9-9dc9-42da-92da-6ec4e6d31130-util\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:30:25.356263 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:25.356263 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/de2a72e9-9dc9-42da-92da-6ec4e6d31130-bundle\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:30:25.356415 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:25.356301 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v2m5m\" (UniqueName: \"kubernetes.io/projected/de2a72e9-9dc9-42da-92da-6ec4e6d31130-kube-api-access-v2m5m\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:30:26.075342 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:26.075302 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn" event={"ID":"de2a72e9-9dc9-42da-92da-6ec4e6d31130","Type":"ContainerDied","Data":"db362e603750133edeaf43e02e876ddf0bcade4890f580871fe40c3a4c40f465"} Apr 17 14:30:26.075342 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:26.075339 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db362e603750133edeaf43e02e876ddf0bcade4890f580871fe40c3a4c40f465" Apr 17 14:30:26.075840 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:26.075357 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn" Apr 17 14:30:26.077180 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:26.077155 2577 generic.go:358] "Generic (PLEG): container finished" podID="8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9" containerID="73a2b2037fe72a479f0b274a6ec0e94dfbe18bae12603272e3ef9f7ec37ee423" exitCode=0 Apr 17 14:30:26.077338 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:26.077224 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t" event={"ID":"8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9","Type":"ContainerDied","Data":"73a2b2037fe72a479f0b274a6ec0e94dfbe18bae12603272e3ef9f7ec37ee423"} Apr 17 14:30:26.079065 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:26.079047 2577 generic.go:358] "Generic (PLEG): container finished" podID="1eb3f50d-9f0b-481f-aaf2-d8f3758f9255" containerID="dff81b3647fab0fa92e7e5b63ea5e7441a4f06d82fb38ce72dc1c66c20ef0ff3" exitCode=0 Apr 17 14:30:26.079151 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:26.079122 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr" event={"ID":"1eb3f50d-9f0b-481f-aaf2-d8f3758f9255","Type":"ContainerDied","Data":"dff81b3647fab0fa92e7e5b63ea5e7441a4f06d82fb38ce72dc1c66c20ef0ff3"} Apr 17 14:30:26.199903 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:26.199881 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2" Apr 17 14:30:26.278886 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:26.278843 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5c19c02-bef5-4ade-b289-4baa372d80b6-bundle\") pod \"b5c19c02-bef5-4ade-b289-4baa372d80b6\" (UID: \"b5c19c02-bef5-4ade-b289-4baa372d80b6\") " Apr 17 14:30:26.279082 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:26.278916 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5c19c02-bef5-4ade-b289-4baa372d80b6-util\") pod \"b5c19c02-bef5-4ade-b289-4baa372d80b6\" (UID: \"b5c19c02-bef5-4ade-b289-4baa372d80b6\") " Apr 17 14:30:26.279082 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:26.278941 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lggcs\" (UniqueName: \"kubernetes.io/projected/b5c19c02-bef5-4ade-b289-4baa372d80b6-kube-api-access-lggcs\") pod \"b5c19c02-bef5-4ade-b289-4baa372d80b6\" (UID: \"b5c19c02-bef5-4ade-b289-4baa372d80b6\") " Apr 17 14:30:26.279549 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:26.279519 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5c19c02-bef5-4ade-b289-4baa372d80b6-bundle" (OuterVolumeSpecName: "bundle") pod "b5c19c02-bef5-4ade-b289-4baa372d80b6" (UID: "b5c19c02-bef5-4ade-b289-4baa372d80b6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:30:26.280933 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:26.280912 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5c19c02-bef5-4ade-b289-4baa372d80b6-kube-api-access-lggcs" (OuterVolumeSpecName: "kube-api-access-lggcs") pod "b5c19c02-bef5-4ade-b289-4baa372d80b6" (UID: "b5c19c02-bef5-4ade-b289-4baa372d80b6"). InnerVolumeSpecName "kube-api-access-lggcs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:30:26.284024 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:26.283987 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5c19c02-bef5-4ade-b289-4baa372d80b6-util" (OuterVolumeSpecName: "util") pod "b5c19c02-bef5-4ade-b289-4baa372d80b6" (UID: "b5c19c02-bef5-4ade-b289-4baa372d80b6"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:30:26.380203 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:26.380145 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5c19c02-bef5-4ade-b289-4baa372d80b6-util\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:30:26.380203 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:26.380167 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lggcs\" (UniqueName: \"kubernetes.io/projected/b5c19c02-bef5-4ade-b289-4baa372d80b6-kube-api-access-lggcs\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:30:26.380203 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:26.380178 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5c19c02-bef5-4ade-b289-4baa372d80b6-bundle\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:30:27.084317 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:27.084269 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2" Apr 17 14:30:27.084317 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:27.084299 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2" event={"ID":"b5c19c02-bef5-4ade-b289-4baa372d80b6","Type":"ContainerDied","Data":"08d03a3e6fdf286507832703224072c179a5c603eb982db3439b100f5fd7a7cb"} Apr 17 14:30:27.084715 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:27.084333 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08d03a3e6fdf286507832703224072c179a5c603eb982db3439b100f5fd7a7cb" Apr 17 14:30:27.205552 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:27.205531 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t" Apr 17 14:30:27.232630 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:27.232608 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr" Apr 17 14:30:27.285864 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:27.285834 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9-util\") pod \"8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9\" (UID: \"8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9\") " Apr 17 14:30:27.286005 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:27.285871 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9-bundle\") pod \"8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9\" (UID: \"8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9\") " Apr 17 14:30:27.286005 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:27.285931 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zhpd\" (UniqueName: \"kubernetes.io/projected/8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9-kube-api-access-2zhpd\") pod \"8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9\" (UID: \"8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9\") " Apr 17 14:30:27.286391 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:27.286365 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9-bundle" (OuterVolumeSpecName: "bundle") pod "8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9" (UID: "8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:30:27.287895 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:27.287874 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9-kube-api-access-2zhpd" (OuterVolumeSpecName: "kube-api-access-2zhpd") pod "8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9" (UID: "8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9"). InnerVolumeSpecName "kube-api-access-2zhpd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:30:27.290933 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:27.290899 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9-util" (OuterVolumeSpecName: "util") pod "8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9" (UID: "8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:30:27.386244 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:27.386225 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq4qk\" (UniqueName: \"kubernetes.io/projected/1eb3f50d-9f0b-481f-aaf2-d8f3758f9255-kube-api-access-lq4qk\") pod \"1eb3f50d-9f0b-481f-aaf2-d8f3758f9255\" (UID: \"1eb3f50d-9f0b-481f-aaf2-d8f3758f9255\") " Apr 17 14:30:27.386359 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:27.386286 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1eb3f50d-9f0b-481f-aaf2-d8f3758f9255-util\") pod \"1eb3f50d-9f0b-481f-aaf2-d8f3758f9255\" (UID: \"1eb3f50d-9f0b-481f-aaf2-d8f3758f9255\") " Apr 17 14:30:27.386359 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:27.386317 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1eb3f50d-9f0b-481f-aaf2-d8f3758f9255-bundle\") pod \"1eb3f50d-9f0b-481f-aaf2-d8f3758f9255\" (UID: \"1eb3f50d-9f0b-481f-aaf2-d8f3758f9255\") " Apr 17 14:30:27.386469 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:27.386445 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2zhpd\" (UniqueName: \"kubernetes.io/projected/8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9-kube-api-access-2zhpd\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:30:27.386469 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:27.386460 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9-util\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:30:27.386563 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:27.386474 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9-bundle\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:30:27.386756 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:27.386736 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1eb3f50d-9f0b-481f-aaf2-d8f3758f9255-bundle" (OuterVolumeSpecName: "bundle") pod "1eb3f50d-9f0b-481f-aaf2-d8f3758f9255" (UID: "1eb3f50d-9f0b-481f-aaf2-d8f3758f9255"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:30:27.388193 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:27.388171 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eb3f50d-9f0b-481f-aaf2-d8f3758f9255-kube-api-access-lq4qk" (OuterVolumeSpecName: "kube-api-access-lq4qk") pod "1eb3f50d-9f0b-481f-aaf2-d8f3758f9255" (UID: "1eb3f50d-9f0b-481f-aaf2-d8f3758f9255"). InnerVolumeSpecName "kube-api-access-lq4qk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:30:27.391336 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:27.391306 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1eb3f50d-9f0b-481f-aaf2-d8f3758f9255-util" (OuterVolumeSpecName: "util") pod "1eb3f50d-9f0b-481f-aaf2-d8f3758f9255" (UID: "1eb3f50d-9f0b-481f-aaf2-d8f3758f9255"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 14:30:27.487170 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:27.487149 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lq4qk\" (UniqueName: \"kubernetes.io/projected/1eb3f50d-9f0b-481f-aaf2-d8f3758f9255-kube-api-access-lq4qk\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:30:27.487170 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:27.487169 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1eb3f50d-9f0b-481f-aaf2-d8f3758f9255-util\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:30:27.487301 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:27.487179 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1eb3f50d-9f0b-481f-aaf2-d8f3758f9255-bundle\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:30:28.089519 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:28.089482 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t" event={"ID":"8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9","Type":"ContainerDied","Data":"5572fee8bd68e33fdb543a12ce96c4e34b22e7af3b84c27fd3e929e7d9f94c19"} Apr 17 14:30:28.089519 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:28.089526 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5572fee8bd68e33fdb543a12ce96c4e34b22e7af3b84c27fd3e929e7d9f94c19" Apr 17 14:30:28.090021 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:28.089501 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t" Apr 17 14:30:28.091312 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:28.091260 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr" Apr 17 14:30:28.091426 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:28.091269 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr" event={"ID":"1eb3f50d-9f0b-481f-aaf2-d8f3758f9255","Type":"ContainerDied","Data":"027da84bcc52ee10ea00845ec00a118b425a4216e3cc201af7bb2f0c6423f99a"} Apr 17 14:30:28.091426 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:28.091344 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="027da84bcc52ee10ea00845ec00a118b425a4216e3cc201af7bb2f0c6423f99a" Apr 17 14:30:41.829348 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829315 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-srl6t"] Apr 17 14:30:41.829795 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829710 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de2a72e9-9dc9-42da-92da-6ec4e6d31130" containerName="pull" Apr 17 14:30:41.829795 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829729 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="de2a72e9-9dc9-42da-92da-6ec4e6d31130" containerName="pull" Apr 17 14:30:41.829795 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829744 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1eb3f50d-9f0b-481f-aaf2-d8f3758f9255" containerName="extract" Apr 17 14:30:41.829795 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829751 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eb3f50d-9f0b-481f-aaf2-d8f3758f9255" containerName="extract" Apr 17 14:30:41.829795 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829763 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de2a72e9-9dc9-42da-92da-6ec4e6d31130" containerName="extract" Apr 17 14:30:41.829795 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829771 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="de2a72e9-9dc9-42da-92da-6ec4e6d31130" containerName="extract" Apr 17 14:30:41.829795 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829781 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5c19c02-bef5-4ade-b289-4baa372d80b6" containerName="pull" Apr 17 14:30:41.829795 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829787 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c19c02-bef5-4ade-b289-4baa372d80b6" containerName="pull" Apr 17 14:30:41.830164 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829801 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9" containerName="util" Apr 17 14:30:41.830164 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829809 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9" containerName="util" Apr 17 14:30:41.830164 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829818 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1eb3f50d-9f0b-481f-aaf2-d8f3758f9255" containerName="pull" Apr 17 14:30:41.830164 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829825 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eb3f50d-9f0b-481f-aaf2-d8f3758f9255" containerName="pull" Apr 17 14:30:41.830164 ip-10-0-132-119 kubenswrapper[2577]: I0417 
Apr 17 14:30:41.830164 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829836 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5c19c02-bef5-4ade-b289-4baa372d80b6" containerName="util"
Apr 17 14:30:41.830164 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829844 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c19c02-bef5-4ade-b289-4baa372d80b6" containerName="util"
Apr 17 14:30:41.830164 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829855 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1eb3f50d-9f0b-481f-aaf2-d8f3758f9255" containerName="util"
Apr 17 14:30:41.830164 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829864 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eb3f50d-9f0b-481f-aaf2-d8f3758f9255" containerName="util"
Apr 17 14:30:41.830164 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829872 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5c19c02-bef5-4ade-b289-4baa372d80b6" containerName="extract"
Apr 17 14:30:41.830164 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829879 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c19c02-bef5-4ade-b289-4baa372d80b6" containerName="extract"
Apr 17 14:30:41.830164 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829895 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9" containerName="pull"
Apr 17 14:30:41.830164 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829902 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9" containerName="pull"
Apr 17 14:30:41.830164 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829912 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de2a72e9-9dc9-42da-92da-6ec4e6d31130" containerName="util"
Apr 17 14:30:41.830164 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829920 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="de2a72e9-9dc9-42da-92da-6ec4e6d31130" containerName="util"
Apr 17 14:30:41.830164 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829931 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9" containerName="extract"
Apr 17 14:30:41.830164 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.829939 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9" containerName="extract"
Apr 17 14:30:41.830164 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.830006 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5c19c02-bef5-4ade-b289-4baa372d80b6" containerName="extract"
Apr 17 14:30:41.830164 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.830017 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="de2a72e9-9dc9-42da-92da-6ec4e6d31130" containerName="extract"
Apr 17 14:30:41.830164 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.830026 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="1eb3f50d-9f0b-481f-aaf2-d8f3758f9255" containerName="extract"
Apr 17 14:30:41.830164 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.830036 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9" containerName="extract"
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-srl6t" Apr 17 14:30:41.838374 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.838348 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-frjhh\"" Apr 17 14:30:41.847658 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.847635 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-srl6t"] Apr 17 14:30:41.985049 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:41.985024 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz9nn\" (UniqueName: \"kubernetes.io/projected/de3b7e26-c63e-4f0e-a493-a47aa30ae72d-kube-api-access-xz9nn\") pod \"authorino-operator-657f44b778-srl6t\" (UID: \"de3b7e26-c63e-4f0e-a493-a47aa30ae72d\") " pod="kuadrant-system/authorino-operator-657f44b778-srl6t" Apr 17 14:30:42.086403 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:42.086344 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz9nn\" (UniqueName: \"kubernetes.io/projected/de3b7e26-c63e-4f0e-a493-a47aa30ae72d-kube-api-access-xz9nn\") pod \"authorino-operator-657f44b778-srl6t\" (UID: \"de3b7e26-c63e-4f0e-a493-a47aa30ae72d\") " pod="kuadrant-system/authorino-operator-657f44b778-srl6t" Apr 17 14:30:42.097229 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:42.097207 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz9nn\" (UniqueName: \"kubernetes.io/projected/de3b7e26-c63e-4f0e-a493-a47aa30ae72d-kube-api-access-xz9nn\") pod \"authorino-operator-657f44b778-srl6t\" (UID: \"de3b7e26-c63e-4f0e-a493-a47aa30ae72d\") " pod="kuadrant-system/authorino-operator-657f44b778-srl6t" Apr 17 14:30:42.142372 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:42.142348 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-srl6t" Apr 17 14:30:42.259802 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:42.259778 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-srl6t"] Apr 17 14:30:42.261452 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:30:42.261424 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde3b7e26_c63e_4f0e_a493_a47aa30ae72d.slice/crio-76448c6ea8492aed6edc78a40feba7148963f34d659b2034033c0dd626f95665 WatchSource:0}: Error finding container 76448c6ea8492aed6edc78a40feba7148963f34d659b2034033c0dd626f95665: Status 404 returned error can't find the container with id 76448c6ea8492aed6edc78a40feba7148963f34d659b2034033c0dd626f95665 Apr 17 14:30:43.144341 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:43.144303 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-srl6t" event={"ID":"de3b7e26-c63e-4f0e-a493-a47aa30ae72d","Type":"ContainerStarted","Data":"76448c6ea8492aed6edc78a40feba7148963f34d659b2034033c0dd626f95665"} Apr 17 14:30:45.154420 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:45.154385 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-srl6t" event={"ID":"de3b7e26-c63e-4f0e-a493-a47aa30ae72d","Type":"ContainerStarted","Data":"373b5de3aab3e427481e00edb97debaa0590dbd1a0de3a2e77d61701d9ffaee0"} Apr 17 14:30:45.154831 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:45.154487 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-srl6t" Apr 17 14:30:45.178594 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:45.178547 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-srl6t" podStartSLOduration=2.03720266 podStartE2EDuration="4.178534509s" podCreationTimestamp="2026-04-17 14:30:41 +0000 UTC" firstStartedPulling="2026-04-17 14:30:42.26340172 +0000 UTC m=+597.399409246" lastFinishedPulling="2026-04-17 14:30:44.404733566 +0000 UTC m=+599.540741095" observedRunningTime="2026-04-17 14:30:45.177662367 +0000 UTC m=+600.313669914" watchObservedRunningTime="2026-04-17 14:30:45.178534509 +0000 UTC m=+600.314542057" Apr 17 14:30:46.887112 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:46.887076 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s26qx"] Apr 17 14:30:46.891539 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:46.891523 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s26qx" Apr 17 14:30:46.894480 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:46.894461 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-9kjcm\"" Apr 17 14:30:46.900033 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:46.900006 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s26qx"] Apr 17 14:30:47.021867 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:47.021840 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kc89\" (UniqueName: \"kubernetes.io/projected/ceef9448-0d36-4164-bf90-ed239b9fa06a-kube-api-access-4kc89\") pod \"limitador-operator-controller-manager-85c4996f8c-s26qx\" (UID: \"ceef9448-0d36-4164-bf90-ed239b9fa06a\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s26qx" Apr 17 14:30:47.123059 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:47.123032 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4kc89\" (UniqueName: \"kubernetes.io/projected/ceef9448-0d36-4164-bf90-ed239b9fa06a-kube-api-access-4kc89\") pod \"limitador-operator-controller-manager-85c4996f8c-s26qx\" (UID: \"ceef9448-0d36-4164-bf90-ed239b9fa06a\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s26qx" Apr 17 14:30:47.133432 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:47.133406 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kc89\" (UniqueName: \"kubernetes.io/projected/ceef9448-0d36-4164-bf90-ed239b9fa06a-kube-api-access-4kc89\") pod \"limitador-operator-controller-manager-85c4996f8c-s26qx\" (UID: \"ceef9448-0d36-4164-bf90-ed239b9fa06a\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s26qx" Apr 17 14:30:47.201352 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:47.201294 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s26qx" Apr 17 14:30:47.320316 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:47.320160 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s26qx"] Apr 17 14:30:47.322444 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:30:47.322418 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podceef9448_0d36_4164_bf90_ed239b9fa06a.slice/crio-a4930fbffe272125a211fca74d4217c42b2c82483ca78d421e0853e0cc209bac WatchSource:0}: Error finding container a4930fbffe272125a211fca74d4217c42b2c82483ca78d421e0853e0cc209bac: Status 404 returned error can't find the container with id a4930fbffe272125a211fca74d4217c42b2c82483ca78d421e0853e0cc209bac Apr 17 14:30:48.164893 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:48.164862 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s26qx" event={"ID":"ceef9448-0d36-4164-bf90-ed239b9fa06a","Type":"ContainerStarted","Data":"a4930fbffe272125a211fca74d4217c42b2c82483ca78d421e0853e0cc209bac"} Apr 17 14:30:49.169464 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:49.169430 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s26qx" event={"ID":"ceef9448-0d36-4164-bf90-ed239b9fa06a","Type":"ContainerStarted","Data":"17d0dc844568034e0f61f0431ba05129cc661f55fd25d4f5ec6102f35ffb4a23"} Apr 17 14:30:49.169772 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:49.169569 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s26qx" Apr 17 14:30:49.188025 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:49.187987 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s26qx" podStartSLOduration=1.436945796 podStartE2EDuration="3.187974698s" podCreationTimestamp="2026-04-17 14:30:46 +0000 UTC" firstStartedPulling="2026-04-17 14:30:47.324259864 +0000 UTC m=+602.460267391" lastFinishedPulling="2026-04-17 14:30:49.075288766 +0000 UTC m=+604.211296293" observedRunningTime="2026-04-17 14:30:49.186718188 +0000 UTC m=+604.322725736" watchObservedRunningTime="2026-04-17 14:30:49.187974698 +0000 UTC m=+604.323982245" Apr 17 14:30:56.160148 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:56.160121 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-srl6t" Apr 17 14:30:58.258143 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:58.258066 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s26qx"] Apr 17 14:30:58.258553 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:58.258383 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s26qx" podUID="ceef9448-0d36-4164-bf90-ed239b9fa06a" containerName="manager" containerID="cri-o://17d0dc844568034e0f61f0431ba05129cc661f55fd25d4f5ec6102f35ffb4a23" gracePeriod=2 Apr 17 14:30:58.260163 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:58.260142 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s26qx" Apr 17 14:30:58.277355 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:58.277330 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s26qx"] Apr 17 14:30:58.478583 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:58.478562 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s26qx" Apr 17 14:30:58.481080 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:58.481052 2577 status_manager.go:895] "Failed to get status for pod" podUID="ceef9448-0d36-4164-bf90-ed239b9fa06a" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s26qx" err="pods \"limitador-operator-controller-manager-85c4996f8c-s26qx\" is forbidden: User \"system:node:ip-10-0-132-119.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-119.ec2.internal' and this object" Apr 17 14:30:58.505371 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:58.505350 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kc89\" (UniqueName: \"kubernetes.io/projected/ceef9448-0d36-4164-bf90-ed239b9fa06a-kube-api-access-4kc89\") pod \"ceef9448-0d36-4164-bf90-ed239b9fa06a\" (UID: \"ceef9448-0d36-4164-bf90-ed239b9fa06a\") " Apr 17 14:30:58.507285 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:58.507251 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceef9448-0d36-4164-bf90-ed239b9fa06a-kube-api-access-4kc89" (OuterVolumeSpecName: "kube-api-access-4kc89") pod "ceef9448-0d36-4164-bf90-ed239b9fa06a" (UID: "ceef9448-0d36-4164-bf90-ed239b9fa06a"). InnerVolumeSpecName "kube-api-access-4kc89". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:30:58.605783 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:58.605760 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4kc89\" (UniqueName: \"kubernetes.io/projected/ceef9448-0d36-4164-bf90-ed239b9fa06a-kube-api-access-4kc89\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:30:59.205131 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:59.205097 2577 generic.go:358] "Generic (PLEG): container finished" podID="ceef9448-0d36-4164-bf90-ed239b9fa06a" containerID="17d0dc844568034e0f61f0431ba05129cc661f55fd25d4f5ec6102f35ffb4a23" exitCode=0 Apr 17 14:30:59.205311 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:59.205144 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s26qx" Apr 17 14:30:59.205311 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:59.205208 2577 scope.go:117] "RemoveContainer" containerID="17d0dc844568034e0f61f0431ba05129cc661f55fd25d4f5ec6102f35ffb4a23" Apr 17 14:30:59.207911 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:59.207880 2577 status_manager.go:895] "Failed to get status for pod" podUID="ceef9448-0d36-4164-bf90-ed239b9fa06a" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s26qx" err="pods \"limitador-operator-controller-manager-85c4996f8c-s26qx\" is forbidden: User \"system:node:ip-10-0-132-119.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-119.ec2.internal' and this object" Apr 17 14:30:59.213909 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:59.213891 2577 scope.go:117] "RemoveContainer" containerID="17d0dc844568034e0f61f0431ba05129cc661f55fd25d4f5ec6102f35ffb4a23" Apr 17 14:30:59.214132 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:30:59.214113 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17d0dc844568034e0f61f0431ba05129cc661f55fd25d4f5ec6102f35ffb4a23\": container with ID starting with 17d0dc844568034e0f61f0431ba05129cc661f55fd25d4f5ec6102f35ffb4a23 not found: ID does not exist" containerID="17d0dc844568034e0f61f0431ba05129cc661f55fd25d4f5ec6102f35ffb4a23" Apr 17 14:30:59.214180 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:59.214139 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17d0dc844568034e0f61f0431ba05129cc661f55fd25d4f5ec6102f35ffb4a23"} err="failed to get container status \"17d0dc844568034e0f61f0431ba05129cc661f55fd25d4f5ec6102f35ffb4a23\": rpc error: code = NotFound desc = could not find container \"17d0dc844568034e0f61f0431ba05129cc661f55fd25d4f5ec6102f35ffb4a23\": container with ID starting with 17d0dc844568034e0f61f0431ba05129cc661f55fd25d4f5ec6102f35ffb4a23 not found: ID does not exist" Apr 17 14:30:59.216098 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:59.216073 2577 status_manager.go:895] "Failed to get status for pod" podUID="ceef9448-0d36-4164-bf90-ed239b9fa06a" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-s26qx" err="pods \"limitador-operator-controller-manager-85c4996f8c-s26qx\" is forbidden: User \"system:node:ip-10-0-132-119.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-119.ec2.internal' and this object" Apr 17 14:30:59.371866 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:30:59.371843 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceef9448-0d36-4164-bf90-ed239b9fa06a" path="/var/lib/kubelet/pods/ceef9448-0d36-4164-bf90-ed239b9fa06a/volumes" Apr 17 14:31:27.683490 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.683457 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl"] Apr 17 14:31:27.685735 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.683709 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ceef9448-0d36-4164-bf90-ed239b9fa06a" containerName="manager" Apr 17 14:31:27.685735 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.683719 2577 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="ceef9448-0d36-4164-bf90-ed239b9fa06a" containerName="manager" Apr 17 14:31:27.685735 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.683774 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="ceef9448-0d36-4164-bf90-ed239b9fa06a" containerName="manager" Apr 17 14:31:27.686609 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.686594 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.689148 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.689130 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-wnrf9\"" Apr 17 14:31:27.698798 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.698775 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl"] Apr 17 14:31:27.796541 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.796509 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.796702 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.796546 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.796702 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.796564 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.796702 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.796584 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.796702 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.796603 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.796702 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.796623 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrsq6\" (UniqueName: \"kubernetes.io/projected/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-kube-api-access-vrsq6\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.796702 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.796641 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.796702 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.796656 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.796955 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.796700 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.898148 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.898113 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.898148 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.898146 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.898408 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.898168 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.898408 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.898236 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" 
(UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.898408 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.898268 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.898408 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.898312 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.898408 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.898345 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.898408 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.898371 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.898408 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.898400 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrsq6\" (UniqueName: \"kubernetes.io/projected/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-kube-api-access-vrsq6\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.898804 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.898633 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.898804 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.898696 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.898804 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.898760 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.898950 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.898918 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.899199 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.899180 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.900401 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.900380 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.900644 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.900626 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.906074 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.906042 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.906183 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.906163 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrsq6\" (UniqueName: \"kubernetes.io/projected/9dfa6f2e-76f9-430c-a6a4-da7ac010f854-kube-api-access-vrsq6\") pod \"maas-default-gateway-openshift-default-58b6f876-qsghl\" (UID: \"9dfa6f2e-76f9-430c-a6a4-da7ac010f854\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:27.997311 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:27.997217 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:28.118442 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:28.118392 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl"] Apr 17 14:31:28.120485 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:31:28.120458 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dfa6f2e_76f9_430c_a6a4_da7ac010f854.slice/crio-af67ab3fe32e496ce959955602e06d4883e91624769909b34f8a0eeabfed3cfb WatchSource:0}: Error finding container af67ab3fe32e496ce959955602e06d4883e91624769909b34f8a0eeabfed3cfb: Status 404 returned error can't find the container with id af67ab3fe32e496ce959955602e06d4883e91624769909b34f8a0eeabfed3cfb Apr 17 14:31:28.122482 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:28.122451 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 14:31:28.122565 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:28.122512 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 14:31:28.122565 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:28.122539 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 14:31:28.306545 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:28.306456 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" event={"ID":"9dfa6f2e-76f9-430c-a6a4-da7ac010f854","Type":"ContainerStarted","Data":"c6b4f2bdc9d01e55ecb4bf04855554df51a7eaffff5fc4751c5a9d72ec635035"} Apr 17 14:31:28.306545 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:28.306494 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" event={"ID":"9dfa6f2e-76f9-430c-a6a4-da7ac010f854","Type":"ContainerStarted","Data":"af67ab3fe32e496ce959955602e06d4883e91624769909b34f8a0eeabfed3cfb"} Apr 17 14:31:28.327086 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:28.327042 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" podStartSLOduration=1.327026226 podStartE2EDuration="1.327026226s" podCreationTimestamp="2026-04-17 14:31:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:31:28.324549824 +0000 UTC m=+643.460557374" watchObservedRunningTime="2026-04-17 14:31:28.327026226 +0000 UTC m=+643.463033773" Apr 17 14:31:28.997795 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:28.997759 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:30.003369 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:30.003342 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:30.314397 ip-10-0-132-119 kubenswrapper[2577]: 
I0417 14:31:30.314312 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:30.315100 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:30.315081 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-qsghl" Apr 17 14:31:31.991838 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:31.991760 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-shw62"] Apr 17 14:31:31.994890 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:31.994875 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-shw62" Apr 17 14:31:31.997458 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:31.997439 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 17 14:31:31.997632 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:31.997605 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-ngzg4\"" Apr 17 14:31:32.002313 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:32.002295 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-shw62"] Apr 17 14:31:32.028359 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:32.028340 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6b7n\" (UniqueName: \"kubernetes.io/projected/b931b1a7-77aa-4ff0-afc4-28da274ebeff-kube-api-access-g6b7n\") pod \"limitador-limitador-7d549b5b-shw62\" (UID: \"b931b1a7-77aa-4ff0-afc4-28da274ebeff\") " pod="kuadrant-system/limitador-limitador-7d549b5b-shw62" Apr 17 14:31:32.028459 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:32.028370 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b931b1a7-77aa-4ff0-afc4-28da274ebeff-config-file\") pod \"limitador-limitador-7d549b5b-shw62\" (UID: \"b931b1a7-77aa-4ff0-afc4-28da274ebeff\") " pod="kuadrant-system/limitador-limitador-7d549b5b-shw62" Apr 17 14:31:32.091752 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:32.091722 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-shw62"] Apr 17 14:31:32.129134 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:32.129107 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6b7n\" (UniqueName: \"kubernetes.io/projected/b931b1a7-77aa-4ff0-afc4-28da274ebeff-kube-api-access-g6b7n\") pod \"limitador-limitador-7d549b5b-shw62\" (UID: \"b931b1a7-77aa-4ff0-afc4-28da274ebeff\") " pod="kuadrant-system/limitador-limitador-7d549b5b-shw62" Apr 17 14:31:32.129261 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:32.129143 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b931b1a7-77aa-4ff0-afc4-28da274ebeff-config-file\") pod \"limitador-limitador-7d549b5b-shw62\" (UID: \"b931b1a7-77aa-4ff0-afc4-28da274ebeff\") " pod="kuadrant-system/limitador-limitador-7d549b5b-shw62" Apr 17 14:31:32.129702 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:32.129685 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"config-file\" (UniqueName: \"kubernetes.io/configmap/b931b1a7-77aa-4ff0-afc4-28da274ebeff-config-file\") pod \"limitador-limitador-7d549b5b-shw62\" (UID: \"b931b1a7-77aa-4ff0-afc4-28da274ebeff\") " pod="kuadrant-system/limitador-limitador-7d549b5b-shw62" Apr 17 14:31:32.137450 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:32.137430 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6b7n\" (UniqueName: \"kubernetes.io/projected/b931b1a7-77aa-4ff0-afc4-28da274ebeff-kube-api-access-g6b7n\") pod \"limitador-limitador-7d549b5b-shw62\" (UID: \"b931b1a7-77aa-4ff0-afc4-28da274ebeff\") " pod="kuadrant-system/limitador-limitador-7d549b5b-shw62" Apr 17 14:31:32.306029 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:32.305946 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-shw62" Apr 17 14:31:32.422921 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:32.422898 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-shw62"] Apr 17 14:31:32.424766 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:31:32.424736 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb931b1a7_77aa_4ff0_afc4_28da274ebeff.slice/crio-accb048ead8b5ff11c96efd520bbb539158633d0317500a33889dd4dbd93ee19 WatchSource:0}: Error finding container accb048ead8b5ff11c96efd520bbb539158633d0317500a33889dd4dbd93ee19: Status 404 returned error can't find the container with id accb048ead8b5ff11c96efd520bbb539158633d0317500a33889dd4dbd93ee19 Apr 17 14:31:32.797436 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:32.797404 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-xfj7r"] Apr 17 14:31:32.802378 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:32.802356 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-xfj7r" Apr 17 14:31:32.805308 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:32.805260 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-zc9dl\"" Apr 17 14:31:32.808170 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:32.808148 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-xfj7r"] Apr 17 14:31:32.848375 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:32.848348 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwc5w\" (UniqueName: \"kubernetes.io/projected/a05ff934-1617-4beb-92ba-ab42117a2d7e-kube-api-access-jwc5w\") pod \"authorino-f99f4b5cd-xfj7r\" (UID: \"a05ff934-1617-4beb-92ba-ab42117a2d7e\") " pod="kuadrant-system/authorino-f99f4b5cd-xfj7r" Apr 17 14:31:32.948741 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:32.948713 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwc5w\" (UniqueName: \"kubernetes.io/projected/a05ff934-1617-4beb-92ba-ab42117a2d7e-kube-api-access-jwc5w\") pod \"authorino-f99f4b5cd-xfj7r\" (UID: \"a05ff934-1617-4beb-92ba-ab42117a2d7e\") " pod="kuadrant-system/authorino-f99f4b5cd-xfj7r" Apr 17 14:31:32.956674 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:32.956649 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwc5w\" (UniqueName: \"kubernetes.io/projected/a05ff934-1617-4beb-92ba-ab42117a2d7e-kube-api-access-jwc5w\") pod \"authorino-f99f4b5cd-xfj7r\" (UID: \"a05ff934-1617-4beb-92ba-ab42117a2d7e\") " pod="kuadrant-system/authorino-f99f4b5cd-xfj7r" Apr 17 14:31:33.112421 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:33.112391 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-xfj7r" Apr 17 14:31:33.267242 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:33.267212 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-xfj7r"] Apr 17 14:31:33.268809 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:31:33.268781 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda05ff934_1617_4beb_92ba_ab42117a2d7e.slice/crio-66d7bdc110668ddf8250b2df93475e84a92408e23b44749bc9abc9a88251e7a8 WatchSource:0}: Error finding container 66d7bdc110668ddf8250b2df93475e84a92408e23b44749bc9abc9a88251e7a8: Status 404 returned error can't find the container with id 66d7bdc110668ddf8250b2df93475e84a92408e23b44749bc9abc9a88251e7a8 Apr 17 14:31:33.327353 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:33.327312 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-xfj7r" event={"ID":"a05ff934-1617-4beb-92ba-ab42117a2d7e","Type":"ContainerStarted","Data":"66d7bdc110668ddf8250b2df93475e84a92408e23b44749bc9abc9a88251e7a8"} Apr 17 14:31:33.328735 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:33.328704 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-shw62" event={"ID":"b931b1a7-77aa-4ff0-afc4-28da274ebeff","Type":"ContainerStarted","Data":"accb048ead8b5ff11c96efd520bbb539158633d0317500a33889dd4dbd93ee19"} Apr 17 14:31:36.342956 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:36.342919 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-shw62" event={"ID":"b931b1a7-77aa-4ff0-afc4-28da274ebeff","Type":"ContainerStarted","Data":"5de8d05fbf09759bf364a2e1e1da90eff14dafaa37afd54bb277506b10dee752"} Apr 17 14:31:36.343400 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:36.342970 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-shw62" Apr 17 14:31:36.344192 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:36.344170 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-xfj7r" event={"ID":"a05ff934-1617-4beb-92ba-ab42117a2d7e","Type":"ContainerStarted","Data":"6137ca535f9db01d92a93adcfa3af723a09d5b31bd3a82075c67adf4219e21f7"} Apr 17 14:31:36.360524 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:36.360480 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-shw62" podStartSLOduration=1.644784172 podStartE2EDuration="5.360468333s" podCreationTimestamp="2026-04-17 14:31:31 +0000 UTC" firstStartedPulling="2026-04-17 14:31:32.42672619 +0000 UTC m=+647.562733716" lastFinishedPulling="2026-04-17 14:31:36.142410348 +0000 UTC m=+651.278417877" observedRunningTime="2026-04-17 14:31:36.358100818 +0000 UTC m=+651.494108426" watchObservedRunningTime="2026-04-17 14:31:36.360468333 +0000 UTC m=+651.496475881" Apr 17 14:31:36.372592 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:36.372555 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-xfj7r" podStartSLOduration=1.498445717 podStartE2EDuration="4.372530836s" podCreationTimestamp="2026-04-17 14:31:32 +0000 UTC" firstStartedPulling="2026-04-17 14:31:33.270809936 +0000 UTC m=+648.406817466" lastFinishedPulling="2026-04-17 14:31:36.144895058 +0000 UTC m=+651.280902585" 
observedRunningTime="2026-04-17 14:31:36.371765748 +0000 UTC m=+651.507773297" watchObservedRunningTime="2026-04-17 14:31:36.372530836 +0000 UTC m=+651.508538383" Apr 17 14:31:36.623437 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:36.623408 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-xfj7r"] Apr 17 14:31:38.351688 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:38.351652 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-xfj7r" podUID="a05ff934-1617-4beb-92ba-ab42117a2d7e" containerName="authorino" containerID="cri-o://6137ca535f9db01d92a93adcfa3af723a09d5b31bd3a82075c67adf4219e21f7" gracePeriod=30 Apr 17 14:31:38.583016 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:38.582993 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-xfj7r" Apr 17 14:31:38.695886 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:38.695814 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwc5w\" (UniqueName: \"kubernetes.io/projected/a05ff934-1617-4beb-92ba-ab42117a2d7e-kube-api-access-jwc5w\") pod \"a05ff934-1617-4beb-92ba-ab42117a2d7e\" (UID: \"a05ff934-1617-4beb-92ba-ab42117a2d7e\") " Apr 17 14:31:38.697616 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:38.697591 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a05ff934-1617-4beb-92ba-ab42117a2d7e-kube-api-access-jwc5w" (OuterVolumeSpecName: "kube-api-access-jwc5w") pod "a05ff934-1617-4beb-92ba-ab42117a2d7e" (UID: "a05ff934-1617-4beb-92ba-ab42117a2d7e"). InnerVolumeSpecName "kube-api-access-jwc5w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:31:38.796463 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:38.796424 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jwc5w\" (UniqueName: \"kubernetes.io/projected/a05ff934-1617-4beb-92ba-ab42117a2d7e-kube-api-access-jwc5w\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:31:39.356058 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:39.356026 2577 generic.go:358] "Generic (PLEG): container finished" podID="a05ff934-1617-4beb-92ba-ab42117a2d7e" containerID="6137ca535f9db01d92a93adcfa3af723a09d5b31bd3a82075c67adf4219e21f7" exitCode=0 Apr 17 14:31:39.356487 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:39.356076 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-xfj7r" Apr 17 14:31:39.356487 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:39.356115 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-xfj7r" event={"ID":"a05ff934-1617-4beb-92ba-ab42117a2d7e","Type":"ContainerDied","Data":"6137ca535f9db01d92a93adcfa3af723a09d5b31bd3a82075c67adf4219e21f7"} Apr 17 14:31:39.356487 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:39.356154 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-xfj7r" event={"ID":"a05ff934-1617-4beb-92ba-ab42117a2d7e","Type":"ContainerDied","Data":"66d7bdc110668ddf8250b2df93475e84a92408e23b44749bc9abc9a88251e7a8"} Apr 17 14:31:39.356487 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:39.356170 2577 scope.go:117] "RemoveContainer" containerID="6137ca535f9db01d92a93adcfa3af723a09d5b31bd3a82075c67adf4219e21f7" Apr 17 14:31:39.364179 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:39.364164 2577 scope.go:117] "RemoveContainer" containerID="6137ca535f9db01d92a93adcfa3af723a09d5b31bd3a82075c67adf4219e21f7" Apr 17 14:31:39.364418 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:31:39.364398 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6137ca535f9db01d92a93adcfa3af723a09d5b31bd3a82075c67adf4219e21f7\": container with ID starting with 6137ca535f9db01d92a93adcfa3af723a09d5b31bd3a82075c67adf4219e21f7 not found: ID does not exist" containerID="6137ca535f9db01d92a93adcfa3af723a09d5b31bd3a82075c67adf4219e21f7" Apr 17 14:31:39.364470 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:39.364425 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6137ca535f9db01d92a93adcfa3af723a09d5b31bd3a82075c67adf4219e21f7"} err="failed to get container status \"6137ca535f9db01d92a93adcfa3af723a09d5b31bd3a82075c67adf4219e21f7\": rpc error: code = NotFound desc = could not find container \"6137ca535f9db01d92a93adcfa3af723a09d5b31bd3a82075c67adf4219e21f7\": container with ID starting with 6137ca535f9db01d92a93adcfa3af723a09d5b31bd3a82075c67adf4219e21f7 not found: ID does not exist" Apr 17 14:31:39.377392 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:39.377371 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-xfj7r"] Apr 17 14:31:39.381186 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:39.381167 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-xfj7r"] Apr 17 14:31:41.372394 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:41.372363 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a05ff934-1617-4beb-92ba-ab42117a2d7e" path="/var/lib/kubelet/pods/a05ff934-1617-4beb-92ba-ab42117a2d7e/volumes" Apr 17 14:31:47.349678 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:47.349651 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-shw62" Apr 17 14:31:47.632928 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:47.632847 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-shw62"] Apr 17 14:31:47.633109 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:47.633084 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-shw62" podUID="b931b1a7-77aa-4ff0-afc4-28da274ebeff" 
containerName="limitador" containerID="cri-o://5de8d05fbf09759bf364a2e1e1da90eff14dafaa37afd54bb277506b10dee752" gracePeriod=30 Apr 17 14:31:48.162503 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:48.162481 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-shw62" Apr 17 14:31:48.264647 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:48.264582 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6b7n\" (UniqueName: \"kubernetes.io/projected/b931b1a7-77aa-4ff0-afc4-28da274ebeff-kube-api-access-g6b7n\") pod \"b931b1a7-77aa-4ff0-afc4-28da274ebeff\" (UID: \"b931b1a7-77aa-4ff0-afc4-28da274ebeff\") " Apr 17 14:31:48.264647 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:48.264621 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b931b1a7-77aa-4ff0-afc4-28da274ebeff-config-file\") pod \"b931b1a7-77aa-4ff0-afc4-28da274ebeff\" (UID: \"b931b1a7-77aa-4ff0-afc4-28da274ebeff\") " Apr 17 14:31:48.264974 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:48.264947 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b931b1a7-77aa-4ff0-afc4-28da274ebeff-config-file" (OuterVolumeSpecName: "config-file") pod "b931b1a7-77aa-4ff0-afc4-28da274ebeff" (UID: "b931b1a7-77aa-4ff0-afc4-28da274ebeff"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 14:31:48.266551 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:48.266525 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b931b1a7-77aa-4ff0-afc4-28da274ebeff-kube-api-access-g6b7n" (OuterVolumeSpecName: "kube-api-access-g6b7n") pod "b931b1a7-77aa-4ff0-afc4-28da274ebeff" (UID: "b931b1a7-77aa-4ff0-afc4-28da274ebeff"). InnerVolumeSpecName "kube-api-access-g6b7n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:31:48.365718 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:48.365689 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g6b7n\" (UniqueName: \"kubernetes.io/projected/b931b1a7-77aa-4ff0-afc4-28da274ebeff-kube-api-access-g6b7n\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:31:48.365718 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:48.365715 2577 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b931b1a7-77aa-4ff0-afc4-28da274ebeff-config-file\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:31:48.391413 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:48.391387 2577 generic.go:358] "Generic (PLEG): container finished" podID="b931b1a7-77aa-4ff0-afc4-28da274ebeff" containerID="5de8d05fbf09759bf364a2e1e1da90eff14dafaa37afd54bb277506b10dee752" exitCode=0 Apr 17 14:31:48.391552 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:48.391438 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-shw62" event={"ID":"b931b1a7-77aa-4ff0-afc4-28da274ebeff","Type":"ContainerDied","Data":"5de8d05fbf09759bf364a2e1e1da90eff14dafaa37afd54bb277506b10dee752"} Apr 17 14:31:48.391552 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:48.391445 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-shw62" Apr 17 14:31:48.391552 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:48.391457 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-shw62" event={"ID":"b931b1a7-77aa-4ff0-afc4-28da274ebeff","Type":"ContainerDied","Data":"accb048ead8b5ff11c96efd520bbb539158633d0317500a33889dd4dbd93ee19"} Apr 17 14:31:48.391552 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:48.391471 2577 scope.go:117] "RemoveContainer" containerID="5de8d05fbf09759bf364a2e1e1da90eff14dafaa37afd54bb277506b10dee752" Apr 17 14:31:48.399661 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:48.399645 2577 scope.go:117] "RemoveContainer" containerID="5de8d05fbf09759bf364a2e1e1da90eff14dafaa37afd54bb277506b10dee752" Apr 17 14:31:48.399874 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:31:48.399854 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5de8d05fbf09759bf364a2e1e1da90eff14dafaa37afd54bb277506b10dee752\": container with ID starting with 5de8d05fbf09759bf364a2e1e1da90eff14dafaa37afd54bb277506b10dee752 not found: ID does not exist" containerID="5de8d05fbf09759bf364a2e1e1da90eff14dafaa37afd54bb277506b10dee752" Apr 17 14:31:48.399916 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:48.399883 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5de8d05fbf09759bf364a2e1e1da90eff14dafaa37afd54bb277506b10dee752"} err="failed to get container status \"5de8d05fbf09759bf364a2e1e1da90eff14dafaa37afd54bb277506b10dee752\": rpc error: code = NotFound desc = could not find container \"5de8d05fbf09759bf364a2e1e1da90eff14dafaa37afd54bb277506b10dee752\": container with ID starting with 5de8d05fbf09759bf364a2e1e1da90eff14dafaa37afd54bb277506b10dee752 not found: ID does not exist" Apr 17 14:31:48.411680 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:48.411660 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-shw62"] Apr 17 14:31:48.414252 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:48.414233 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-shw62"] Apr 17 14:31:49.372433 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:49.372400 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b931b1a7-77aa-4ff0-afc4-28da274ebeff" path="/var/lib/kubelet/pods/b931b1a7-77aa-4ff0-afc4-28da274ebeff/volumes" Apr 17 14:31:53.134401 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:53.134368 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-q5sn2"] Apr 17 14:31:53.134900 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:53.134731 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b931b1a7-77aa-4ff0-afc4-28da274ebeff" containerName="limitador" Apr 17 14:31:53.134900 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:53.134747 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b931b1a7-77aa-4ff0-afc4-28da274ebeff" containerName="limitador" Apr 17 14:31:53.134900 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:53.134765 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a05ff934-1617-4beb-92ba-ab42117a2d7e" containerName="authorino" Apr 17 14:31:53.134900 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:53.134773 2577 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a05ff934-1617-4beb-92ba-ab42117a2d7e" containerName="authorino" Apr 17 14:31:53.134900 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:53.134854 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b931b1a7-77aa-4ff0-afc4-28da274ebeff" containerName="limitador" Apr 17 14:31:53.134900 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:53.134865 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a05ff934-1617-4beb-92ba-ab42117a2d7e" containerName="authorino" Apr 17 14:31:53.139563 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:53.139542 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-q5sn2" Apr 17 14:31:53.142596 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:53.142577 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-qzv5h\"" Apr 17 14:31:53.142708 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:53.142577 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 17 14:31:53.147614 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:53.147592 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-q5sn2"] Apr 17 14:31:53.303205 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:53.303177 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a6755262-ff4c-4646-bd76-ff4c39ca25ca-data\") pod \"postgres-868db5846d-q5sn2\" (UID: \"a6755262-ff4c-4646-bd76-ff4c39ca25ca\") " pod="opendatahub/postgres-868db5846d-q5sn2" Apr 17 14:31:53.303367 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:53.303215 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkh8w\" (UniqueName: \"kubernetes.io/projected/a6755262-ff4c-4646-bd76-ff4c39ca25ca-kube-api-access-fkh8w\") pod \"postgres-868db5846d-q5sn2\" (UID: \"a6755262-ff4c-4646-bd76-ff4c39ca25ca\") " pod="opendatahub/postgres-868db5846d-q5sn2" Apr 17 14:31:53.403946 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:53.403874 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkh8w\" (UniqueName: \"kubernetes.io/projected/a6755262-ff4c-4646-bd76-ff4c39ca25ca-kube-api-access-fkh8w\") pod \"postgres-868db5846d-q5sn2\" (UID: \"a6755262-ff4c-4646-bd76-ff4c39ca25ca\") " pod="opendatahub/postgres-868db5846d-q5sn2" Apr 17 14:31:53.404090 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:53.403945 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a6755262-ff4c-4646-bd76-ff4c39ca25ca-data\") pod \"postgres-868db5846d-q5sn2\" (UID: \"a6755262-ff4c-4646-bd76-ff4c39ca25ca\") " pod="opendatahub/postgres-868db5846d-q5sn2" Apr 17 14:31:53.404242 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:53.404227 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a6755262-ff4c-4646-bd76-ff4c39ca25ca-data\") pod \"postgres-868db5846d-q5sn2\" (UID: \"a6755262-ff4c-4646-bd76-ff4c39ca25ca\") " pod="opendatahub/postgres-868db5846d-q5sn2" Apr 17 14:31:53.412630 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:53.412606 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkh8w\" (UniqueName: 
\"kubernetes.io/projected/a6755262-ff4c-4646-bd76-ff4c39ca25ca-kube-api-access-fkh8w\") pod \"postgres-868db5846d-q5sn2\" (UID: \"a6755262-ff4c-4646-bd76-ff4c39ca25ca\") " pod="opendatahub/postgres-868db5846d-q5sn2" Apr 17 14:31:53.451083 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:53.451062 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-q5sn2" Apr 17 14:31:53.771327 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:53.771301 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-q5sn2"] Apr 17 14:31:53.773083 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:31:53.773057 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6755262_ff4c_4646_bd76_ff4c39ca25ca.slice/crio-2c888757436ee1284e1175520049d13c95d1b3dddf2d815922f11ce8f21eb460 WatchSource:0}: Error finding container 2c888757436ee1284e1175520049d13c95d1b3dddf2d815922f11ce8f21eb460: Status 404 returned error can't find the container with id 2c888757436ee1284e1175520049d13c95d1b3dddf2d815922f11ce8f21eb460 Apr 17 14:31:54.409784 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:54.409740 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-q5sn2" event={"ID":"a6755262-ff4c-4646-bd76-ff4c39ca25ca","Type":"ContainerStarted","Data":"2c888757436ee1284e1175520049d13c95d1b3dddf2d815922f11ce8f21eb460"} Apr 17 14:31:59.429429 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:59.429394 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-q5sn2" event={"ID":"a6755262-ff4c-4646-bd76-ff4c39ca25ca","Type":"ContainerStarted","Data":"6d39b389205f3d5eee3e6b2573fa26f04580de33ad843b7296d54a5141054d5d"} Apr 17 14:31:59.429889 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:59.429505 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-q5sn2" Apr 17 14:31:59.445691 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:31:59.445649 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-q5sn2" podStartSLOduration=1.2676454160000001 podStartE2EDuration="6.445639094s" podCreationTimestamp="2026-04-17 14:31:53 +0000 UTC" firstStartedPulling="2026-04-17 14:31:53.774664774 +0000 UTC m=+668.910672300" lastFinishedPulling="2026-04-17 14:31:58.952658434 +0000 UTC m=+674.088665978" observedRunningTime="2026-04-17 14:31:59.444761368 +0000 UTC m=+674.580768920" watchObservedRunningTime="2026-04-17 14:31:59.445639094 +0000 UTC m=+674.581646640" Apr 17 14:32:05.461946 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:05.461920 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-q5sn2" Apr 17 14:32:06.280224 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:06.280190 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-7df96b9c7b-8tzt9"] Apr 17 14:32:06.283394 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:06.283378 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7df96b9c7b-8tzt9" Apr 17 14:32:06.286145 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:06.286126 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 17 14:32:06.286256 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:06.286157 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-lq9r8\"" Apr 17 14:32:06.286413 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:06.286399 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 17 14:32:06.292731 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:06.292709 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7df96b9c7b-8tzt9"] Apr 17 14:32:06.302309 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:06.302288 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7f8967969b-59rh4"] Apr 17 14:32:06.305228 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:06.305210 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7f8967969b-59rh4" Apr 17 14:32:06.308867 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:06.308852 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-48j2d\"" Apr 17 14:32:06.312968 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:06.312949 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7f8967969b-59rh4"] Apr 17 14:32:06.406553 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:06.406528 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vk88\" (UniqueName: \"kubernetes.io/projected/7489e4cd-f69e-4e7f-8c2e-77f67f36af78-kube-api-access-2vk88\") pod \"maas-api-7df96b9c7b-8tzt9\" (UID: \"7489e4cd-f69e-4e7f-8c2e-77f67f36af78\") " pod="opendatahub/maas-api-7df96b9c7b-8tzt9" Apr 17 14:32:06.406682 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:06.406570 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7489e4cd-f69e-4e7f-8c2e-77f67f36af78-maas-api-tls\") pod \"maas-api-7df96b9c7b-8tzt9\" (UID: \"7489e4cd-f69e-4e7f-8c2e-77f67f36af78\") " pod="opendatahub/maas-api-7df96b9c7b-8tzt9" Apr 17 14:32:06.406682 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:06.406597 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd7gn\" (UniqueName: \"kubernetes.io/projected/7ab2e884-ff35-41df-a014-0878b5d523b5-kube-api-access-rd7gn\") pod \"maas-controller-7f8967969b-59rh4\" (UID: \"7ab2e884-ff35-41df-a014-0878b5d523b5\") " pod="opendatahub/maas-controller-7f8967969b-59rh4" Apr 17 14:32:06.506994 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:06.506960 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vk88\" (UniqueName: \"kubernetes.io/projected/7489e4cd-f69e-4e7f-8c2e-77f67f36af78-kube-api-access-2vk88\") pod \"maas-api-7df96b9c7b-8tzt9\" (UID: \"7489e4cd-f69e-4e7f-8c2e-77f67f36af78\") " pod="opendatahub/maas-api-7df96b9c7b-8tzt9" Apr 17 14:32:06.507430 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:06.507026 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7489e4cd-f69e-4e7f-8c2e-77f67f36af78-maas-api-tls\") pod \"maas-api-7df96b9c7b-8tzt9\" (UID: \"7489e4cd-f69e-4e7f-8c2e-77f67f36af78\") " pod="opendatahub/maas-api-7df96b9c7b-8tzt9" Apr 17 14:32:06.507430 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:06.507080 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rd7gn\" (UniqueName: \"kubernetes.io/projected/7ab2e884-ff35-41df-a014-0878b5d523b5-kube-api-access-rd7gn\") pod \"maas-controller-7f8967969b-59rh4\" (UID: \"7ab2e884-ff35-41df-a014-0878b5d523b5\") " pod="opendatahub/maas-controller-7f8967969b-59rh4" Apr 17 14:32:06.507430 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:32:06.507201 2577 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found Apr 17 14:32:06.507430 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:32:06.507292 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7489e4cd-f69e-4e7f-8c2e-77f67f36af78-maas-api-tls podName:7489e4cd-f69e-4e7f-8c2e-77f67f36af78 nodeName:}" failed. No retries permitted until 2026-04-17 14:32:07.007259428 +0000 UTC m=+682.143266957 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/7489e4cd-f69e-4e7f-8c2e-77f67f36af78-maas-api-tls") pod "maas-api-7df96b9c7b-8tzt9" (UID: "7489e4cd-f69e-4e7f-8c2e-77f67f36af78") : secret "maas-api-serving-cert" not found Apr 17 14:32:06.517603 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:06.517575 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd7gn\" (UniqueName: \"kubernetes.io/projected/7ab2e884-ff35-41df-a014-0878b5d523b5-kube-api-access-rd7gn\") pod \"maas-controller-7f8967969b-59rh4\" (UID: \"7ab2e884-ff35-41df-a014-0878b5d523b5\") " pod="opendatahub/maas-controller-7f8967969b-59rh4" Apr 17 14:32:06.517744 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:06.517572 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vk88\" (UniqueName: \"kubernetes.io/projected/7489e4cd-f69e-4e7f-8c2e-77f67f36af78-kube-api-access-2vk88\") pod \"maas-api-7df96b9c7b-8tzt9\" (UID: \"7489e4cd-f69e-4e7f-8c2e-77f67f36af78\") " pod="opendatahub/maas-api-7df96b9c7b-8tzt9" Apr 17 14:32:06.614768 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:06.614736 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7f8967969b-59rh4" Apr 17 14:32:06.726144 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:06.726119 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7f8967969b-59rh4"] Apr 17 14:32:06.727861 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:32:06.727834 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ab2e884_ff35_41df_a014_0878b5d523b5.slice/crio-41945276a07b45773efe9f3e1d863c3168a70d710bca6c34eb1be58f7ddd67fb WatchSource:0}: Error finding container 41945276a07b45773efe9f3e1d863c3168a70d710bca6c34eb1be58f7ddd67fb: Status 404 returned error can't find the container with id 41945276a07b45773efe9f3e1d863c3168a70d710bca6c34eb1be58f7ddd67fb Apr 17 14:32:07.009452 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.009368 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-pm9x7"] Apr 17 14:32:07.012468 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.012438 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7489e4cd-f69e-4e7f-8c2e-77f67f36af78-maas-api-tls\") pod \"maas-api-7df96b9c7b-8tzt9\" (UID: \"7489e4cd-f69e-4e7f-8c2e-77f67f36af78\") " pod="opendatahub/maas-api-7df96b9c7b-8tzt9" Apr 17 14:32:07.014141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.014120 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-pm9x7" Apr 17 14:32:07.014868 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.014848 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7489e4cd-f69e-4e7f-8c2e-77f67f36af78-maas-api-tls\") pod \"maas-api-7df96b9c7b-8tzt9\" (UID: \"7489e4cd-f69e-4e7f-8c2e-77f67f36af78\") " pod="opendatahub/maas-api-7df96b9c7b-8tzt9" Apr 17 14:32:07.017109 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.017057 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-zc9dl\"" Apr 17 14:32:07.018318 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.018296 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-pm9x7"] Apr 17 14:32:07.113438 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.113406 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td6xb\" (UniqueName: \"kubernetes.io/projected/d965af3f-a965-4424-8d1f-c5eabbbaa181-kube-api-access-td6xb\") pod \"authorino-8b475cf9f-pm9x7\" (UID: \"d965af3f-a965-4424-8d1f-c5eabbbaa181\") " pod="kuadrant-system/authorino-8b475cf9f-pm9x7" Apr 17 14:32:07.193236 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.193204 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7df96b9c7b-8tzt9" Apr 17 14:32:07.214255 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.214229 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-td6xb\" (UniqueName: \"kubernetes.io/projected/d965af3f-a965-4424-8d1f-c5eabbbaa181-kube-api-access-td6xb\") pod \"authorino-8b475cf9f-pm9x7\" (UID: \"d965af3f-a965-4424-8d1f-c5eabbbaa181\") " pod="kuadrant-system/authorino-8b475cf9f-pm9x7" Apr 17 14:32:07.226915 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.226887 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-td6xb\" (UniqueName: \"kubernetes.io/projected/d965af3f-a965-4424-8d1f-c5eabbbaa181-kube-api-access-td6xb\") pod \"authorino-8b475cf9f-pm9x7\" (UID: \"d965af3f-a965-4424-8d1f-c5eabbbaa181\") " pod="kuadrant-system/authorino-8b475cf9f-pm9x7" Apr 17 14:32:07.236178 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.236146 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-pm9x7"] Apr 17 14:32:07.236386 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.236374 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-pm9x7" Apr 17 14:32:07.259831 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.259755 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-74747b4575-mjgkc"] Apr 17 14:32:07.264149 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.264124 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-74747b4575-mjgkc" Apr 17 14:32:07.271864 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.271577 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-74747b4575-mjgkc"] Apr 17 14:32:07.322828 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.322784 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-74747b4575-mjgkc"] Apr 17 14:32:07.323265 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:32:07.323229 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-xmf5h], unattached volumes=[], failed to process volumes=[kube-api-access-xmf5h]: context canceled" pod="kuadrant-system/authorino-74747b4575-mjgkc" podUID="012b4160-ef8d-4937-8803-2c20c51d9f42" Apr 17 14:32:07.325644 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.325621 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7df96b9c7b-8tzt9"] Apr 17 14:32:07.328028 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:32:07.327616 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7489e4cd_f69e_4e7f_8c2e_77f67f36af78.slice/crio-d56f54cc09a197644ea5776f481488a65ed68f74aaa466e0f8a0a66ab53fe9ad WatchSource:0}: Error finding container d56f54cc09a197644ea5776f481488a65ed68f74aaa466e0f8a0a66ab53fe9ad: Status 404 returned error can't find the container with id d56f54cc09a197644ea5776f481488a65ed68f74aaa466e0f8a0a66ab53fe9ad Apr 17 14:32:07.348524 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.348488 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68d7c88d47-jr99f"] Apr 17 14:32:07.352059 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.352030 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68d7c88d47-jr99f" Apr 17 14:32:07.354823 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.354635 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 17 14:32:07.357903 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.357869 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68d7c88d47-jr99f"] Apr 17 14:32:07.374750 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.374724 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-pm9x7"] Apr 17 14:32:07.376670 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:32:07.376645 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd965af3f_a965_4424_8d1f_c5eabbbaa181.slice/crio-5656b7c11c8ff55bbb950f521ada96f1b2c5805df3bf147f06d885c1a3b1d4a5 WatchSource:0}: Error finding container 5656b7c11c8ff55bbb950f521ada96f1b2c5805df3bf147f06d885c1a3b1d4a5: Status 404 returned error can't find the container with id 5656b7c11c8ff55bbb950f521ada96f1b2c5805df3bf147f06d885c1a3b1d4a5 Apr 17 14:32:07.415352 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.415320 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmf5h\" (UniqueName: \"kubernetes.io/projected/012b4160-ef8d-4937-8803-2c20c51d9f42-kube-api-access-xmf5h\") pod \"authorino-74747b4575-mjgkc\" (UID: \"012b4160-ef8d-4937-8803-2c20c51d9f42\") " pod="kuadrant-system/authorino-74747b4575-mjgkc" Apr 17 14:32:07.456561 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.456515 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7df96b9c7b-8tzt9" event={"ID":"7489e4cd-f69e-4e7f-8c2e-77f67f36af78","Type":"ContainerStarted","Data":"d56f54cc09a197644ea5776f481488a65ed68f74aaa466e0f8a0a66ab53fe9ad"} Apr 17 14:32:07.457960 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.457930 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7f8967969b-59rh4" event={"ID":"7ab2e884-ff35-41df-a014-0878b5d523b5","Type":"ContainerStarted","Data":"41945276a07b45773efe9f3e1d863c3168a70d710bca6c34eb1be58f7ddd67fb"} Apr 17 14:32:07.459350 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.459308 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-pm9x7" event={"ID":"d965af3f-a965-4424-8d1f-c5eabbbaa181","Type":"ContainerStarted","Data":"5656b7c11c8ff55bbb950f521ada96f1b2c5805df3bf147f06d885c1a3b1d4a5"} Apr 17 14:32:07.459350 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.459334 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-74747b4575-mjgkc" Apr 17 14:32:07.465383 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.465067 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-74747b4575-mjgkc" Apr 17 14:32:07.516303 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.516032 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlt86\" (UniqueName: \"kubernetes.io/projected/ac2c6c64-9cc6-4169-8a77-73a10d06e46b-kube-api-access-xlt86\") pod \"authorino-68d7c88d47-jr99f\" (UID: \"ac2c6c64-9cc6-4169-8a77-73a10d06e46b\") " pod="kuadrant-system/authorino-68d7c88d47-jr99f" Apr 17 14:32:07.516303 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.516095 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmf5h\" (UniqueName: \"kubernetes.io/projected/012b4160-ef8d-4937-8803-2c20c51d9f42-kube-api-access-xmf5h\") pod \"authorino-74747b4575-mjgkc\" (UID: \"012b4160-ef8d-4937-8803-2c20c51d9f42\") " pod="kuadrant-system/authorino-74747b4575-mjgkc" Apr 17 14:32:07.516303 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.516141 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ac2c6c64-9cc6-4169-8a77-73a10d06e46b-tls-cert\") pod \"authorino-68d7c88d47-jr99f\" (UID: \"ac2c6c64-9cc6-4169-8a77-73a10d06e46b\") " pod="kuadrant-system/authorino-68d7c88d47-jr99f" Apr 17 14:32:07.524858 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.524832 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmf5h\" (UniqueName: \"kubernetes.io/projected/012b4160-ef8d-4937-8803-2c20c51d9f42-kube-api-access-xmf5h\") pod \"authorino-74747b4575-mjgkc\" (UID: \"012b4160-ef8d-4937-8803-2c20c51d9f42\") " pod="kuadrant-system/authorino-74747b4575-mjgkc" Apr 17 14:32:07.617427 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.617393 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmf5h\" (UniqueName: \"kubernetes.io/projected/012b4160-ef8d-4937-8803-2c20c51d9f42-kube-api-access-xmf5h\") pod \"012b4160-ef8d-4937-8803-2c20c51d9f42\" (UID: \"012b4160-ef8d-4937-8803-2c20c51d9f42\") " Apr 17 14:32:07.617610 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.617540 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ac2c6c64-9cc6-4169-8a77-73a10d06e46b-tls-cert\") pod \"authorino-68d7c88d47-jr99f\" (UID: \"ac2c6c64-9cc6-4169-8a77-73a10d06e46b\") " pod="kuadrant-system/authorino-68d7c88d47-jr99f" Apr 17 14:32:07.617675 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.617643 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xlt86\" (UniqueName: \"kubernetes.io/projected/ac2c6c64-9cc6-4169-8a77-73a10d06e46b-kube-api-access-xlt86\") pod \"authorino-68d7c88d47-jr99f\" (UID: \"ac2c6c64-9cc6-4169-8a77-73a10d06e46b\") " pod="kuadrant-system/authorino-68d7c88d47-jr99f" Apr 17 14:32:07.620433 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.620400 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/012b4160-ef8d-4937-8803-2c20c51d9f42-kube-api-access-xmf5h" (OuterVolumeSpecName: "kube-api-access-xmf5h") pod "012b4160-ef8d-4937-8803-2c20c51d9f42" (UID: "012b4160-ef8d-4937-8803-2c20c51d9f42"). InnerVolumeSpecName "kube-api-access-xmf5h". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:32:07.621005 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.620954 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ac2c6c64-9cc6-4169-8a77-73a10d06e46b-tls-cert\") pod \"authorino-68d7c88d47-jr99f\" (UID: \"ac2c6c64-9cc6-4169-8a77-73a10d06e46b\") " pod="kuadrant-system/authorino-68d7c88d47-jr99f" Apr 17 14:32:07.629047 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.629025 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlt86\" (UniqueName: \"kubernetes.io/projected/ac2c6c64-9cc6-4169-8a77-73a10d06e46b-kube-api-access-xlt86\") pod \"authorino-68d7c88d47-jr99f\" (UID: \"ac2c6c64-9cc6-4169-8a77-73a10d06e46b\") " pod="kuadrant-system/authorino-68d7c88d47-jr99f" Apr 17 14:32:07.664030 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.663717 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68d7c88d47-jr99f" Apr 17 14:32:07.718367 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.718323 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xmf5h\" (UniqueName: \"kubernetes.io/projected/012b4160-ef8d-4937-8803-2c20c51d9f42-kube-api-access-xmf5h\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:32:07.833139 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:07.833112 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68d7c88d47-jr99f"] Apr 17 14:32:07.833783 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:32:07.833754 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac2c6c64_9cc6_4169_8a77_73a10d06e46b.slice/crio-c08617b91e9409f1c0204aca958507855bcc85d09688fee6c03252df9a749f9f WatchSource:0}: Error finding container c08617b91e9409f1c0204aca958507855bcc85d09688fee6c03252df9a749f9f: Status 404 returned error can't find the container with id c08617b91e9409f1c0204aca958507855bcc85d09688fee6c03252df9a749f9f Apr 17 14:32:08.467604 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:08.467516 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-pm9x7" event={"ID":"d965af3f-a965-4424-8d1f-c5eabbbaa181","Type":"ContainerStarted","Data":"f4cacd81b7b507d6a7423300ebf993847034649e9f1e712d12171eb6db2266c1"} Apr 17 14:32:08.467807 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:08.467684 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-pm9x7" podUID="d965af3f-a965-4424-8d1f-c5eabbbaa181" containerName="authorino" containerID="cri-o://f4cacd81b7b507d6a7423300ebf993847034649e9f1e712d12171eb6db2266c1" gracePeriod=30 Apr 17 14:32:08.470617 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:08.470456 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-74747b4575-mjgkc" Apr 17 14:32:08.470617 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:08.470492 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68d7c88d47-jr99f" event={"ID":"ac2c6c64-9cc6-4169-8a77-73a10d06e46b","Type":"ContainerStarted","Data":"4f402418560c5d15f685a873600d3ad36490ffb373a12906841315c11880d913"} Apr 17 14:32:08.470617 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:08.470514 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68d7c88d47-jr99f" event={"ID":"ac2c6c64-9cc6-4169-8a77-73a10d06e46b","Type":"ContainerStarted","Data":"c08617b91e9409f1c0204aca958507855bcc85d09688fee6c03252df9a749f9f"} Apr 17 14:32:08.488706 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:08.487959 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-pm9x7" podStartSLOduration=2.069343611 podStartE2EDuration="2.487939801s" podCreationTimestamp="2026-04-17 14:32:06 +0000 UTC" firstStartedPulling="2026-04-17 14:32:07.378133976 +0000 UTC m=+682.514141503" lastFinishedPulling="2026-04-17 14:32:07.79673015 +0000 UTC m=+682.932737693" observedRunningTime="2026-04-17 14:32:08.484391649 +0000 UTC m=+683.620399198" watchObservedRunningTime="2026-04-17 14:32:08.487939801 +0000 UTC m=+683.623947364" Apr 17 14:32:08.514740 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:08.514699 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-74747b4575-mjgkc"] Apr 17 14:32:08.517782 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:08.517747 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-74747b4575-mjgkc"] Apr 17 14:32:08.530386 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:08.528680 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68d7c88d47-jr99f" podStartSLOduration=1.2018634719999999 podStartE2EDuration="1.528658954s" podCreationTimestamp="2026-04-17 14:32:07 +0000 UTC" firstStartedPulling="2026-04-17 14:32:07.858602675 +0000 UTC m=+682.994610207" lastFinishedPulling="2026-04-17 14:32:08.185398155 +0000 UTC m=+683.321405689" observedRunningTime="2026-04-17 14:32:08.526623994 +0000 UTC m=+683.662631542" watchObservedRunningTime="2026-04-17 14:32:08.528658954 +0000 UTC m=+683.664666503" Apr 17 14:32:09.374495 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:09.374448 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="012b4160-ef8d-4937-8803-2c20c51d9f42" path="/var/lib/kubelet/pods/012b4160-ef8d-4937-8803-2c20c51d9f42/volumes" Apr 17 14:32:09.480936 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:09.480897 2577 generic.go:358] "Generic (PLEG): container finished" podID="d965af3f-a965-4424-8d1f-c5eabbbaa181" containerID="f4cacd81b7b507d6a7423300ebf993847034649e9f1e712d12171eb6db2266c1" exitCode=0 Apr 17 14:32:09.481113 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:09.480970 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-pm9x7" event={"ID":"d965af3f-a965-4424-8d1f-c5eabbbaa181","Type":"ContainerDied","Data":"f4cacd81b7b507d6a7423300ebf993847034649e9f1e712d12171eb6db2266c1"} Apr 17 14:32:10.189578 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:10.189554 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-pm9x7" Apr 17 14:32:10.362973 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:10.362934 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-td6xb\" (UniqueName: \"kubernetes.io/projected/d965af3f-a965-4424-8d1f-c5eabbbaa181-kube-api-access-td6xb\") pod \"d965af3f-a965-4424-8d1f-c5eabbbaa181\" (UID: \"d965af3f-a965-4424-8d1f-c5eabbbaa181\") " Apr 17 14:32:10.365139 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:10.365106 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d965af3f-a965-4424-8d1f-c5eabbbaa181-kube-api-access-td6xb" (OuterVolumeSpecName: "kube-api-access-td6xb") pod "d965af3f-a965-4424-8d1f-c5eabbbaa181" (UID: "d965af3f-a965-4424-8d1f-c5eabbbaa181"). InnerVolumeSpecName "kube-api-access-td6xb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:32:10.464558 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:10.464495 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-td6xb\" (UniqueName: \"kubernetes.io/projected/d965af3f-a965-4424-8d1f-c5eabbbaa181-kube-api-access-td6xb\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:32:10.485650 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:10.485615 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7df96b9c7b-8tzt9" event={"ID":"7489e4cd-f69e-4e7f-8c2e-77f67f36af78","Type":"ContainerStarted","Data":"9514c21e3fe2a8786eee1fb1ea2f9efb5df7330c537a1d891298bcf1a837c563"} Apr 17 14:32:10.485868 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:10.485707 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-7df96b9c7b-8tzt9" Apr 17 14:32:10.487030 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:10.487006 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7f8967969b-59rh4" event={"ID":"7ab2e884-ff35-41df-a014-0878b5d523b5","Type":"ContainerStarted","Data":"51c1d07beaa509b9b8ff84ae58753b606a5bee05ebfe938f3fe67b4c288f8306"} Apr 17 14:32:10.487144 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:10.487126 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-7f8967969b-59rh4" Apr 17 14:32:10.488119 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:10.488101 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-pm9x7" event={"ID":"d965af3f-a965-4424-8d1f-c5eabbbaa181","Type":"ContainerDied","Data":"5656b7c11c8ff55bbb950f521ada96f1b2c5805df3bf147f06d885c1a3b1d4a5"} Apr 17 14:32:10.488212 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:10.488124 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-pm9x7" Apr 17 14:32:10.488212 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:10.488127 2577 scope.go:117] "RemoveContainer" containerID="f4cacd81b7b507d6a7423300ebf993847034649e9f1e712d12171eb6db2266c1" Apr 17 14:32:10.505856 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:10.505819 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-7df96b9c7b-8tzt9" podStartSLOduration=1.6263889059999999 podStartE2EDuration="4.505808053s" podCreationTimestamp="2026-04-17 14:32:06 +0000 UTC" firstStartedPulling="2026-04-17 14:32:07.329307349 +0000 UTC m=+682.465314883" lastFinishedPulling="2026-04-17 14:32:10.208726499 +0000 UTC m=+685.344734030" observedRunningTime="2026-04-17 14:32:10.503294794 +0000 UTC m=+685.639302342" watchObservedRunningTime="2026-04-17 14:32:10.505808053 +0000 UTC m=+685.641815600" Apr 17 14:32:10.520500 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:10.520460 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7f8967969b-59rh4" podStartSLOduration=1.03722276 podStartE2EDuration="4.520449971s" podCreationTimestamp="2026-04-17 14:32:06 +0000 UTC" firstStartedPulling="2026-04-17 14:32:06.729041756 +0000 UTC m=+681.865049282" lastFinishedPulling="2026-04-17 14:32:10.212268959 +0000 UTC m=+685.348276493" observedRunningTime="2026-04-17 14:32:10.51867126 +0000 UTC m=+685.654678828" watchObservedRunningTime="2026-04-17 14:32:10.520449971 +0000 UTC m=+685.656457562" Apr 17 14:32:10.530685 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:10.530665 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-pm9x7"] Apr 17 14:32:10.538584 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:10.538564 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-pm9x7"] Apr 17 14:32:11.372327 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:11.372297 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d965af3f-a965-4424-8d1f-c5eabbbaa181" path="/var/lib/kubelet/pods/d965af3f-a965-4424-8d1f-c5eabbbaa181/volumes" Apr 17 14:32:16.498097 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:16.498068 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-7df96b9c7b-8tzt9" Apr 17 14:32:16.762061 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:16.761965 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-7df96b9c7b-8tzt9"] Apr 17 14:32:16.762320 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:16.762252 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-7df96b9c7b-8tzt9" podUID="7489e4cd-f69e-4e7f-8c2e-77f67f36af78" containerName="maas-api" containerID="cri-o://9514c21e3fe2a8786eee1fb1ea2f9efb5df7330c537a1d891298bcf1a837c563" gracePeriod=30 Apr 17 14:32:16.998700 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:16.998678 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7df96b9c7b-8tzt9" Apr 17 14:32:17.017587 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:17.017532 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7489e4cd-f69e-4e7f-8c2e-77f67f36af78-maas-api-tls\") pod \"7489e4cd-f69e-4e7f-8c2e-77f67f36af78\" (UID: \"7489e4cd-f69e-4e7f-8c2e-77f67f36af78\") " Apr 17 14:32:17.017587 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:17.017573 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vk88\" (UniqueName: \"kubernetes.io/projected/7489e4cd-f69e-4e7f-8c2e-77f67f36af78-kube-api-access-2vk88\") pod \"7489e4cd-f69e-4e7f-8c2e-77f67f36af78\" (UID: \"7489e4cd-f69e-4e7f-8c2e-77f67f36af78\") " Apr 17 14:32:17.019728 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:17.019698 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7489e4cd-f69e-4e7f-8c2e-77f67f36af78-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "7489e4cd-f69e-4e7f-8c2e-77f67f36af78" (UID: "7489e4cd-f69e-4e7f-8c2e-77f67f36af78"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:32:17.019942 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:17.019914 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7489e4cd-f69e-4e7f-8c2e-77f67f36af78-kube-api-access-2vk88" (OuterVolumeSpecName: "kube-api-access-2vk88") pod "7489e4cd-f69e-4e7f-8c2e-77f67f36af78" (UID: "7489e4cd-f69e-4e7f-8c2e-77f67f36af78"). InnerVolumeSpecName "kube-api-access-2vk88". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:32:17.118658 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:17.118630 2577 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7489e4cd-f69e-4e7f-8c2e-77f67f36af78-maas-api-tls\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:32:17.118658 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:17.118654 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2vk88\" (UniqueName: \"kubernetes.io/projected/7489e4cd-f69e-4e7f-8c2e-77f67f36af78-kube-api-access-2vk88\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:32:17.512751 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:17.512712 2577 generic.go:358] "Generic (PLEG): container finished" podID="7489e4cd-f69e-4e7f-8c2e-77f67f36af78" containerID="9514c21e3fe2a8786eee1fb1ea2f9efb5df7330c537a1d891298bcf1a837c563" exitCode=0 Apr 17 14:32:17.513149 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:17.512761 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7df96b9c7b-8tzt9" event={"ID":"7489e4cd-f69e-4e7f-8c2e-77f67f36af78","Type":"ContainerDied","Data":"9514c21e3fe2a8786eee1fb1ea2f9efb5df7330c537a1d891298bcf1a837c563"} Apr 17 14:32:17.513149 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:17.512783 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7df96b9c7b-8tzt9" event={"ID":"7489e4cd-f69e-4e7f-8c2e-77f67f36af78","Type":"ContainerDied","Data":"d56f54cc09a197644ea5776f481488a65ed68f74aaa466e0f8a0a66ab53fe9ad"} Apr 17 14:32:17.513149 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:17.512780 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7df96b9c7b-8tzt9" Apr 17 14:32:17.513149 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:17.512846 2577 scope.go:117] "RemoveContainer" containerID="9514c21e3fe2a8786eee1fb1ea2f9efb5df7330c537a1d891298bcf1a837c563" Apr 17 14:32:17.520699 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:17.520684 2577 scope.go:117] "RemoveContainer" containerID="9514c21e3fe2a8786eee1fb1ea2f9efb5df7330c537a1d891298bcf1a837c563" Apr 17 14:32:17.520955 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:32:17.520935 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9514c21e3fe2a8786eee1fb1ea2f9efb5df7330c537a1d891298bcf1a837c563\": container with ID starting with 9514c21e3fe2a8786eee1fb1ea2f9efb5df7330c537a1d891298bcf1a837c563 not found: ID does not exist" containerID="9514c21e3fe2a8786eee1fb1ea2f9efb5df7330c537a1d891298bcf1a837c563" Apr 17 14:32:17.521010 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:17.520963 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9514c21e3fe2a8786eee1fb1ea2f9efb5df7330c537a1d891298bcf1a837c563"} err="failed to get container status \"9514c21e3fe2a8786eee1fb1ea2f9efb5df7330c537a1d891298bcf1a837c563\": rpc error: code = NotFound desc = could not find container \"9514c21e3fe2a8786eee1fb1ea2f9efb5df7330c537a1d891298bcf1a837c563\": container with ID starting with 9514c21e3fe2a8786eee1fb1ea2f9efb5df7330c537a1d891298bcf1a837c563 not found: ID does not exist" Apr 17 14:32:17.529457 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:17.529433 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-7df96b9c7b-8tzt9"] Apr 17 14:32:17.532714 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:17.532693 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-7df96b9c7b-8tzt9"] Apr 17 14:32:19.372200 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:19.372165 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7489e4cd-f69e-4e7f-8c2e-77f67f36af78" path="/var/lib/kubelet/pods/7489e4cd-f69e-4e7f-8c2e-77f67f36af78/volumes" Apr 17 14:32:21.497349 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:21.497322 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-7f8967969b-59rh4" Apr 17 14:32:21.781317 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:21.778763 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-57b87ccd85-4kjbl"] Apr 17 14:32:21.781317 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:21.779396 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d965af3f-a965-4424-8d1f-c5eabbbaa181" containerName="authorino" Apr 17 14:32:21.781317 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:21.779415 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d965af3f-a965-4424-8d1f-c5eabbbaa181" containerName="authorino" Apr 17 14:32:21.781317 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:21.779430 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7489e4cd-f69e-4e7f-8c2e-77f67f36af78" containerName="maas-api" Apr 17 14:32:21.781317 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:21.779440 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7489e4cd-f69e-4e7f-8c2e-77f67f36af78" containerName="maas-api" Apr 17 14:32:21.781317 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:21.779563 2577 
memory_manager.go:356] "RemoveStaleState removing state" podUID="d965af3f-a965-4424-8d1f-c5eabbbaa181" containerName="authorino" Apr 17 14:32:21.781317 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:21.779578 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7489e4cd-f69e-4e7f-8c2e-77f67f36af78" containerName="maas-api" Apr 17 14:32:21.784895 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:21.784871 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-57b87ccd85-4kjbl" Apr 17 14:32:21.786514 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:21.786493 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-57b87ccd85-4kjbl"] Apr 17 14:32:21.849337 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:21.849307 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl8sx\" (UniqueName: \"kubernetes.io/projected/c9953d2d-674f-400b-86ce-077081c5b302-kube-api-access-cl8sx\") pod \"maas-controller-57b87ccd85-4kjbl\" (UID: \"c9953d2d-674f-400b-86ce-077081c5b302\") " pod="opendatahub/maas-controller-57b87ccd85-4kjbl" Apr 17 14:32:21.950358 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:21.950329 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cl8sx\" (UniqueName: \"kubernetes.io/projected/c9953d2d-674f-400b-86ce-077081c5b302-kube-api-access-cl8sx\") pod \"maas-controller-57b87ccd85-4kjbl\" (UID: \"c9953d2d-674f-400b-86ce-077081c5b302\") " pod="opendatahub/maas-controller-57b87ccd85-4kjbl" Apr 17 14:32:21.958795 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:21.958773 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl8sx\" (UniqueName: \"kubernetes.io/projected/c9953d2d-674f-400b-86ce-077081c5b302-kube-api-access-cl8sx\") pod \"maas-controller-57b87ccd85-4kjbl\" (UID: \"c9953d2d-674f-400b-86ce-077081c5b302\") " pod="opendatahub/maas-controller-57b87ccd85-4kjbl" Apr 17 14:32:22.094830 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:22.094810 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-57b87ccd85-4kjbl" Apr 17 14:32:22.208792 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:22.208771 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-57b87ccd85-4kjbl"] Apr 17 14:32:22.210434 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:32:22.210403 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9953d2d_674f_400b_86ce_077081c5b302.slice/crio-ffc30332b677a23dc9cb9eaed8c4883a56bc6dc9aa47b75bf8dbcbf83f1ea089 WatchSource:0}: Error finding container ffc30332b677a23dc9cb9eaed8c4883a56bc6dc9aa47b75bf8dbcbf83f1ea089: Status 404 returned error can't find the container with id ffc30332b677a23dc9cb9eaed8c4883a56bc6dc9aa47b75bf8dbcbf83f1ea089 Apr 17 14:32:22.532704 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:22.532626 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-57b87ccd85-4kjbl" event={"ID":"c9953d2d-674f-400b-86ce-077081c5b302","Type":"ContainerStarted","Data":"ffc30332b677a23dc9cb9eaed8c4883a56bc6dc9aa47b75bf8dbcbf83f1ea089"} Apr 17 14:32:23.537623 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:23.537589 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-57b87ccd85-4kjbl" event={"ID":"c9953d2d-674f-400b-86ce-077081c5b302","Type":"ContainerStarted","Data":"66afbb9ab567902473dfcb140b4a96dea1531eef595b100a3fa05064bf850e6e"} Apr 17 14:32:23.538048 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:23.537700 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-57b87ccd85-4kjbl" Apr 17 14:32:23.555939 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:23.555892 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-57b87ccd85-4kjbl" podStartSLOduration=2.230836735 podStartE2EDuration="2.555880813s" podCreationTimestamp="2026-04-17 14:32:21 +0000 UTC" firstStartedPulling="2026-04-17 14:32:22.21166958 +0000 UTC m=+697.347677109" lastFinishedPulling="2026-04-17 14:32:22.536713647 +0000 UTC m=+697.672721187" observedRunningTime="2026-04-17 14:32:23.553049618 +0000 UTC m=+698.689057168" watchObservedRunningTime="2026-04-17 14:32:23.555880813 +0000 UTC m=+698.691888360" Apr 17 14:32:34.547313 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:34.547223 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-57b87ccd85-4kjbl" Apr 17 14:32:34.584920 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:34.584893 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7f8967969b-59rh4"] Apr 17 14:32:34.585133 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:34.585109 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-7f8967969b-59rh4" podUID="7ab2e884-ff35-41df-a014-0878b5d523b5" containerName="manager" containerID="cri-o://51c1d07beaa509b9b8ff84ae58753b606a5bee05ebfe938f3fe67b4c288f8306" gracePeriod=10 Apr 17 14:32:34.824598 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:34.824577 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7f8967969b-59rh4" Apr 17 14:32:34.849478 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:34.849443 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd7gn\" (UniqueName: \"kubernetes.io/projected/7ab2e884-ff35-41df-a014-0878b5d523b5-kube-api-access-rd7gn\") pod \"7ab2e884-ff35-41df-a014-0878b5d523b5\" (UID: \"7ab2e884-ff35-41df-a014-0878b5d523b5\") " Apr 17 14:32:34.851349 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:34.851325 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ab2e884-ff35-41df-a014-0878b5d523b5-kube-api-access-rd7gn" (OuterVolumeSpecName: "kube-api-access-rd7gn") pod "7ab2e884-ff35-41df-a014-0878b5d523b5" (UID: "7ab2e884-ff35-41df-a014-0878b5d523b5"). InnerVolumeSpecName "kube-api-access-rd7gn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:32:34.950492 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:34.950461 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rd7gn\" (UniqueName: \"kubernetes.io/projected/7ab2e884-ff35-41df-a014-0878b5d523b5-kube-api-access-rd7gn\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:32:35.579240 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:35.579210 2577 generic.go:358] "Generic (PLEG): container finished" podID="7ab2e884-ff35-41df-a014-0878b5d523b5" containerID="51c1d07beaa509b9b8ff84ae58753b606a5bee05ebfe938f3fe67b4c288f8306" exitCode=0 Apr 17 14:32:35.579637 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:35.579295 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7f8967969b-59rh4" Apr 17 14:32:35.579637 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:35.579297 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7f8967969b-59rh4" event={"ID":"7ab2e884-ff35-41df-a014-0878b5d523b5","Type":"ContainerDied","Data":"51c1d07beaa509b9b8ff84ae58753b606a5bee05ebfe938f3fe67b4c288f8306"} Apr 17 14:32:35.579637 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:35.579334 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7f8967969b-59rh4" event={"ID":"7ab2e884-ff35-41df-a014-0878b5d523b5","Type":"ContainerDied","Data":"41945276a07b45773efe9f3e1d863c3168a70d710bca6c34eb1be58f7ddd67fb"} Apr 17 14:32:35.579637 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:35.579350 2577 scope.go:117] "RemoveContainer" containerID="51c1d07beaa509b9b8ff84ae58753b606a5bee05ebfe938f3fe67b4c288f8306" Apr 17 14:32:35.587391 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:35.587188 2577 scope.go:117] "RemoveContainer" containerID="51c1d07beaa509b9b8ff84ae58753b606a5bee05ebfe938f3fe67b4c288f8306" Apr 17 14:32:35.587439 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:32:35.587418 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51c1d07beaa509b9b8ff84ae58753b606a5bee05ebfe938f3fe67b4c288f8306\": container with ID starting with 51c1d07beaa509b9b8ff84ae58753b606a5bee05ebfe938f3fe67b4c288f8306 not found: ID does not exist" containerID="51c1d07beaa509b9b8ff84ae58753b606a5bee05ebfe938f3fe67b4c288f8306" Apr 17 14:32:35.587475 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:35.587442 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"51c1d07beaa509b9b8ff84ae58753b606a5bee05ebfe938f3fe67b4c288f8306"} err="failed to get container status \"51c1d07beaa509b9b8ff84ae58753b606a5bee05ebfe938f3fe67b4c288f8306\": rpc error: code = NotFound desc = could not find container \"51c1d07beaa509b9b8ff84ae58753b606a5bee05ebfe938f3fe67b4c288f8306\": container with ID starting with 51c1d07beaa509b9b8ff84ae58753b606a5bee05ebfe938f3fe67b4c288f8306 not found: ID does not exist" Apr 17 14:32:35.600890 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:35.600866 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7f8967969b-59rh4"] Apr 17 14:32:35.605845 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:35.605826 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-7f8967969b-59rh4"] Apr 17 14:32:37.372824 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:37.372777 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ab2e884-ff35-41df-a014-0878b5d523b5" path="/var/lib/kubelet/pods/7ab2e884-ff35-41df-a014-0878b5d523b5/volumes" Apr 17 14:32:49.786035 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.785993 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b"] Apr 17 14:32:49.786424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.786324 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ab2e884-ff35-41df-a014-0878b5d523b5" containerName="manager" Apr 17 14:32:49.786424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.786337 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ab2e884-ff35-41df-a014-0878b5d523b5" containerName="manager" Apr 17 14:32:49.786424 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.786397 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ab2e884-ff35-41df-a014-0878b5d523b5" containerName="manager" Apr 17 14:32:49.793232 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.793217 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" Apr 17 14:32:49.797531 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.797506 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 17 14:32:49.797660 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.797506 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 17 14:32:49.797968 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.797775 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 17 14:32:49.797968 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.797864 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-m8hqm\"" Apr 17 14:32:49.800351 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.800330 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b"] Apr 17 14:32:49.853666 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.853638 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cdfaf532-9147-4c9e-a080-637764031980-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b\" (UID: \"cdfaf532-9147-4c9e-a080-637764031980\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" Apr 17 14:32:49.853776 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.853678 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tvps\" (UniqueName: \"kubernetes.io/projected/cdfaf532-9147-4c9e-a080-637764031980-kube-api-access-4tvps\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b\" (UID: \"cdfaf532-9147-4c9e-a080-637764031980\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" Apr 17 14:32:49.853819 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.853794 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cdfaf532-9147-4c9e-a080-637764031980-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b\" (UID: \"cdfaf532-9147-4c9e-a080-637764031980\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" Apr 17 14:32:49.853856 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.853821 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cdfaf532-9147-4c9e-a080-637764031980-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b\" (UID: \"cdfaf532-9147-4c9e-a080-637764031980\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" Apr 17 14:32:49.853903 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.853853 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cdfaf532-9147-4c9e-a080-637764031980-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b\" (UID: \"cdfaf532-9147-4c9e-a080-637764031980\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" Apr 17 14:32:49.853903 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.853875 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cdfaf532-9147-4c9e-a080-637764031980-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b\" (UID: \"cdfaf532-9147-4c9e-a080-637764031980\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" Apr 17 14:32:49.954766 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.954729 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cdfaf532-9147-4c9e-a080-637764031980-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b\" (UID: \"cdfaf532-9147-4c9e-a080-637764031980\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" Apr 17 14:32:49.954934 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.954782 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4tvps\" (UniqueName: \"kubernetes.io/projected/cdfaf532-9147-4c9e-a080-637764031980-kube-api-access-4tvps\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b\" (UID: \"cdfaf532-9147-4c9e-a080-637764031980\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" Apr 17 14:32:49.954934 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.954838 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cdfaf532-9147-4c9e-a080-637764031980-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b\" (UID: \"cdfaf532-9147-4c9e-a080-637764031980\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" Apr 17 14:32:49.954934 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.954866 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cdfaf532-9147-4c9e-a080-637764031980-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b\" (UID: \"cdfaf532-9147-4c9e-a080-637764031980\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" Apr 17 14:32:49.954934 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.954899 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cdfaf532-9147-4c9e-a080-637764031980-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b\" (UID: \"cdfaf532-9147-4c9e-a080-637764031980\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" Apr 17 14:32:49.954934 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.954918 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cdfaf532-9147-4c9e-a080-637764031980-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b\" (UID: \"cdfaf532-9147-4c9e-a080-637764031980\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" Apr 17 14:32:49.955184 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.955094 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cdfaf532-9147-4c9e-a080-637764031980-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b\" (UID: \"cdfaf532-9147-4c9e-a080-637764031980\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" Apr 17 14:32:49.955239 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.955196 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cdfaf532-9147-4c9e-a080-637764031980-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b\" (UID: \"cdfaf532-9147-4c9e-a080-637764031980\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" Apr 17 14:32:49.955347 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.955327 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cdfaf532-9147-4c9e-a080-637764031980-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b\" (UID: \"cdfaf532-9147-4c9e-a080-637764031980\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" Apr 17 14:32:49.957037 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.957007 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cdfaf532-9147-4c9e-a080-637764031980-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b\" (UID: \"cdfaf532-9147-4c9e-a080-637764031980\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" Apr 17 14:32:49.957207 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.957188 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cdfaf532-9147-4c9e-a080-637764031980-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b\" (UID: \"cdfaf532-9147-4c9e-a080-637764031980\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" Apr 17 14:32:49.962434 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:49.962415 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tvps\" (UniqueName: \"kubernetes.io/projected/cdfaf532-9147-4c9e-a080-637764031980-kube-api-access-4tvps\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b\" (UID: \"cdfaf532-9147-4c9e-a080-637764031980\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" Apr 17 14:32:50.103781 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:50.103747 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" Apr 17 14:32:50.244394 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:50.244369 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b"] Apr 17 14:32:50.245638 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:32:50.245613 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdfaf532_9147_4c9e_a080_637764031980.slice/crio-32a8e7e09164819fe11e2aa7cc5dcaa976c2137a2f7cfc8b05fb3d9f4703750b WatchSource:0}: Error finding container 32a8e7e09164819fe11e2aa7cc5dcaa976c2137a2f7cfc8b05fb3d9f4703750b: Status 404 returned error can't find the container with id 32a8e7e09164819fe11e2aa7cc5dcaa976c2137a2f7cfc8b05fb3d9f4703750b Apr 17 14:32:50.247268 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:50.247250 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:32:50.637912 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:50.637881 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" event={"ID":"cdfaf532-9147-4c9e-a080-637764031980","Type":"ContainerStarted","Data":"32a8e7e09164819fe11e2aa7cc5dcaa976c2137a2f7cfc8b05fb3d9f4703750b"} Apr 17 14:32:55.661661 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:32:55.661625 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" event={"ID":"cdfaf532-9147-4c9e-a080-637764031980","Type":"ContainerStarted","Data":"81dc77b6d01389da3b4acb29433c984970456abf17a18d782a54806e1c4e9d19"} Apr 17 14:33:00.682700 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:00.682617 2577 generic.go:358] "Generic (PLEG): container finished" podID="cdfaf532-9147-4c9e-a080-637764031980" containerID="81dc77b6d01389da3b4acb29433c984970456abf17a18d782a54806e1c4e9d19" exitCode=0 Apr 17 14:33:00.682700 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:00.682656 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" event={"ID":"cdfaf532-9147-4c9e-a080-637764031980","Type":"ContainerDied","Data":"81dc77b6d01389da3b4acb29433c984970456abf17a18d782a54806e1c4e9d19"} Apr 17 14:33:04.701006 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:04.700914 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" event={"ID":"cdfaf532-9147-4c9e-a080-637764031980","Type":"ContainerStarted","Data":"3e1f199662697f08eee60038c402980cfaee80c9d62b0cc3a6b914b23847611d"} Apr 17 14:33:04.701406 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:04.701194 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" Apr 17 14:33:04.719002 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:04.718953 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" podStartSLOduration=1.686667168 podStartE2EDuration="15.71893943s" podCreationTimestamp="2026-04-17 14:32:49 +0000 UTC" firstStartedPulling="2026-04-17 14:32:50.247427249 +0000 UTC m=+725.383434776" lastFinishedPulling="2026-04-17 14:33:04.279699504 +0000 UTC m=+739.415707038" observedRunningTime="2026-04-17 14:33:04.718071952 +0000 UTC m=+739.854079502" watchObservedRunningTime="2026-04-17 14:33:04.71893943 
+0000 UTC m=+739.854946989" Apr 17 14:33:15.717597 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:15.717569 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b" Apr 17 14:33:27.684777 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:27.684748 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647"] Apr 17 14:33:27.701088 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:27.701066 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647"] Apr 17 14:33:27.701225 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:27.701198 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" Apr 17 14:33:27.703970 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:27.703953 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 17 14:33:27.867813 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:27.867779 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0df7fabf-4613-4e5a-a338-e16ed7e918fe-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-z7647\" (UID: \"0df7fabf-4613-4e5a-a338-e16ed7e918fe\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" Apr 17 14:33:27.867951 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:27.867882 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx9r2\" (UniqueName: \"kubernetes.io/projected/0df7fabf-4613-4e5a-a338-e16ed7e918fe-kube-api-access-gx9r2\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-z7647\" (UID: \"0df7fabf-4613-4e5a-a338-e16ed7e918fe\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" Apr 17 14:33:27.867951 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:27.867913 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0df7fabf-4613-4e5a-a338-e16ed7e918fe-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-z7647\" (UID: \"0df7fabf-4613-4e5a-a338-e16ed7e918fe\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" Apr 17 14:33:27.868040 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:27.868006 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0df7fabf-4613-4e5a-a338-e16ed7e918fe-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-z7647\" (UID: \"0df7fabf-4613-4e5a-a338-e16ed7e918fe\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" Apr 17 14:33:27.868074 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:27.868036 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0df7fabf-4613-4e5a-a338-e16ed7e918fe-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-z7647\" (UID: \"0df7fabf-4613-4e5a-a338-e16ed7e918fe\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" Apr 17 14:33:27.868074 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:27.868052 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0df7fabf-4613-4e5a-a338-e16ed7e918fe-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-z7647\" (UID: \"0df7fabf-4613-4e5a-a338-e16ed7e918fe\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" Apr 17 14:33:27.969125 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:27.969047 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gx9r2\" (UniqueName: \"kubernetes.io/projected/0df7fabf-4613-4e5a-a338-e16ed7e918fe-kube-api-access-gx9r2\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-z7647\" (UID: \"0df7fabf-4613-4e5a-a338-e16ed7e918fe\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" Apr 17 14:33:27.969125 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:27.969089 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0df7fabf-4613-4e5a-a338-e16ed7e918fe-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-z7647\" (UID: \"0df7fabf-4613-4e5a-a338-e16ed7e918fe\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" Apr 17 14:33:27.969441 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:27.969201 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0df7fabf-4613-4e5a-a338-e16ed7e918fe-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-z7647\" (UID: \"0df7fabf-4613-4e5a-a338-e16ed7e918fe\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" Apr 17 14:33:27.969441 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:27.969223 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0df7fabf-4613-4e5a-a338-e16ed7e918fe-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-z7647\" (UID: \"0df7fabf-4613-4e5a-a338-e16ed7e918fe\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" Apr 17 14:33:27.969441 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:27.969236 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0df7fabf-4613-4e5a-a338-e16ed7e918fe-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-z7647\" (UID: \"0df7fabf-4613-4e5a-a338-e16ed7e918fe\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" Apr 17 14:33:27.969441 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:27.969254 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0df7fabf-4613-4e5a-a338-e16ed7e918fe-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-z7647\" (UID: \"0df7fabf-4613-4e5a-a338-e16ed7e918fe\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" Apr 17 14:33:27.969729 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:27.969673 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0df7fabf-4613-4e5a-a338-e16ed7e918fe-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-z7647\" (UID: \"0df7fabf-4613-4e5a-a338-e16ed7e918fe\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" Apr 17 14:33:27.969729 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:27.969693 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/0df7fabf-4613-4e5a-a338-e16ed7e918fe-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-z7647\" (UID: \"0df7fabf-4613-4e5a-a338-e16ed7e918fe\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" Apr 17 14:33:27.969817 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:27.969738 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0df7fabf-4613-4e5a-a338-e16ed7e918fe-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-z7647\" (UID: \"0df7fabf-4613-4e5a-a338-e16ed7e918fe\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" Apr 17 14:33:27.971496 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:27.971478 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0df7fabf-4613-4e5a-a338-e16ed7e918fe-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-z7647\" (UID: \"0df7fabf-4613-4e5a-a338-e16ed7e918fe\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" Apr 17 14:33:27.971573 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:27.971557 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0df7fabf-4613-4e5a-a338-e16ed7e918fe-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-z7647\" (UID: \"0df7fabf-4613-4e5a-a338-e16ed7e918fe\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" Apr 17 14:33:27.976707 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:27.976681 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx9r2\" (UniqueName: \"kubernetes.io/projected/0df7fabf-4613-4e5a-a338-e16ed7e918fe-kube-api-access-gx9r2\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-z7647\" (UID: \"0df7fabf-4613-4e5a-a338-e16ed7e918fe\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" Apr 17 14:33:28.013294 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:28.013259 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" Apr 17 14:33:28.134875 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:28.134844 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647"] Apr 17 14:33:28.135962 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:33:28.135926 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0df7fabf_4613_4e5a_a338_e16ed7e918fe.slice/crio-fd86968eac88b8c964b02674362059ffd1c8f75c8b0f5e241b7ceb1d3541189a WatchSource:0}: Error finding container fd86968eac88b8c964b02674362059ffd1c8f75c8b0f5e241b7ceb1d3541189a: Status 404 returned error can't find the container with id fd86968eac88b8c964b02674362059ffd1c8f75c8b0f5e241b7ceb1d3541189a Apr 17 14:33:28.783106 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:28.783071 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" event={"ID":"0df7fabf-4613-4e5a-a338-e16ed7e918fe","Type":"ContainerStarted","Data":"eb6d42ef14dea5420bf27f32fa5256f0b555c5ae3f0ccb4eae469cf209a7263f"} Apr 17 14:33:28.783106 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:28.783109 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" event={"ID":"0df7fabf-4613-4e5a-a338-e16ed7e918fe","Type":"ContainerStarted","Data":"fd86968eac88b8c964b02674362059ffd1c8f75c8b0f5e241b7ceb1d3541189a"} Apr 17 14:33:33.802081 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:33.802001 2577 generic.go:358] "Generic (PLEG): container finished" podID="0df7fabf-4613-4e5a-a338-e16ed7e918fe" containerID="eb6d42ef14dea5420bf27f32fa5256f0b555c5ae3f0ccb4eae469cf209a7263f" exitCode=0 Apr 17 14:33:33.802475 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:33.802077 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" event={"ID":"0df7fabf-4613-4e5a-a338-e16ed7e918fe","Type":"ContainerDied","Data":"eb6d42ef14dea5420bf27f32fa5256f0b555c5ae3f0ccb4eae469cf209a7263f"} Apr 17 14:33:34.806823 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:34.806788 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" event={"ID":"0df7fabf-4613-4e5a-a338-e16ed7e918fe","Type":"ContainerStarted","Data":"58054cd749e5f6160f921fce8c96d64ff2e57bc74cac3c1a21d16ca25435a582"} Apr 17 14:33:34.807208 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:34.806990 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" Apr 17 14:33:34.825199 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:34.825155 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" podStartSLOduration=7.551222083 podStartE2EDuration="7.825142196s" podCreationTimestamp="2026-04-17 14:33:27 +0000 UTC" firstStartedPulling="2026-04-17 14:33:33.802714979 +0000 UTC m=+768.938722506" lastFinishedPulling="2026-04-17 14:33:34.076635092 +0000 UTC m=+769.212642619" observedRunningTime="2026-04-17 14:33:34.82315044 +0000 UTC m=+769.959158000" watchObservedRunningTime="2026-04-17 14:33:34.825142196 +0000 UTC m=+769.961149744" Apr 17 14:33:45.822671 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:45.822643 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-z7647" Apr 17 14:33:52.012015 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:52.011983 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8557988b6b-rl4td"] Apr 17 14:33:52.037456 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:52.037426 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8557988b6b-rl4td"] Apr 17 14:33:52.037588 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:52.037528 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8557988b6b-rl4td" Apr 17 14:33:52.145336 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:52.145312 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/293f5ae8-6f46-4e42-89a4-a4acbdf1911f-tls-cert\") pod \"authorino-8557988b6b-rl4td\" (UID: \"293f5ae8-6f46-4e42-89a4-a4acbdf1911f\") " pod="kuadrant-system/authorino-8557988b6b-rl4td" Apr 17 14:33:52.145453 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:52.145357 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-927zb\" (UniqueName: \"kubernetes.io/projected/293f5ae8-6f46-4e42-89a4-a4acbdf1911f-kube-api-access-927zb\") pod \"authorino-8557988b6b-rl4td\" (UID: \"293f5ae8-6f46-4e42-89a4-a4acbdf1911f\") " pod="kuadrant-system/authorino-8557988b6b-rl4td" Apr 17 14:33:52.246334 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:52.246299 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/293f5ae8-6f46-4e42-89a4-a4acbdf1911f-tls-cert\") pod \"authorino-8557988b6b-rl4td\" (UID: \"293f5ae8-6f46-4e42-89a4-a4acbdf1911f\") " pod="kuadrant-system/authorino-8557988b6b-rl4td" Apr 17 14:33:52.246460 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:52.246359 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-927zb\" (UniqueName: \"kubernetes.io/projected/293f5ae8-6f46-4e42-89a4-a4acbdf1911f-kube-api-access-927zb\") pod \"authorino-8557988b6b-rl4td\" (UID: \"293f5ae8-6f46-4e42-89a4-a4acbdf1911f\") " pod="kuadrant-system/authorino-8557988b6b-rl4td" Apr 17 14:33:52.248596 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:52.248567 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/293f5ae8-6f46-4e42-89a4-a4acbdf1911f-tls-cert\") pod \"authorino-8557988b6b-rl4td\" (UID: \"293f5ae8-6f46-4e42-89a4-a4acbdf1911f\") " pod="kuadrant-system/authorino-8557988b6b-rl4td" Apr 17 14:33:52.254082 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:52.254055 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-927zb\" (UniqueName: \"kubernetes.io/projected/293f5ae8-6f46-4e42-89a4-a4acbdf1911f-kube-api-access-927zb\") pod \"authorino-8557988b6b-rl4td\" (UID: \"293f5ae8-6f46-4e42-89a4-a4acbdf1911f\") " pod="kuadrant-system/authorino-8557988b6b-rl4td" Apr 17 14:33:52.346206 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:52.346180 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8557988b6b-rl4td" Apr 17 14:33:52.463357 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:52.463333 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8557988b6b-rl4td"] Apr 17 14:33:52.464684 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:33:52.464660 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod293f5ae8_6f46_4e42_89a4_a4acbdf1911f.slice/crio-a5effbc0065ba4934f3b3497c2105d694ac5f6cb3750792b80ef871b516b3281 WatchSource:0}: Error finding container a5effbc0065ba4934f3b3497c2105d694ac5f6cb3750792b80ef871b516b3281: Status 404 returned error can't find the container with id a5effbc0065ba4934f3b3497c2105d694ac5f6cb3750792b80ef871b516b3281 Apr 17 14:33:52.868679 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:52.868649 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8557988b6b-rl4td" event={"ID":"293f5ae8-6f46-4e42-89a4-a4acbdf1911f","Type":"ContainerStarted","Data":"a5effbc0065ba4934f3b3497c2105d694ac5f6cb3750792b80ef871b516b3281"} Apr 17 14:33:53.873639 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:53.873593 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8557988b6b-rl4td" event={"ID":"293f5ae8-6f46-4e42-89a4-a4acbdf1911f","Type":"ContainerStarted","Data":"4ebad41a6d84835f6eab31516a9f66824b4fcc95dee4b49372f2562ae1e06b67"} Apr 17 14:33:53.889152 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:53.889107 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8557988b6b-rl4td" podStartSLOduration=2.507122312 podStartE2EDuration="2.889096658s" podCreationTimestamp="2026-04-17 14:33:51 +0000 UTC" firstStartedPulling="2026-04-17 14:33:52.466123184 +0000 UTC m=+787.602130710" lastFinishedPulling="2026-04-17 14:33:52.848097517 +0000 UTC m=+787.984105056" observedRunningTime="2026-04-17 14:33:53.886575866 +0000 UTC m=+789.022583416" watchObservedRunningTime="2026-04-17 14:33:53.889096658 +0000 UTC m=+789.025104206" Apr 17 14:33:53.911953 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:53.911926 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-68d7c88d47-jr99f"] Apr 17 14:33:53.912141 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:53.912114 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-68d7c88d47-jr99f" podUID="ac2c6c64-9cc6-4169-8a77-73a10d06e46b" containerName="authorino" containerID="cri-o://4f402418560c5d15f685a873600d3ad36490ffb373a12906841315c11880d913" gracePeriod=30 Apr 17 14:33:54.147283 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:54.147248 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68d7c88d47-jr99f" Apr 17 14:33:54.162881 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:54.162862 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ac2c6c64-9cc6-4169-8a77-73a10d06e46b-tls-cert\") pod \"ac2c6c64-9cc6-4169-8a77-73a10d06e46b\" (UID: \"ac2c6c64-9cc6-4169-8a77-73a10d06e46b\") " Apr 17 14:33:54.162963 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:54.162899 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlt86\" (UniqueName: \"kubernetes.io/projected/ac2c6c64-9cc6-4169-8a77-73a10d06e46b-kube-api-access-xlt86\") pod \"ac2c6c64-9cc6-4169-8a77-73a10d06e46b\" (UID: \"ac2c6c64-9cc6-4169-8a77-73a10d06e46b\") " Apr 17 14:33:54.173958 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:54.172701 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac2c6c64-9cc6-4169-8a77-73a10d06e46b-kube-api-access-xlt86" (OuterVolumeSpecName: "kube-api-access-xlt86") pod "ac2c6c64-9cc6-4169-8a77-73a10d06e46b" (UID: "ac2c6c64-9cc6-4169-8a77-73a10d06e46b"). InnerVolumeSpecName "kube-api-access-xlt86". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:33:54.180884 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:54.180859 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac2c6c64-9cc6-4169-8a77-73a10d06e46b-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "ac2c6c64-9cc6-4169-8a77-73a10d06e46b" (UID: "ac2c6c64-9cc6-4169-8a77-73a10d06e46b"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 14:33:54.263668 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:54.263644 2577 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ac2c6c64-9cc6-4169-8a77-73a10d06e46b-tls-cert\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:33:54.263668 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:54.263667 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xlt86\" (UniqueName: \"kubernetes.io/projected/ac2c6c64-9cc6-4169-8a77-73a10d06e46b-kube-api-access-xlt86\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:33:54.878349 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:54.878304 2577 generic.go:358] "Generic (PLEG): container finished" podID="ac2c6c64-9cc6-4169-8a77-73a10d06e46b" containerID="4f402418560c5d15f685a873600d3ad36490ffb373a12906841315c11880d913" exitCode=0 Apr 17 14:33:54.878774 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:54.878368 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68d7c88d47-jr99f" Apr 17 14:33:54.878774 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:54.878379 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68d7c88d47-jr99f" event={"ID":"ac2c6c64-9cc6-4169-8a77-73a10d06e46b","Type":"ContainerDied","Data":"4f402418560c5d15f685a873600d3ad36490ffb373a12906841315c11880d913"} Apr 17 14:33:54.878774 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:54.878416 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68d7c88d47-jr99f" event={"ID":"ac2c6c64-9cc6-4169-8a77-73a10d06e46b","Type":"ContainerDied","Data":"c08617b91e9409f1c0204aca958507855bcc85d09688fee6c03252df9a749f9f"} Apr 17 14:33:54.878774 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:54.878429 2577 scope.go:117] "RemoveContainer" containerID="4f402418560c5d15f685a873600d3ad36490ffb373a12906841315c11880d913" Apr 17 14:33:54.887165 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:54.887148 2577 scope.go:117] "RemoveContainer" containerID="4f402418560c5d15f685a873600d3ad36490ffb373a12906841315c11880d913" Apr 17 14:33:54.887449 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:33:54.887430 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f402418560c5d15f685a873600d3ad36490ffb373a12906841315c11880d913\": container with ID starting with 4f402418560c5d15f685a873600d3ad36490ffb373a12906841315c11880d913 not found: ID does not exist" containerID="4f402418560c5d15f685a873600d3ad36490ffb373a12906841315c11880d913" Apr 17 14:33:54.887509 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:54.887456 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f402418560c5d15f685a873600d3ad36490ffb373a12906841315c11880d913"} err="failed to get container status \"4f402418560c5d15f685a873600d3ad36490ffb373a12906841315c11880d913\": rpc error: code = NotFound desc = could not find container \"4f402418560c5d15f685a873600d3ad36490ffb373a12906841315c11880d913\": container with ID starting with 4f402418560c5d15f685a873600d3ad36490ffb373a12906841315c11880d913 not found: ID does not exist" Apr 17 14:33:54.898656 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:54.898633 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-68d7c88d47-jr99f"] Apr 17 14:33:54.902903 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:54.902883 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-68d7c88d47-jr99f"] Apr 17 14:33:55.378332 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:33:55.378295 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac2c6c64-9cc6-4169-8a77-73a10d06e46b" path="/var/lib/kubelet/pods/ac2c6c64-9cc6-4169-8a77-73a10d06e46b/volumes" Apr 17 14:35:35.232105 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:35.232032 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-57b87ccd85-4kjbl"] Apr 17 14:35:35.232588 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:35.232250 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-57b87ccd85-4kjbl" podUID="c9953d2d-674f-400b-86ce-077081c5b302" containerName="manager" containerID="cri-o://66afbb9ab567902473dfcb140b4a96dea1531eef595b100a3fa05064bf850e6e" gracePeriod=10 Apr 17 14:35:35.468829 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:35.468809 2577 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-57b87ccd85-4kjbl" Apr 17 14:35:35.569398 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:35.569322 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl8sx\" (UniqueName: \"kubernetes.io/projected/c9953d2d-674f-400b-86ce-077081c5b302-kube-api-access-cl8sx\") pod \"c9953d2d-674f-400b-86ce-077081c5b302\" (UID: \"c9953d2d-674f-400b-86ce-077081c5b302\") " Apr 17 14:35:35.571264 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:35.571235 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9953d2d-674f-400b-86ce-077081c5b302-kube-api-access-cl8sx" (OuterVolumeSpecName: "kube-api-access-cl8sx") pod "c9953d2d-674f-400b-86ce-077081c5b302" (UID: "c9953d2d-674f-400b-86ce-077081c5b302"). InnerVolumeSpecName "kube-api-access-cl8sx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 14:35:35.670766 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:35.670738 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cl8sx\" (UniqueName: \"kubernetes.io/projected/c9953d2d-674f-400b-86ce-077081c5b302-kube-api-access-cl8sx\") on node \"ip-10-0-132-119.ec2.internal\" DevicePath \"\"" Apr 17 14:35:36.228520 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:36.228483 2577 generic.go:358] "Generic (PLEG): container finished" podID="c9953d2d-674f-400b-86ce-077081c5b302" containerID="66afbb9ab567902473dfcb140b4a96dea1531eef595b100a3fa05064bf850e6e" exitCode=0 Apr 17 14:35:36.228709 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:36.228539 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-57b87ccd85-4kjbl" Apr 17 14:35:36.228709 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:36.228569 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-57b87ccd85-4kjbl" event={"ID":"c9953d2d-674f-400b-86ce-077081c5b302","Type":"ContainerDied","Data":"66afbb9ab567902473dfcb140b4a96dea1531eef595b100a3fa05064bf850e6e"} Apr 17 14:35:36.228709 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:36.228613 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-57b87ccd85-4kjbl" event={"ID":"c9953d2d-674f-400b-86ce-077081c5b302","Type":"ContainerDied","Data":"ffc30332b677a23dc9cb9eaed8c4883a56bc6dc9aa47b75bf8dbcbf83f1ea089"} Apr 17 14:35:36.228709 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:36.228633 2577 scope.go:117] "RemoveContainer" containerID="66afbb9ab567902473dfcb140b4a96dea1531eef595b100a3fa05064bf850e6e" Apr 17 14:35:36.239440 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:36.239422 2577 scope.go:117] "RemoveContainer" containerID="66afbb9ab567902473dfcb140b4a96dea1531eef595b100a3fa05064bf850e6e" Apr 17 14:35:36.239750 ip-10-0-132-119 kubenswrapper[2577]: E0417 14:35:36.239732 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66afbb9ab567902473dfcb140b4a96dea1531eef595b100a3fa05064bf850e6e\": container with ID starting with 66afbb9ab567902473dfcb140b4a96dea1531eef595b100a3fa05064bf850e6e not found: ID does not exist" containerID="66afbb9ab567902473dfcb140b4a96dea1531eef595b100a3fa05064bf850e6e" Apr 17 14:35:36.239797 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:36.239760 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"66afbb9ab567902473dfcb140b4a96dea1531eef595b100a3fa05064bf850e6e"} err="failed to get container status \"66afbb9ab567902473dfcb140b4a96dea1531eef595b100a3fa05064bf850e6e\": rpc error: code = NotFound desc = could not find container \"66afbb9ab567902473dfcb140b4a96dea1531eef595b100a3fa05064bf850e6e\": container with ID starting with 66afbb9ab567902473dfcb140b4a96dea1531eef595b100a3fa05064bf850e6e not found: ID does not exist" Apr 17 14:35:36.249573 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:36.249550 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-57b87ccd85-4kjbl"] Apr 17 14:35:36.251417 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:36.251400 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-57b87ccd85-4kjbl"] Apr 17 14:35:36.937443 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:36.937409 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-57b87ccd85-98gfl"] Apr 17 14:35:36.937803 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:36.937786 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac2c6c64-9cc6-4169-8a77-73a10d06e46b" containerName="authorino" Apr 17 14:35:36.937870 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:36.937805 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac2c6c64-9cc6-4169-8a77-73a10d06e46b" containerName="authorino" Apr 17 14:35:36.937870 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:36.937825 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9953d2d-674f-400b-86ce-077081c5b302" containerName="manager" Apr 17 14:35:36.937870 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:36.937831 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9953d2d-674f-400b-86ce-077081c5b302" containerName="manager" Apr 17 14:35:36.937978 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:36.937883 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9953d2d-674f-400b-86ce-077081c5b302" containerName="manager" Apr 17 14:35:36.937978 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:36.937891 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac2c6c64-9cc6-4169-8a77-73a10d06e46b" containerName="authorino" Apr 17 14:35:36.941979 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:36.941960 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-57b87ccd85-98gfl" Apr 17 14:35:36.945053 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:36.945020 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-48j2d\"" Apr 17 14:35:36.947086 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:36.947063 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-57b87ccd85-98gfl"] Apr 17 14:35:37.082391 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:37.082357 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg4kh\" (UniqueName: \"kubernetes.io/projected/81601fa4-37d7-4b8e-ba00-db28a6352193-kube-api-access-lg4kh\") pod \"maas-controller-57b87ccd85-98gfl\" (UID: \"81601fa4-37d7-4b8e-ba00-db28a6352193\") " pod="opendatahub/maas-controller-57b87ccd85-98gfl" Apr 17 14:35:37.182946 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:37.182909 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lg4kh\" (UniqueName: \"kubernetes.io/projected/81601fa4-37d7-4b8e-ba00-db28a6352193-kube-api-access-lg4kh\") pod \"maas-controller-57b87ccd85-98gfl\" (UID: \"81601fa4-37d7-4b8e-ba00-db28a6352193\") " pod="opendatahub/maas-controller-57b87ccd85-98gfl" Apr 17 14:35:37.191458 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:37.191397 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg4kh\" (UniqueName: \"kubernetes.io/projected/81601fa4-37d7-4b8e-ba00-db28a6352193-kube-api-access-lg4kh\") pod \"maas-controller-57b87ccd85-98gfl\" (UID: \"81601fa4-37d7-4b8e-ba00-db28a6352193\") " pod="opendatahub/maas-controller-57b87ccd85-98gfl" Apr 17 14:35:37.253809 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:37.253768 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-57b87ccd85-98gfl" Apr 17 14:35:37.370455 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:35:37.370425 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81601fa4_37d7_4b8e_ba00_db28a6352193.slice/crio-28f7996a3cbc11bd95387f1887862235b77f97960a326dfd9aa25a7a421a56c7 WatchSource:0}: Error finding container 28f7996a3cbc11bd95387f1887862235b77f97960a326dfd9aa25a7a421a56c7: Status 404 returned error can't find the container with id 28f7996a3cbc11bd95387f1887862235b77f97960a326dfd9aa25a7a421a56c7 Apr 17 14:35:37.373365 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:37.373338 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9953d2d-674f-400b-86ce-077081c5b302" path="/var/lib/kubelet/pods/c9953d2d-674f-400b-86ce-077081c5b302/volumes" Apr 17 14:35:37.373617 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:37.373603 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-57b87ccd85-98gfl"] Apr 17 14:35:38.242620 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:38.242591 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-57b87ccd85-98gfl" event={"ID":"81601fa4-37d7-4b8e-ba00-db28a6352193","Type":"ContainerStarted","Data":"66a100b1445f9e415530264c82a9a984797f140f6e8e246dec04166686ad6520"} Apr 17 14:35:38.242620 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:38.242624 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-57b87ccd85-98gfl" event={"ID":"81601fa4-37d7-4b8e-ba00-db28a6352193","Type":"ContainerStarted","Data":"28f7996a3cbc11bd95387f1887862235b77f97960a326dfd9aa25a7a421a56c7"} Apr 17 14:35:38.242824 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:38.242649 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-57b87ccd85-98gfl" Apr 17 14:35:38.258772 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:38.258732 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-57b87ccd85-98gfl" podStartSLOduration=1.8969933860000001 podStartE2EDuration="2.258717467s" podCreationTimestamp="2026-04-17 14:35:36 +0000 UTC" firstStartedPulling="2026-04-17 14:35:37.371632606 +0000 UTC m=+892.507640132" lastFinishedPulling="2026-04-17 14:35:37.733356671 +0000 UTC m=+892.869364213" observedRunningTime="2026-04-17 14:35:38.258213166 +0000 UTC m=+893.394220713" watchObservedRunningTime="2026-04-17 14:35:38.258717467 +0000 UTC m=+893.394725060" Apr 17 14:35:49.251160 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:35:49.251130 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-57b87ccd85-98gfl" Apr 17 14:56:29.145829 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:29.145759 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-8557988b6b-rl4td_293f5ae8-6f46-4e42-89a4-a4acbdf1911f/authorino/0.log" Apr 17 14:56:33.052425 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:33.052400 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-57b87ccd85-98gfl_81601fa4-37d7-4b8e-ba00-db28a6352193/manager/0.log" Apr 17 14:56:33.277518 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:33.277487 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-58c8f88b6d-78vxn_f10dd800-e357-419d-9b22-147b74e0bc47/manager/0.log" Apr 17 14:56:33.627938 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:33.627908 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-q5sn2_a6755262-ff4c-4646-bd76-ff4c39ca25ca/postgres/0.log" Apr 17 14:56:34.346323 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:34.346268 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr_1eb3f50d-9f0b-481f-aaf2-d8f3758f9255/util/0.log" Apr 17 14:56:34.352338 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:34.352321 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr_1eb3f50d-9f0b-481f-aaf2-d8f3758f9255/pull/0.log" Apr 17 14:56:34.358146 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:34.358118 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr_1eb3f50d-9f0b-481f-aaf2-d8f3758f9255/extract/0.log" Apr 17 14:56:34.462099 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:34.462076 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t_8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9/extract/0.log" Apr 17 14:56:34.470367 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:34.470351 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t_8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9/util/0.log" Apr 17 14:56:34.476437 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:34.476419 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t_8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9/pull/0.log" Apr 17 14:56:34.581880 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:34.581854 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2_b5c19c02-bef5-4ade-b289-4baa372d80b6/pull/0.log" Apr 17 14:56:34.587059 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:34.587041 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2_b5c19c02-bef5-4ade-b289-4baa372d80b6/extract/0.log" Apr 17 14:56:34.592080 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:34.592059 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2_b5c19c02-bef5-4ade-b289-4baa372d80b6/util/0.log" Apr 17 14:56:34.695216 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:34.695161 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn_de2a72e9-9dc9-42da-92da-6ec4e6d31130/pull/0.log" Apr 17 14:56:34.700371 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:34.700355 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn_de2a72e9-9dc9-42da-92da-6ec4e6d31130/extract/0.log" Apr 17 14:56:34.704904 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:34.704888 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn_de2a72e9-9dc9-42da-92da-6ec4e6d31130/util/0.log" Apr 17 14:56:34.808888 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:34.808865 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-8557988b6b-rl4td_293f5ae8-6f46-4e42-89a4-a4acbdf1911f/authorino/0.log" Apr 17 14:56:34.916321 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:34.916301 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-srl6t_de3b7e26-c63e-4f0e-a493-a47aa30ae72d/manager/0.log" Apr 17 14:56:35.230589 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:35.230566 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-rcxsj_c7c6cb74-41c3-441f-9f18-37a646366315/registry-server/0.log" Apr 17 14:56:35.923375 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:35.923347 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw_01124663-3743-45ae-a6fa-311bde303bce/istio-proxy/0.log" Apr 17 14:56:36.355848 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:36.355824 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-qsghl_9dfa6f2e-76f9-430c-a6a4-da7ac010f854/istio-proxy/0.log" Apr 17 14:56:36.769525 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:36.769455 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-z7647_0df7fabf-4613-4e5a-a338-e16ed7e918fe/storage-initializer/0.log" Apr 17 14:56:36.775775 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:36.775756 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-z7647_0df7fabf-4613-4e5a-a338-e16ed7e918fe/main/0.log" Apr 17 14:56:37.230981 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:37.230944 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b_cdfaf532-9147-4c9e-a080-637764031980/storage-initializer/0.log" Apr 17 14:56:37.238755 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:37.238731 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-nq65b_cdfaf532-9147-4c9e-a080-637764031980/main/0.log" Apr 17 14:56:43.871055 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:43.871010 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-zjwv7_c7c4ea84-7290-4b62-b947-cbedb375e0f9/global-pull-secret-syncer/0.log" Apr 17 14:56:43.961986 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:43.961957 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-nz4xc_d030cbc2-cc1b-40c2-8101-cd9ed0460d1e/konnectivity-agent/0.log" Apr 17 14:56:44.013813 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:44.013767 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-119.ec2.internal_f246f22bb4cc0196c51370027afd49f2/haproxy/0.log" Apr 17 14:56:47.907781 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:47.907743 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr_1eb3f50d-9f0b-481f-aaf2-d8f3758f9255/extract/0.log" Apr 17 14:56:47.932226 
ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:47.932206 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr_1eb3f50d-9f0b-481f-aaf2-d8f3758f9255/util/0.log" Apr 17 14:56:47.959618 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:47.959595 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592gwzr_1eb3f50d-9f0b-481f-aaf2-d8f3758f9255/pull/0.log" Apr 17 14:56:47.991344 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:47.991322 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t_8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9/extract/0.log" Apr 17 14:56:48.018948 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:48.018928 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t_8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9/util/0.log" Apr 17 14:56:48.039597 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:48.039579 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0kcr7t_8301aa03-f89c-4ea3-9cac-3e6b4e7a3ba9/pull/0.log" Apr 17 14:56:48.065889 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:48.065871 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2_b5c19c02-bef5-4ade-b289-4baa372d80b6/extract/0.log" Apr 17 14:56:48.083243 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:48.083227 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2_b5c19c02-bef5-4ade-b289-4baa372d80b6/util/0.log" Apr 17 14:56:48.103254 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:48.103236 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73q7vl2_b5c19c02-bef5-4ade-b289-4baa372d80b6/pull/0.log" Apr 17 14:56:48.140165 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:48.140134 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn_de2a72e9-9dc9-42da-92da-6ec4e6d31130/extract/0.log" Apr 17 14:56:48.162024 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:48.161964 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn_de2a72e9-9dc9-42da-92da-6ec4e6d31130/util/0.log" Apr 17 14:56:48.183894 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:48.183874 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t2gtn_de2a72e9-9dc9-42da-92da-6ec4e6d31130/pull/0.log" Apr 17 14:56:48.367501 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:48.367473 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-8557988b6b-rl4td_293f5ae8-6f46-4e42-89a4-a4acbdf1911f/authorino/0.log" Apr 17 14:56:48.434382 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:48.434309 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-srl6t_de3b7e26-c63e-4f0e-a493-a47aa30ae72d/manager/0.log" Apr 17 14:56:48.581564 ip-10-0-132-119 
kubenswrapper[2577]: I0417 14:56:48.581531 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-rcxsj_c7c6cb74-41c3-441f-9f18-37a646366315/registry-server/0.log"
Apr 17 14:56:50.664528 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:50.664498 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ppx4t_7312e194-131a-4247-8dbf-6ca7a8f6fa14/node-exporter/0.log"
Apr 17 14:56:50.683860 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:50.683842 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ppx4t_7312e194-131a-4247-8dbf-6ca7a8f6fa14/kube-rbac-proxy/0.log"
Apr 17 14:56:50.700769 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:50.700744 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ppx4t_7312e194-131a-4247-8dbf-6ca7a8f6fa14/init-textfile/0.log"
Apr 17 14:56:52.316137 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:52.316107 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-ch6nv_4a326186-1f33-45e3-bf03-b51a1846d9da/networking-console-plugin/0.log"
Apr 17 14:56:52.555992 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:52.555958 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc"]
Apr 17 14:56:52.559403 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:52.559382 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc"
Apr 17 14:56:52.564208 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:52.564188 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-fr76n\"/\"kube-root-ca.crt\""
Apr 17 14:56:52.565689 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:52.565668 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-fr76n\"/\"openshift-service-ca.crt\""
Apr 17 14:56:52.565752 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:52.565699 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-fr76n\"/\"default-dockercfg-n2jw9\""
Apr 17 14:56:52.573504 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:52.573486 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc"]
Apr 17 14:56:52.713710 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:52.713679 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ac45cb33-97a6-4880-bf19-e1477b3aa3a5-lib-modules\") pod \"perf-node-gather-daemonset-c2dqc\" (UID: \"ac45cb33-97a6-4880-bf19-e1477b3aa3a5\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc"
Apr 17 14:56:52.713710 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:52.713711 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z78fj\" (UniqueName: \"kubernetes.io/projected/ac45cb33-97a6-4880-bf19-e1477b3aa3a5-kube-api-access-z78fj\") pod \"perf-node-gather-daemonset-c2dqc\" (UID: \"ac45cb33-97a6-4880-bf19-e1477b3aa3a5\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc"
Apr 17 14:56:52.713947 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:52.713738 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ac45cb33-97a6-4880-bf19-e1477b3aa3a5-sys\") pod \"perf-node-gather-daemonset-c2dqc\" (UID: \"ac45cb33-97a6-4880-bf19-e1477b3aa3a5\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc"
Apr 17 14:56:52.713947 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:52.713776 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ac45cb33-97a6-4880-bf19-e1477b3aa3a5-podres\") pod \"perf-node-gather-daemonset-c2dqc\" (UID: \"ac45cb33-97a6-4880-bf19-e1477b3aa3a5\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc"
Apr 17 14:56:52.713947 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:52.713797 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ac45cb33-97a6-4880-bf19-e1477b3aa3a5-proc\") pod \"perf-node-gather-daemonset-c2dqc\" (UID: \"ac45cb33-97a6-4880-bf19-e1477b3aa3a5\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc"
Apr 17 14:56:52.815048 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:52.815019 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ac45cb33-97a6-4880-bf19-e1477b3aa3a5-podres\") pod \"perf-node-gather-daemonset-c2dqc\" (UID: \"ac45cb33-97a6-4880-bf19-e1477b3aa3a5\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc"
Apr 17 14:56:52.815048 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:52.815051 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ac45cb33-97a6-4880-bf19-e1477b3aa3a5-proc\") pod \"perf-node-gather-daemonset-c2dqc\" (UID: \"ac45cb33-97a6-4880-bf19-e1477b3aa3a5\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc"
Apr 17 14:56:52.815226 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:52.815110 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ac45cb33-97a6-4880-bf19-e1477b3aa3a5-lib-modules\") pod \"perf-node-gather-daemonset-c2dqc\" (UID: \"ac45cb33-97a6-4880-bf19-e1477b3aa3a5\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc"
Apr 17 14:56:52.815226 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:52.815128 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z78fj\" (UniqueName: \"kubernetes.io/projected/ac45cb33-97a6-4880-bf19-e1477b3aa3a5-kube-api-access-z78fj\") pod \"perf-node-gather-daemonset-c2dqc\" (UID: \"ac45cb33-97a6-4880-bf19-e1477b3aa3a5\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc"
Apr 17 14:56:52.815226 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:52.815148 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ac45cb33-97a6-4880-bf19-e1477b3aa3a5-sys\") pod \"perf-node-gather-daemonset-c2dqc\" (UID: \"ac45cb33-97a6-4880-bf19-e1477b3aa3a5\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc"
Apr 17 14:56:52.815226 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:52.815201 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ac45cb33-97a6-4880-bf19-e1477b3aa3a5-podres\") pod \"perf-node-gather-daemonset-c2dqc\" (UID: \"ac45cb33-97a6-4880-bf19-e1477b3aa3a5\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc"
Apr 17 14:56:52.815226 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:52.815216 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ac45cb33-97a6-4880-bf19-e1477b3aa3a5-proc\") pod \"perf-node-gather-daemonset-c2dqc\" (UID: \"ac45cb33-97a6-4880-bf19-e1477b3aa3a5\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc"
Apr 17 14:56:52.815426 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:52.815233 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ac45cb33-97a6-4880-bf19-e1477b3aa3a5-sys\") pod \"perf-node-gather-daemonset-c2dqc\" (UID: \"ac45cb33-97a6-4880-bf19-e1477b3aa3a5\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc"
Apr 17 14:56:52.815426 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:52.815241 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ac45cb33-97a6-4880-bf19-e1477b3aa3a5-lib-modules\") pod \"perf-node-gather-daemonset-c2dqc\" (UID: \"ac45cb33-97a6-4880-bf19-e1477b3aa3a5\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc"
Apr 17 14:56:52.825193 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:52.825101 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z78fj\" (UniqueName: \"kubernetes.io/projected/ac45cb33-97a6-4880-bf19-e1477b3aa3a5-kube-api-access-z78fj\") pod \"perf-node-gather-daemonset-c2dqc\" (UID: \"ac45cb33-97a6-4880-bf19-e1477b3aa3a5\") " pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc"
Apr 17 14:56:52.869080 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:52.869043 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc"
Apr 17 14:56:53.017353 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:53.017326 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc"]
Apr 17 14:56:53.019074 ip-10-0-132-119 kubenswrapper[2577]: W0417 14:56:53.019048 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podac45cb33_97a6_4880_bf19_e1477b3aa3a5.slice/crio-67d3fe646ba3125b776ac44c81666c9171dc204fe01c863c979e258d98aca568 WatchSource:0}: Error finding container 67d3fe646ba3125b776ac44c81666c9171dc204fe01c863c979e258d98aca568: Status 404 returned error can't find the container with id 67d3fe646ba3125b776ac44c81666c9171dc204fe01c863c979e258d98aca568
Apr 17 14:56:53.020725 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:53.020702 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 14:56:53.668567 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:53.668526 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc" event={"ID":"ac45cb33-97a6-4880-bf19-e1477b3aa3a5","Type":"ContainerStarted","Data":"c3189988c167f8f860d3a8b3d586837fa5cfc0a85569df7b660d1ea93c601077"}
Apr 17 14:56:53.668567 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:53.668572 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc" event={"ID":"ac45cb33-97a6-4880-bf19-e1477b3aa3a5","Type":"ContainerStarted","Data":"67d3fe646ba3125b776ac44c81666c9171dc204fe01c863c979e258d98aca568"}
Apr 17 14:56:53.669060 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:53.668644 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc"
Apr 17 14:56:53.691020 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:53.690964 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc" podStartSLOduration=1.690951918 podStartE2EDuration="1.690951918s" podCreationTimestamp="2026-04-17 14:56:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:56:53.688506194 +0000 UTC m=+2168.824513741" watchObservedRunningTime="2026-04-17 14:56:53.690951918 +0000 UTC m=+2168.826959524"
Apr 17 14:56:54.812614 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:54.812584 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wf85s_059af305-b8f4-4631-aba4-e42fe75f0259/dns/0.log"
Apr 17 14:56:54.835676 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:54.835657 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wf85s_059af305-b8f4-4631-aba4-e42fe75f0259/kube-rbac-proxy/0.log"
Apr 17 14:56:54.887652 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:54.887632 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-gqkmm_565f7614-6003-428c-a0bd-ff0f395baa33/dns-node-resolver/0.log"
Apr 17 14:56:55.386340 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:55.386267 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-d8pjn_26b0d440-2cba-4402-b258-ba4b4ac2f7dd/node-ca/0.log"
Apr 17 14:56:56.168454 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:56.168425 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfd7nrw_01124663-3743-45ae-a6fa-311bde303bce/istio-proxy/0.log"
Apr 17 14:56:56.353439 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:56.353408 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-qsghl_9dfa6f2e-76f9-430c-a6a4-da7ac010f854/istio-proxy/0.log"
Apr 17 14:56:56.883463 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:56.883417 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-dxkfs_da67d3ea-dc78-4d70-8ed3-1bcb7edf4d9e/serve-healthcheck-canary/0.log"
Apr 17 14:56:57.332756 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:57.332729 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-56xpp_4b966141-1693-476e-a47e-c6acdd7edec5/kube-rbac-proxy/0.log"
Apr 17 14:56:57.351576 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:57.351553 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-56xpp_4b966141-1693-476e-a47e-c6acdd7edec5/exporter/0.log"
Apr 17 14:56:57.372069 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:57.372034 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-56xpp_4b966141-1693-476e-a47e-c6acdd7edec5/extractor/0.log"
Apr 17 14:56:59.379298 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:59.379261 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-57b87ccd85-98gfl_81601fa4-37d7-4b8e-ba00-db28a6352193/manager/0.log"
Apr 17 14:56:59.451094 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:59.451051 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-58c8f88b6d-78vxn_f10dd800-e357-419d-9b22-147b74e0bc47/manager/0.log"
Apr 17 14:56:59.509835 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:59.509810 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-q5sn2_a6755262-ff4c-4646-bd76-ff4c39ca25ca/postgres/0.log"
Apr 17 14:56:59.683947 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:56:59.683868 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-fr76n/perf-node-gather-daemonset-c2dqc"
Apr 17 14:57:00.558840 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:57:00.558793 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5b89f4cf56-s7c45_dbb47d3a-1bc6-4625-8bca-9418a2f18d10/manager/0.log"
Apr 17 14:57:06.382291 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:57:06.382248 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-25j4z_e40076ff-ba56-43e8-88a4-8c25998b6668/kube-multus-additional-cni-plugins/0.log"
Apr 17 14:57:06.417958 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:57:06.417928 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-25j4z_e40076ff-ba56-43e8-88a4-8c25998b6668/egress-router-binary-copy/0.log"
Apr 17 14:57:06.460190 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:57:06.460167 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-25j4z_e40076ff-ba56-43e8-88a4-8c25998b6668/cni-plugins/0.log"
Apr 17 14:57:06.487234 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:57:06.487211 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-25j4z_e40076ff-ba56-43e8-88a4-8c25998b6668/bond-cni-plugin/0.log"
Apr 17 14:57:06.511572 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:57:06.511539 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-25j4z_e40076ff-ba56-43e8-88a4-8c25998b6668/routeoverride-cni/0.log"
Apr 17 14:57:06.533724 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:57:06.533698 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-25j4z_e40076ff-ba56-43e8-88a4-8c25998b6668/whereabouts-cni-bincopy/0.log"
Apr 17 14:57:06.557652 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:57:06.557621 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-25j4z_e40076ff-ba56-43e8-88a4-8c25998b6668/whereabouts-cni/0.log"
Apr 17 14:57:06.920654 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:57:06.920627 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s6jws_60e3021c-3d8c-46bc-b4d0-5b2c13f6f6e3/kube-multus/0.log"
Apr 17 14:57:06.966411 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:57:06.966378 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9qc7k_3042fc33-2fc3-4d3d-a248-3855f7eb3a6a/network-metrics-daemon/0.log"
Apr 17 14:57:06.982527 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:57:06.982503 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9qc7k_3042fc33-2fc3-4d3d-a248-3855f7eb3a6a/kube-rbac-proxy/0.log"
Apr 17 14:57:08.460429 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:57:08.460391 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zns76_25426a81-851e-4273-9383-f90518fec0e7/ovn-controller/0.log"
Apr 17 14:57:08.483926 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:57:08.483895 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zns76_25426a81-851e-4273-9383-f90518fec0e7/ovn-acl-logging/0.log"
Apr 17 14:57:08.500485 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:57:08.500462 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zns76_25426a81-851e-4273-9383-f90518fec0e7/kube-rbac-proxy-node/0.log"
Apr 17 14:57:08.517241 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:57:08.517215 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zns76_25426a81-851e-4273-9383-f90518fec0e7/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 14:57:08.531987 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:57:08.531965 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zns76_25426a81-851e-4273-9383-f90518fec0e7/northd/0.log"
Apr 17 14:57:08.550532 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:57:08.550508 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zns76_25426a81-851e-4273-9383-f90518fec0e7/nbdb/0.log"
Apr 17 14:57:08.567435 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:57:08.567413 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zns76_25426a81-851e-4273-9383-f90518fec0e7/sbdb/0.log"
Apr 17 14:57:08.656930 ip-10-0-132-119 kubenswrapper[2577]: I0417 14:57:08.656904 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zns76_25426a81-851e-4273-9383-f90518fec0e7/ovnkube-controller/0.log"