May 11 20:50:21.755347 ip-10-0-128-58 systemd[1]: Starting Kubernetes Kubelet...
May 11 20:50:22.253561 ip-10-0-128-58 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 11 20:50:22.253561 ip-10-0-128-58 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
May 11 20:50:22.253561 ip-10-0-128-58 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 11 20:50:22.253561 ip-10-0-128-58 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
May 11 20:50:22.253561 ip-10-0-128-58 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 11 20:50:22.256221 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.256124 2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
May 11 20:50:22.261016 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.260995 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
May 11 20:50:22.261016 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261012 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
May 11 20:50:22.261016 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261016 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
May 11 20:50:22.261016 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261019 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
May 11 20:50:22.261016 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261023 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
May 11 20:50:22.261186 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261026 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
May 11 20:50:22.261186 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261030 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
May 11 20:50:22.261186 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261033 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
May 11 20:50:22.261186 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261035 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
May 11 20:50:22.261186 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261039 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
May 11 20:50:22.261186 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261042 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
May 11 20:50:22.261186 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261045 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
May 11 20:50:22.261186 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261049 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
May 11 20:50:22.261186 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261053 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
May 11 20:50:22.261186 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261056 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
May 11 20:50:22.261186 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261059 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
May 11 20:50:22.261186 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261062 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
May 11 20:50:22.261186 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261065 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
May 11 20:50:22.261186 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261067 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
May 11 20:50:22.261186 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261070 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
May 11 20:50:22.261186 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261073 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
May 11 20:50:22.261186 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261076 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
May 11 20:50:22.261186 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261078 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
May 11 20:50:22.261186 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261081 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
May 11 20:50:22.261714 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261083 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
May 11 20:50:22.261714 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261095 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
May 11 20:50:22.261714 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261098 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
May 11 20:50:22.261714 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261100 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
May 11 20:50:22.261714 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261103 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
May 11 20:50:22.261714 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261106 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
May 11 20:50:22.261714 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261108 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
May 11 20:50:22.261714 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261111 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
May 11 20:50:22.261714 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261114 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
May 11 20:50:22.261714 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261117 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
May 11 20:50:22.261714 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261121 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
May 11 20:50:22.261714 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261124 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
May 11 20:50:22.261714 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261127 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
May 11 20:50:22.261714 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261129 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
May 11 20:50:22.261714 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261132 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
May 11 20:50:22.261714 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261136 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
May 11 20:50:22.261714 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261138 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
May 11 20:50:22.261714 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261141 2567 feature_gate.go:328] unrecognized feature gate: Example
May 11 20:50:22.261714 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261144 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
May 11 20:50:22.261714 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261147 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
May 11 20:50:22.262240 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261149 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
May 11 20:50:22.262240 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261152 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
May 11 20:50:22.262240 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261154 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
May 11 20:50:22.262240 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261157 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
May 11 20:50:22.262240 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261159 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
May 11 20:50:22.262240 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261161 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
May 11 20:50:22.262240 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261164 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
May 11 20:50:22.262240 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261166 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
May 11 20:50:22.262240 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261169 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
May 11 20:50:22.262240 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261172 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
May 11 20:50:22.262240 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261174 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
May 11 20:50:22.262240 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261177 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
May 11 20:50:22.262240 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261179 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
May 11 20:50:22.262240 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261181 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
May 11 20:50:22.262240 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261184 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
May 11 20:50:22.262240 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261186 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
May 11 20:50:22.262240 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261189 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
May 11 20:50:22.262240 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261192 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
May 11 20:50:22.262240 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261194 2567 feature_gate.go:328] unrecognized feature gate: Example2
May 11 20:50:22.262240 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261197 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
May 11 20:50:22.262723 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261200 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
May 11 20:50:22.262723 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261202 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
May 11 20:50:22.262723 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261205 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
May 11 20:50:22.262723 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261207 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
May 11 20:50:22.262723 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261210 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
May 11 20:50:22.262723 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261212 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
May 11 20:50:22.262723 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261216 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
May 11 20:50:22.262723 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261219 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
May 11 20:50:22.262723 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261229 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
May 11 20:50:22.262723 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261231 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
May 11 20:50:22.262723 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261234 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
May 11 20:50:22.262723 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261236 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
May 11 20:50:22.262723 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261240 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
May 11 20:50:22.262723 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261243 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
May 11 20:50:22.262723 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261246 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
May 11 20:50:22.262723 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261248 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
May 11 20:50:22.262723 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261251 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
May 11 20:50:22.262723 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261254 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
May 11 20:50:22.262723 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261256 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
May 11 20:50:22.262723 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261259 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
May 11 20:50:22.263233 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261262 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
May 11 20:50:22.263233 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.261264 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
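The five deprecation warnings at the top of this log all point at the same remedy: carry the settings in the KubeletConfiguration file named by --config (on this node /etc/kubernetes/kubelet.conf, per the flag dump below). A minimal sketch of the equivalent stanzas, using this node's own flag values where a one-to-one field exists; the evictionHard threshold is illustrative, since --minimum-container-ttl-duration has no direct config field and the warning suggests eviction settings instead. On OpenShift nodes this file is generated by the machine-config operator, so treat this as a mapping, not a recommended hand edit:

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    containerRuntimeEndpoint: unix:///var/run/crio/crio.sock       # replaces --container-runtime-endpoint
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec   # replaces --volume-plugin-dir
    systemReserved:                                                # replaces --system-reserved
      cpu: 500m
      ephemeral-storage: 1Gi
      memory: 1Gi
    evictionHard:                        # suggested replacement for --minimum-container-ttl-duration
      memory.available: 100Mi            # illustrative threshold, not taken from this node
    # --pod-infra-container-image has no KubeletConfiguration field; the sandbox (pause)
    # image is configured in the container runtime instead (for CRI-O, pause_image in crio.conf).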
May 11 20:50:22.265152 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263150 2567 flags.go:64] FLAG: --address="0.0.0.0"
May 11 20:50:22.265152 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263160 2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
May 11 20:50:22.265152 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263166 2567 flags.go:64] FLAG: --anonymous-auth="true"
May 11 20:50:22.265152 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263172 2567 flags.go:64] FLAG: --application-metrics-count-limit="100"
May 11 20:50:22.265152 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263177 2567 flags.go:64] FLAG: --authentication-token-webhook="false"
May 11 20:50:22.265152 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263181 2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
May 11 20:50:22.265152 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263186 2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
May 11 20:50:22.265152 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263190 2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
May 11 20:50:22.265152 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263193 2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
May 11 20:50:22.265152 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263196 2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263200 2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263203 2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263206 2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263209 2567 flags.go:64] FLAG: --cgroup-root=""
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263212 2567 flags.go:64] FLAG: --cgroups-per-qos="true"
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263215 2567 flags.go:64] FLAG: --client-ca-file=""
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263218 2567 flags.go:64] FLAG: --cloud-config=""
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263220 2567 flags.go:64] FLAG: --cloud-provider="external"
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263223 2567 flags.go:64] FLAG: --cluster-dns="[]"
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263228 2567 flags.go:64] FLAG: --cluster-domain=""
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263231 2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263234 2567 flags.go:64] FLAG: --config-dir=""
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263237 2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263241 2567 flags.go:64] FLAG: --container-log-max-files="5"
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263245 2567 flags.go:64] FLAG: --container-log-max-size="10Mi"
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263248 2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263252 2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263255 2567 flags.go:64] FLAG: --containerd-namespace="k8s.io"
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263258 2567 flags.go:64] FLAG: --contention-profiling="false"
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263261 2567 flags.go:64] FLAG: --cpu-cfs-quota="true"
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263264 2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263268 2567 flags.go:64] FLAG: --cpu-manager-policy="none"
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263270 2567 flags.go:64] FLAG: --cpu-manager-policy-options=""
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263274 2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
May 11 20:50:22.265663 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263277 2567 flags.go:64] FLAG: --enable-controller-attach-detach="true"
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263281 2567 flags.go:64] FLAG: --enable-debugging-handlers="true"
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263284 2567 flags.go:64] FLAG: --enable-load-reader="false"
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263287 2567 flags.go:64] FLAG: --enable-server="true"
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263290 2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263294 2567 flags.go:64] FLAG: --event-burst="100"
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263297 2567 flags.go:64] FLAG: --event-qps="50"
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263300 2567 flags.go:64] FLAG: --event-storage-age-limit="default=0"
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263304 2567 flags.go:64] FLAG: --event-storage-event-limit="default=0"
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263306 2567 flags.go:64] FLAG: --eviction-hard=""
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263310 2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263313 2567 flags.go:64] FLAG: --eviction-minimum-reclaim=""
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263317 2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263320 2567 flags.go:64] FLAG: --eviction-soft=""
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263323 2567 flags.go:64] FLAG: --eviction-soft-grace-period=""
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263326 2567 flags.go:64] FLAG: --exit-on-lock-contention="false"
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263328 2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263331 2567 flags.go:64] FLAG: --experimental-mounter-path=""
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263334 2567 flags.go:64] FLAG: --fail-cgroupv1="false"
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263337 2567 flags.go:64] FLAG: --fail-swap-on="true"
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263340 2567 flags.go:64] FLAG: --feature-gates=""
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263344 2567 flags.go:64] FLAG: --file-check-frequency="20s"
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263347 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263350 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263353 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263357 2567 flags.go:64] FLAG: --healthz-port="10248"
May 11 20:50:22.266275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263360 2567 flags.go:64] FLAG: --help="false"
May 11 20:50:22.266927 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263363 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-128-58.ec2.internal"
May 11 20:50:22.266927 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263366 2567 flags.go:64] FLAG: --housekeeping-interval="10s"
May 11 20:50:22.266927 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263369 2567 flags.go:64] FLAG: --http-check-frequency="20s"
May 11 20:50:22.266927 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263371 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
May 11 20:50:22.266927 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263375 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
May 11 20:50:22.266927 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263378 2567 flags.go:64] FLAG: --image-gc-high-threshold="85"
May 11 20:50:22.266927 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263382 2567 flags.go:64] FLAG: --image-gc-low-threshold="80"
May 11 20:50:22.266927 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263385 2567 flags.go:64] FLAG: --image-service-endpoint=""
May 11 20:50:22.266927 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263388 2567 flags.go:64] FLAG: --kernel-memcg-notification="false"
May 11 20:50:22.266927 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263391 2567 flags.go:64] FLAG: --kube-api-burst="100"
May 11 20:50:22.266927 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263394 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
May 11 20:50:22.266927 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263397 2567 flags.go:64] FLAG: --kube-api-qps="50"
May 11 20:50:22.266927 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263400 2567 flags.go:64] FLAG: --kube-reserved=""
May 11 20:50:22.266927 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263403 2567 flags.go:64] FLAG: --kube-reserved-cgroup=""
May 11 20:50:22.266927 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263407 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
May 11 20:50:22.266927 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263410 2567 flags.go:64] FLAG: --kubelet-cgroups=""
May 11 20:50:22.266927 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263412 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
May 11 20:50:22.266927 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263415 2567 flags.go:64] FLAG: --lock-file=""
May 11 20:50:22.266927 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263419 2567 flags.go:64] FLAG: --log-cadvisor-usage="false"
May 11 20:50:22.266927 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263422 2567 flags.go:64] FLAG: --log-flush-frequency="5s"
May 11 20:50:22.266927 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263424 2567 flags.go:64] FLAG: --log-json-info-buffer-size="0"
May 11 20:50:22.266927 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263430 2567 flags.go:64] FLAG: --log-json-split-stream="false"
May 11 20:50:22.266927 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263432 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0"
May 11 20:50:22.267510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263435 2567 flags.go:64] FLAG: --log-text-split-stream="false"
May 11 20:50:22.267510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263438 2567 flags.go:64] FLAG: --logging-format="text"
May 11 20:50:22.267510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263441 2567 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
May 11 20:50:22.267510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263444 2567 flags.go:64] FLAG: --make-iptables-util-chains="true"
May 11 20:50:22.267510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263447 2567 flags.go:64] FLAG: --manifest-url=""
May 11 20:50:22.267510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263450 2567 flags.go:64] FLAG: --manifest-url-header=""
May 11 20:50:22.267510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263455 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s"
May 11 20:50:22.267510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263458 2567 flags.go:64] FLAG: --max-open-files="1000000"
May 11 20:50:22.267510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263462 2567 flags.go:64] FLAG: --max-pods="110"
May 11 20:50:22.267510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263465 2567 flags.go:64] FLAG: --maximum-dead-containers="-1"
May 11 20:50:22.267510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263468 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
May 11 20:50:22.267510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263471 2567 flags.go:64] FLAG: --memory-manager-policy="None"
May 11 20:50:22.267510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263474 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
May 11 20:50:22.267510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263477 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
May 11 20:50:22.267510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263480 2567 flags.go:64] FLAG: --node-ip="0.0.0.0"
May 11 20:50:22.267510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263483 2567 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
May 11 20:50:22.267510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263497 2567 flags.go:64] FLAG: --node-status-max-images="50"
May 11 20:50:22.267510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263500 2567 flags.go:64] FLAG: --node-status-update-frequency="10s"
May 11 20:50:22.267510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263503 2567 flags.go:64] FLAG: --oom-score-adj="-999"
May 11 20:50:22.267510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263507 2567 flags.go:64] FLAG: --pod-cidr=""
May 11 20:50:22.267510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263509 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3fc6c2cc09f271efd3cd2adb6c984c7cab48ea53dad824c952dee91afa8eaa20"
May 11 20:50:22.267510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263514 2567 flags.go:64] FLAG: --pod-manifest-path=""
May 11 20:50:22.267510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263517 2567 flags.go:64] FLAG: --pod-max-pids="-1"
May 11 20:50:22.267510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263520 2567 flags.go:64] FLAG: --pods-per-core="0"
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263523 2567 flags.go:64] FLAG: --port="10250"
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263527 2567 flags.go:64] FLAG: --protect-kernel-defaults="false"
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263530 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-08e7f2947607c9189"
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263533 2567 flags.go:64] FLAG: --qos-reserved=""
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263535 2567 flags.go:64] FLAG: --read-only-port="10255"
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263538 2567 flags.go:64] FLAG: --register-node="true"
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263541 2567 flags.go:64] FLAG: --register-schedulable="true"
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263544 2567 flags.go:64] FLAG: --register-with-taints=""
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263548 2567 flags.go:64] FLAG: --registry-burst="10"
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263550 2567 flags.go:64] FLAG: --registry-qps="5"
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263553 2567 flags.go:64] FLAG: --reserved-cpus=""
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263556 2567 flags.go:64] FLAG: --reserved-memory=""
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263560 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263563 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263566 2567 flags.go:64] FLAG: --rotate-certificates="false"
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263569 2567 flags.go:64] FLAG: --rotate-server-certificates="false"
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263571 2567 flags.go:64] FLAG: --runonce="false"
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263574 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263577 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263580 2567 flags.go:64] FLAG: --seccomp-default="false"
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263583 2567 flags.go:64] FLAG: --serialize-image-pulls="true"
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263586 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263589 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor"
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263594 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263597 2567 flags.go:64] FLAG: --storage-driver-password="root"
May 11 20:50:22.268146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263600 2567 flags.go:64] FLAG: --storage-driver-secure="false"
May 11 20:50:22.268772 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263603 2567 flags.go:64] FLAG: --storage-driver-table="stats"
May 11 20:50:22.268772 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263605 2567 flags.go:64] FLAG: --storage-driver-user="root"
May 11 20:50:22.268772 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263609 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
May 11 20:50:22.268772 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263612 2567 flags.go:64] FLAG: --sync-frequency="1m0s"
May 11 20:50:22.268772 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263615 2567 flags.go:64] FLAG: --system-cgroups=""
May 11 20:50:22.268772 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263617 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
May 11 20:50:22.268772 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263623 2567 flags.go:64] FLAG: --system-reserved-cgroup=""
May 11 20:50:22.268772 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263625 2567 flags.go:64] FLAG: --tls-cert-file=""
May 11 20:50:22.268772 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263628 2567 flags.go:64] FLAG: --tls-cipher-suites="[]"
May 11 20:50:22.268772 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263633 2567 flags.go:64] FLAG: --tls-min-version=""
May 11 20:50:22.268772 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263635 2567 flags.go:64] FLAG: --tls-private-key-file=""
May 11 20:50:22.268772 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263638 2567 flags.go:64] FLAG: --topology-manager-policy="none"
May 11 20:50:22.268772 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263641 2567 flags.go:64] FLAG: --topology-manager-policy-options=""
May 11 20:50:22.268772 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263644 2567 flags.go:64] FLAG: --topology-manager-scope="container"
May 11 20:50:22.268772 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263646 2567 flags.go:64] FLAG: --v="2"
May 11 20:50:22.268772 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263650 2567 flags.go:64] FLAG: --version="false"
May 11 20:50:22.268772 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263655 2567 flags.go:64] FLAG: --vmodule=""
May 11 20:50:22.268772 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263658 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
May 11 20:50:22.268772 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.263661 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
20:50:22.271365 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.263999 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification May 11 20:50:22.271365 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.264985 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} May 11 20:50:22.271365 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.271266 2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.10" May 11 20:50:22.271365 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.271282 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 11 20:50:22.271365 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.271328 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts May 11 20:50:22.271365 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.271332 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS May 11 20:50:22.271365 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.271336 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings May 11 20:50:22.271365 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.271339 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather May 11 20:50:22.271365 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.271342 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus May 11 20:50:22.271365 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.271345 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv May 11 20:50:22.271365 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.271347 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS May 11 20:50:22.271365 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.271350 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall May 11 20:50:22.271365 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.271353 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig May 11 20:50:22.271365 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.271355 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation May 11 20:50:22.271365 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.271358 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI May 11 20:50:22.271790 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.271360 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations May 11 20:50:22.271790 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.271363 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS May 11 20:50:22.271790 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.271366 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall May 11 20:50:22.271790 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.271369 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig May 11 20:50:22.271790 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:22.271372 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController 
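The long runs of "unrecognized feature gate" warnings above come from OpenShift passing its cluster-level gate names (AlibabaPlatform, NewOLM, and so on) into the embedded upstream feature-gate parser, which only knows the Kubernetes gates shown in the "feature gates: {map[...]}" summary. A minimal sketch of that behavior using k8s.io/component-base/featuregate; the registered gate and the unknown name are chosen for illustration, and upstream returns an error with exactly this wording, which the kubelet here evidently surfaces as W-level lines rather than failing:

```go
package main

import (
	"fmt"

	"k8s.io/component-base/featuregate"
)

func main() {
	// A fresh registry that only knows one (upstream) gate.
	fg := featuregate.NewFeatureGate()
	if err := fg.Add(map[featuregate.Feature]featuregate.FeatureSpec{
		"ImageVolume": {Default: true, PreRelease: featuregate.Beta},
	}); err != nil {
		panic(err)
	}

	// Known gate: accepted silently.
	if err := fg.SetFromMap(map[string]bool{"ImageVolume": true}); err != nil {
		fmt.Println("unexpected:", err)
	}

	// Unknown gate: rejected with the same wording seen in the log.
	if err := fg.SetFromMap(map[string]bool{"AlibabaPlatform": true}); err != nil {
		fmt.Println(err) // unrecognized feature gate: AlibabaPlatform
	}
}
```

The "Setting deprecated feature gate" and "Setting GA feature gate" warnings use the same mechanism: gates registered with PreRelease Deprecated or GA log a removal warning whenever they are explicitly set.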
May 11 20:50:22.275549 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.272653 2567 server.go:962] "Client rotation is on, will bootstrap in background"
May 11 20:50:22.276345 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.276331 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
May 11 20:50:22.277500 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.277489 2567 server.go:1019] "Starting client certificate rotation"
May 11 20:50:22.277603 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.277587 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
May 11 20:50:22.277637 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.277630 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
May 11 20:50:22.304372 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.304355 2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
May 11 20:50:22.310141 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.310123 2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
May 11 20:50:22.324876 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.324763 2567 log.go:25] "Validated CRI v1 runtime API"
May 11 20:50:22.331732 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.331718 2567 log.go:25] "Validated CRI v1 image API"
May 11 20:50:22.332009 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.331988 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
May 11 20:50:22.334323 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.334307 2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
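The cgroupDriver="systemd" line reflects the kubelet asking the container runtime for its cgroup driver over the CRI RuntimeConfig RPC instead of trusting its own configuration. A rough sketch of the same query against CRI-O, assuming a recent k8s.io/cri-api and grpc-go; the socket path is the usual CRI-O default, not something confirmed by this log:

```go
package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	// CRI-O's default socket; adjust for other runtimes.
	conn, err := grpc.NewClient("unix:///var/run/crio/crio.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	rt := runtimeapi.NewRuntimeServiceClient(conn)
	resp, err := rt.RuntimeConfig(ctx, &runtimeapi.RuntimeConfigRequest{})
	if err != nil {
		panic(err)
	}
	// SYSTEMD here corresponds to the cgroupDriver="systemd" log line.
	fmt.Println("cgroup driver from CRI:", resp.GetLinux().GetCgroupDriver())
}
```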
May 11 20:50:22.339168 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.339148 2567 fs.go:135] Filesystem UUIDs: map[53b66402-7330-4034-abda-08c53426d2c7:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 f5f81108-4e53-4af1-9932-a815a99031cc:/dev/nvme0n1p4]
May 11 20:50:22.339221 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.339168 2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
May 11 20:50:22.345196 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.345098 2567 manager.go:217] Machine: {Timestamp:2026-05-11 20:50:22.342739269 +0000 UTC m=+0.452519706 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:2500004 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec202fc374265032cdbbb389df346c9e SystemUUID:ec202fc3-7426-5032-cdbb-b389df346c9e BootID:d0350ca1-6aac-4b1e-8464-4208362791a2 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:87:00:6d:5a:0d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:87:00:6d:5a:0d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:36:03:63:ef:fb:5d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
May 11 20:50:22.345196 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.345195 2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
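MemoryCapacity:33164492800 is roughly 30.9 GiB, cAdvisor's view of total RAM after kernel reservations on this 8-vCPU instance. A loose, Linux-only analogue of that probe via sysinfo(2):

```go
package main

import (
	"fmt"
	"syscall"
)

func main() {
	// Total RAM in bytes, roughly what cAdvisor reports as MemoryCapacity.
	var si syscall.Sysinfo_t
	if err := syscall.Sysinfo(&si); err != nil {
		panic(err)
	}
	total := si.Totalram * uint64(si.Unit)
	fmt.Printf("MemoryCapacity: %d bytes (%.1f GiB)\n",
		total, float64(total)/(1<<30))
}
```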
May 11 20:50:22.345309 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.345271 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.112.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260504-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
May 11 20:50:22.346412 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.346391 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
May 11 20:50:22.346537 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.346413 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-58.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
May 11 20:50:22.346585 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.346546 2567 topology_manager.go:138] "Creating topology manager with none policy"
May 11 20:50:22.346585 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.346555 2567 container_manager_linux.go:306] "Creating device plugin manager"
May 11 20:50:22.346585 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.346568 2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
May 11 20:50:22.347420 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.347409 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
May 11 20:50:22.348719 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.348709 2567 state_mem.go:36] "Initialized new in-memory state store"
May 11 20:50:22.348829 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.348820 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet"
May 11 20:50:22.350303 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.350288 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4bvs9"
May 11 20:50:22.351300 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.351289 2567 kubelet.go:491] "Attempting to sync node with API server"
May 11 20:50:22.351334 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.351305 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
May 11 20:50:22.351334 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.351316 2567 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
May 11 20:50:22.351334 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.351324 2567 kubelet.go:397] "Adding apiserver pod source"
May 11 20:50:22.351334 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.351332 2567 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
May 11 20:50:22.353572 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.353559 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
May 11 20:50:22.353618 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.353579 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
May 11 20:50:22.356447 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.356432 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.11-2.rhaos4.20.gitb2a8320.el9" apiVersion="v1"
May 11 20:50:22.357748 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.357736 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
May 11 20:50:22.359584 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.359566 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
May 11 20:50:22.359635 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.359597 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
May 11 20:50:22.359635 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.359609 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
May 11 20:50:22.359635 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.359620 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
May 11 20:50:22.359635 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.359632 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
May 11 20:50:22.359745 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.359643 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
May 11 20:50:22.359745 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.359658 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
May 11 20:50:22.359745 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.359669 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
May 11 20:50:22.359745 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.359683 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
May 11 20:50:22.359745 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.359707 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
May 11 20:50:22.359745 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.359742 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
May 11 20:50:22.359907 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.359758 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
May 11 20:50:22.360258 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.360243 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4bvs9"
May 11 20:50:22.360664 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.360649 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
May 11 20:50:22.360664 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.360660 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
May 11 20:50:22.363493 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:22.363470 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
May 11 20:50:22.363559 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:22.363470 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-58.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
May 11 20:50:22.364110 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.364098 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled"
May 11 20:50:22.364163 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.364133 2567 server.go:1295] "Started kubelet"
May 11 20:50:22.364285 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.364251 2567 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
May 11 20:50:22.364321 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.364299 2567 server_v1.go:47] "podresources" method="list" useActivePods=true
May 11 20:50:22.364648 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.364371 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
May 11 20:50:22.364861 ip-10-0-128-58 systemd[1]: Started Kubernetes Kubelet.
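At this point the kubelet is still on its bootstrap credentials, so the system:anonymous "Failed to watch" errors are expected until the approved CSR (csr-4bvs9) is issued and the rotated client certificate is written out. A small client-go sketch for inspecting CSRs from outside the node; the kubeconfig path is illustrative, not taken from this log:

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Illustrative path; use whatever admin kubeconfig you have.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/admin.kubeconfig")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Lists CSRs such as csr-4bvs9 above, with their signers.
	csrs, err := cs.CertificatesV1().CertificateSigningRequests().List(
		context.Background(), metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	for _, csr := range csrs.Items {
		fmt.Println(csr.Name, csr.Spec.SignerName)
	}
}
```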
May 11 20:50:22.365507 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.365488 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
May 11 20:50:22.368986 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.368970 2567 server.go:317] "Adding debug handlers to kubelet server"
May 11 20:50:22.371043 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.371027 2567 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-58.ec2.internal" not found
May 11 20:50:22.373564 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.373547 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
May 11 20:50:22.373564 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.373560 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
May 11 20:50:22.374109 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.374089 2567 volume_manager.go:295] "The desired_state_of_world populator starts"
May 11 20:50:22.374197 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.374112 2567 volume_manager.go:297] "Starting Kubelet Volume Manager"
May 11 20:50:22.374197 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.374143 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
May 11 20:50:22.374309 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.374242 2567 reconstruct.go:97] "Volume reconstruction finished"
May 11 20:50:22.374309 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.374254 2567 reconciler.go:26] "Reconciler: start to sync state"
May 11 20:50:22.374309 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:22.374272 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-58.ec2.internal\" not found"
May 11 20:50:22.375000 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.374977 2567 factory.go:55] Registering systemd factory
May 11 20:50:22.375000 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.375002 2567 factory.go:223] Registration of the systemd container factory successfully
May 11 20:50:22.375368 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.375197 2567 factory.go:153] Registering CRI-O factory
May 11 20:50:22.375368 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.375209 2567 factory.go:223] Registration of the crio container factory successfully
May 11 20:50:22.375368 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.375269 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
May 11 20:50:22.375368 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.375296 2567 factory.go:103] Registering Raw factory
May 11 20:50:22.375368 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.375326 2567 manager.go:1196] Started watching for new ooms in manager
May 11 20:50:22.375830 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.375716 2567 manager.go:319] Starting recovery of all containers
May 11 20:50:22.376045 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:22.376020 2567 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
May 11 20:50:22.376140 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.376094 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
May 11 20:50:22.379190 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:22.379164 2567 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-128-58.ec2.internal\" not found" node="ip-10-0-128-58.ec2.internal"
May 11 20:50:22.385225 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.385213 2567 manager.go:324] Recovery completed
May 11 20:50:22.388812 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.388799 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
May 11 20:50:22.389116 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.389045 2567 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-58.ec2.internal" not found
May 11 20:50:22.391186 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.391172 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-58.ec2.internal" event="NodeHasSufficientMemory"
May 11 20:50:22.391243 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.391197 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-58.ec2.internal" event="NodeHasNoDiskPressure"
May 11 20:50:22.391243 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.391209 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-58.ec2.internal" event="NodeHasSufficientPID"
May 11 20:50:22.391626 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.391613 2567 cpu_manager.go:222] "Starting CPU manager" policy="none"
May 11 20:50:22.391682 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.391627 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
May 11 20:50:22.391682 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.391645 2567 state_mem.go:36] "Initialized new in-memory state store"
May 11 20:50:22.394531 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.394518 2567 policy_none.go:49] "None policy: Start"
May 11 20:50:22.394574 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.394536 2567 memory_manager.go:186] "Starting memorymanager" policy="None"
May 11 20:50:22.394574 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.394545 2567 state_mem.go:35] "Initializing new in-memory state store"
May 11 20:50:22.427507 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.427495 2567 manager.go:341] "Starting Device Plugin manager"
May 11 20:50:22.443970 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:22.427538 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
May 11 20:50:22.443970 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.427548 2567 server.go:85] "Starting device plugin registration server"
May 11 20:50:22.443970 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.427786 2567 eviction_manager.go:189] "Eviction manager: starting control loop"
May 11 20:50:22.443970 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.427796 2567 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
May 11 20:50:22.443970 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.427985 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
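The device plugin manager has now opened the registration socket logged above (version="v1beta1", /var/lib/kubelet/device-plugins/kubelet.sock). A device plugin announces itself by calling Register on that socket; a minimal sketch, where the plugin endpoint and resource name are hypothetical and a real plugin would first serve the DevicePlugin gRPC service on its own socket:

```go
package main

import (
	"context"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	pluginapi "k8s.io/kubelet/pkg/apis/deviceplugin/v1beta1"
)

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	// The registration socket the kubelet just created (see the log above).
	conn, err := grpc.NewClient("unix:///var/lib/kubelet/device-plugins/kubelet.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	// Endpoint and resource name are placeholders for illustration only.
	_, err = pluginapi.NewRegistrationClient(conn).Register(ctx, &pluginapi.RegisterRequest{
		Version:      pluginapi.Version, // "v1beta1", matching the log line
		Endpoint:     "example.sock",    // socket under device-plugins/ served by the plugin
		ResourceName: "vendor.example/widget",
	})
	if err != nil {
		panic(err)
	}
}
```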
path="/var/lib/kubelet/plugins_registry" May 11 20:50:22.443970 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.428411 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" May 11 20:50:22.443970 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.428421 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 11 20:50:22.443970 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:22.430313 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" May 11 20:50:22.443970 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:22.430358 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-58.ec2.internal\" not found" May 11 20:50:22.445355 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.445340 2567 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-58.ec2.internal" not found May 11 20:50:22.473220 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.473195 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" May 11 20:50:22.474347 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.474325 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" May 11 20:50:22.474347 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.474346 2567 status_manager.go:230] "Starting to sync pod status with apiserver" May 11 20:50:22.474496 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.474362 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." May 11 20:50:22.474496 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.474369 2567 kubelet.go:2451] "Starting kubelet main sync loop" May 11 20:50:22.474496 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:22.474398 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" May 11 20:50:22.476735 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.476711 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" May 11 20:50:22.528460 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.528400 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" May 11 20:50:22.529344 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.529330 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-58.ec2.internal" event="NodeHasSufficientMemory" May 11 20:50:22.529411 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.529358 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-58.ec2.internal" event="NodeHasNoDiskPressure" May 11 20:50:22.529411 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.529375 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-58.ec2.internal" event="NodeHasSufficientPID" May 11 20:50:22.529411 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.529401 2567 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-58.ec2.internal" May 11 20:50:22.538795 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.538782 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-58.ec2.internal" May 11 20:50:22.538841 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:22.538801 2567 kubelet_node_status.go:597] "Error updating node 
status, will retry" err="error getting node \"ip-10-0-128-58.ec2.internal\": node \"ip-10-0-128-58.ec2.internal\" not found" May 11 20:50:22.551582 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:22.551562 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-58.ec2.internal\" not found" May 11 20:50:22.574515 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.574486 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-58.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-58.ec2.internal"] May 11 20:50:22.574572 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.574554 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" May 11 20:50:22.575369 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.575354 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-58.ec2.internal" event="NodeHasSufficientMemory" May 11 20:50:22.575433 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.575382 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-58.ec2.internal" event="NodeHasNoDiskPressure" May 11 20:50:22.575433 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.575393 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-58.ec2.internal" event="NodeHasSufficientPID" May 11 20:50:22.576498 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.576486 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" May 11 20:50:22.576653 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.576640 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-58.ec2.internal" May 11 20:50:22.576689 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.576669 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" May 11 20:50:22.577204 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.577189 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-58.ec2.internal" event="NodeHasSufficientMemory" May 11 20:50:22.577269 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.577212 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-58.ec2.internal" event="NodeHasNoDiskPressure" May 11 20:50:22.577269 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.577215 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-58.ec2.internal" event="NodeHasSufficientMemory" May 11 20:50:22.577269 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.577222 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-58.ec2.internal" event="NodeHasSufficientPID" May 11 20:50:22.577269 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.577234 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-58.ec2.internal" event="NodeHasNoDiskPressure" May 11 20:50:22.577269 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.577243 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-58.ec2.internal" event="NodeHasSufficientPID" May 11 20:50:22.578284 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.578271 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-58.ec2.internal" May 11 20:50:22.578326 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.578294 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" May 11 20:50:22.578917 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.578898 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-58.ec2.internal" event="NodeHasSufficientMemory" May 11 20:50:22.579007 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.578930 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-58.ec2.internal" event="NodeHasNoDiskPressure" May 11 20:50:22.579007 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.578941 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-58.ec2.internal" event="NodeHasSufficientPID" May 11 20:50:22.606443 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:22.606425 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-58.ec2.internal\" not found" node="ip-10-0-128-58.ec2.internal" May 11 20:50:22.610722 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:22.610702 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-58.ec2.internal\" not found" node="ip-10-0-128-58.ec2.internal" May 11 20:50:22.651861 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:22.651837 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-58.ec2.internal\" not found" May 11 20:50:22.752292 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:22.752267 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-58.ec2.internal\" not found" May 11 20:50:22.775539 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.775519 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/617b24297d13d590f6c983d42c59bc7e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-58.ec2.internal\" (UID: \"617b24297d13d590f6c983d42c59bc7e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-58.ec2.internal" May 11 20:50:22.775617 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.775555 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2c5eff5275aeb9307128a4ad3171d6f0-config\") pod \"kube-apiserver-proxy-ip-10-0-128-58.ec2.internal\" (UID: \"2c5eff5275aeb9307128a4ad3171d6f0\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-58.ec2.internal" May 11 20:50:22.775617 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.775584 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/617b24297d13d590f6c983d42c59bc7e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-58.ec2.internal\" (UID: \"617b24297d13d590f6c983d42c59bc7e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-58.ec2.internal" May 11 20:50:22.852696 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:22.852669 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-58.ec2.internal\" not found" May 11 20:50:22.876174 ip-10-0-128-58 kubenswrapper[2567]: 
May 11 20:50:22.876226 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.876182 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/617b24297d13d590f6c983d42c59bc7e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-58.ec2.internal\" (UID: \"617b24297d13d590f6c983d42c59bc7e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-58.ec2.internal"
May 11 20:50:22.876226 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.876207 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/617b24297d13d590f6c983d42c59bc7e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-58.ec2.internal\" (UID: \"617b24297d13d590f6c983d42c59bc7e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-58.ec2.internal"
May 11 20:50:22.876286 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.876247 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2c5eff5275aeb9307128a4ad3171d6f0-config\") pod \"kube-apiserver-proxy-ip-10-0-128-58.ec2.internal\" (UID: \"2c5eff5275aeb9307128a4ad3171d6f0\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-58.ec2.internal"
May 11 20:50:22.876319 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.876287 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/617b24297d13d590f6c983d42c59bc7e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-58.ec2.internal\" (UID: \"617b24297d13d590f6c983d42c59bc7e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-58.ec2.internal"
May 11 20:50:22.876319 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.876313 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/617b24297d13d590f6c983d42c59bc7e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-58.ec2.internal\" (UID: \"617b24297d13d590f6c983d42c59bc7e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-58.ec2.internal"
May 11 20:50:22.910281 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.910263 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-58.ec2.internal"
May 11 20:50:22.912984 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:22.912845 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-58.ec2.internal"
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-58.ec2.internal" May 11 20:50:22.953719 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:22.953698 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-58.ec2.internal\" not found" May 11 20:50:23.054218 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:23.054195 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-58.ec2.internal\" not found" May 11 20:50:23.154693 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:23.154628 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-58.ec2.internal\" not found" May 11 20:50:23.255269 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:23.255236 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-58.ec2.internal\" not found" May 11 20:50:23.277550 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:23.277525 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" May 11 20:50:23.277702 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:23.277667 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" May 11 20:50:23.277702 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:23.277678 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" May 11 20:50:23.356193 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:23.356162 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-58.ec2.internal\" not found" May 11 20:50:23.362279 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:23.362248 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-05-10 20:45:22 +0000 UTC" deadline="2028-01-02 14:49:26.602220016 +0000 UTC" May 11 20:50:23.362279 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:23.362277 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14417h59m3.239946219s" May 11 20:50:23.374423 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:23.374402 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" May 11 20:50:23.394263 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:23.394244 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" May 11 20:50:23.400829 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:23.400795 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod617b24297d13d590f6c983d42c59bc7e.slice/crio-c51ca85208f1d4ed4b4ac8b8a8b39ab9d8b4e262803cd0e4a30b884d9a59c1a8 WatchSource:0}: Error finding container c51ca85208f1d4ed4b4ac8b8a8b39ab9d8b4e262803cd0e4a30b884d9a59c1a8: Status 404 returned error can't find the container with id 
May 11 20:50:23.401081 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:23.401056 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c5eff5275aeb9307128a4ad3171d6f0.slice/crio-4fba27b2c179384ab21dfc2c8ae1998234d6329c0163d455d13ff41732bb4853 WatchSource:0}: Error finding container 4fba27b2c179384ab21dfc2c8ae1998234d6329c0163d455d13ff41732bb4853: Status 404 returned error can't find the container with id 4fba27b2c179384ab21dfc2c8ae1998234d6329c0163d455d13ff41732bb4853
May 11 20:50:23.404022 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:23.404008 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
May 11 20:50:23.416844 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:23.416815 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-cc55p"
May 11 20:50:23.423726 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:23.423707 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-cc55p"
May 11 20:50:23.456259 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:23.456231 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-58.ec2.internal\" not found"
May 11 20:50:23.476990 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:23.476934 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-58.ec2.internal" event={"ID":"2c5eff5275aeb9307128a4ad3171d6f0","Type":"ContainerStarted","Data":"4fba27b2c179384ab21dfc2c8ae1998234d6329c0163d455d13ff41732bb4853"}
May 11 20:50:23.477825 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:23.477801 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-58.ec2.internal" event={"ID":"617b24297d13d590f6c983d42c59bc7e","Type":"ContainerStarted","Data":"c51ca85208f1d4ed4b4ac8b8a8b39ab9d8b4e262803cd0e4a30b884d9a59c1a8"}
May 11 20:50:23.557139 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:23.557117 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-58.ec2.internal\" not found"
May 11 20:50:23.657624 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:23.657560 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-58.ec2.internal\" not found"
May 11 20:50:23.743430 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:23.743405 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
May 11 20:50:23.774670 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:23.774647 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-58.ec2.internal"
May 11 20:50:23.785408 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:23.785387 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
May 11 20:50:23.787651 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:23.787627 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-58.ec2.internal"
May 11 20:50:23.795877 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:23.795858 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" May 11 20:50:23.805328 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:23.805311 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" May 11 20:50:24.313083 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.313058 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" May 11 20:50:24.352017 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.351995 2567 apiserver.go:52] "Watching apiserver" May 11 20:50:24.359631 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.359609 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" May 11 20:50:24.361861 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.361835 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88","openshift-image-registry/node-ca-cv7k6","openshift-multus/multus-5kw85","openshift-network-diagnostics/network-check-target-m9tgf","openshift-ovn-kubernetes/ovnkube-node-svtmh","kube-system/konnectivity-agent-k98kb","openshift-cluster-node-tuning-operator/tuned-x7k87","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-58.ec2.internal","openshift-multus/multus-additional-cni-plugins-l8r5d","openshift-multus/network-metrics-daemon-v9s7z","openshift-network-operator/iptables-alerter-4qb4b","kube-system/kube-apiserver-proxy-ip-10-0-128-58.ec2.internal"] May 11 20:50:24.363718 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.363697 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" May 11 20:50:24.365199 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.365154 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cv7k6" May 11 20:50:24.366574 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.366206 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" May 11 20:50:24.366574 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.366418 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" May 11 20:50:24.366574 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.366426 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qjbs4\"" May 11 20:50:24.366574 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.366428 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" May 11 20:50:24.368447 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.367329 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5kw85" May 11 20:50:24.368447 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.367472 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" May 11 20:50:24.369512 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.369322 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:24.369512 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:24.369390 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9tgf" podUID="3a2d13ea-d235-437e-9668-e21aca93682a" May 11 20:50:24.370139 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.370121 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" May 11 20:50:24.370228 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.370163 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" May 11 20:50:24.370354 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.370321 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wr8jp\"" May 11 20:50:24.370404 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.370384 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" May 11 20:50:24.371007 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.370989 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" May 11 20:50:24.371214 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.371200 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" May 11 20:50:24.372149 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.372130 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-pvrw2\"" May 11 20:50:24.372502 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.372347 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" May 11 20:50:24.374547 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.374526 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.374624 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.374566 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-k98kb" May 11 20:50:24.375726 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.375707 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.377448 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.377427 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.377530 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.377463 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" May 11 20:50:24.378069 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.378051 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-9lrst\"" May 11 20:50:24.378157 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.378099 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" May 11 20:50:24.378819 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.378554 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" May 11 20:50:24.378819 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.378585 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2dzzq\"" May 11 20:50:24.378819 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.378623 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" May 11 20:50:24.378819 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.378629 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" May 11 20:50:24.379114 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.379062 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:24.379171 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:24.379131 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v9s7z" podUID="3800edc1-af00-418d-a5b8-d832cbe20fbf" May 11 20:50:24.379235 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.379167 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" May 11 20:50:24.379820 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.379336 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" May 11 20:50:24.379820 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.379365 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" May 11 20:50:24.379820 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.379442 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" May 11 20:50:24.379820 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.379580 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" May 11 20:50:24.379820 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.379641 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" May 11 20:50:24.379820 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.379654 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-2hqlf\"" May 11 20:50:24.379820 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.379580 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qwj2j\"" May 11 20:50:24.380138 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.379836 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" May 11 20:50:24.381018 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.380954 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4qb4b" May 11 20:50:24.383406 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.383384 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-os-release\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.383510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.383424 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3d7be993-4ba8-4b01-8fd3-d04162534cc5-ovn-node-metrics-cert\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.383510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.383448 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sktdj\" (UniqueName: \"kubernetes.io/projected/3d7be993-4ba8-4b01-8fd3-d04162534cc5-kube-api-access-sktdj\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.383510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.383471 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/421ac0f8-3310-4cc2-a9bf-159e3293219a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-khp88\" (UID: \"421ac0f8-3310-4cc2-a9bf-159e3293219a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" May 11 20:50:24.383646 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.383551 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b239754b-8d38-41b0-9290-744afb39226a-cni-binary-copy\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.383646 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.383587 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-host-var-lib-kubelet\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.383646 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.383621 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-hostroot\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.383803 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.383652 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-systemd-units\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.383803 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.383680 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-var-lib-openvswitch\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.383803 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.383707 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-host-run-ovn-kubernetes\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.383803 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.383722 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" May 11 20:50:24.383803 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.383732 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-host-cni-netd\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.383803 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.383752 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" May 11 20:50:24.383803 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.383774 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" May 11 20:50:24.383803 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.383772 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/421ac0f8-3310-4cc2-a9bf-159e3293219a-registration-dir\") pod \"aws-ebs-csi-driver-node-khp88\" (UID: \"421ac0f8-3310-4cc2-a9bf-159e3293219a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" May 11 20:50:24.384183 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.383826 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/421ac0f8-3310-4cc2-a9bf-159e3293219a-device-dir\") pod \"aws-ebs-csi-driver-node-khp88\" (UID: \"421ac0f8-3310-4cc2-a9bf-159e3293219a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" May 11 20:50:24.384183 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.383802 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kwdtn\"" May 11 20:50:24.384183 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.383854 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/421ac0f8-3310-4cc2-a9bf-159e3293219a-sys-fs\") pod \"aws-ebs-csi-driver-node-khp88\" (UID: \"421ac0f8-3310-4cc2-a9bf-159e3293219a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" May 11 20:50:24.384183 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.383879 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-mwh9l\" (UniqueName: \"kubernetes.io/projected/421ac0f8-3310-4cc2-a9bf-159e3293219a-kube-api-access-mwh9l\") pod \"aws-ebs-csi-driver-node-khp88\" (UID: \"421ac0f8-3310-4cc2-a9bf-159e3293219a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" May 11 20:50:24.384183 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.383905 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-host-run-k8s-cni-cncf-io\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.384183 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.383929 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-host-kubelet\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.384183 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.383951 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-run-ovn\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.384183 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.383990 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-host-cni-bin\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.384183 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384013 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/421ac0f8-3310-4cc2-a9bf-159e3293219a-etc-selinux\") pod \"aws-ebs-csi-driver-node-khp88\" (UID: \"421ac0f8-3310-4cc2-a9bf-159e3293219a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" May 11 20:50:24.384183 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384032 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dea6f3bb-bb99-4e25-8cf5-1aca4ea1ed96-host\") pod \"node-ca-cv7k6\" (UID: \"dea6f3bb-bb99-4e25-8cf5-1aca4ea1ed96\") " pod="openshift-image-registry/node-ca-cv7k6" May 11 20:50:24.384183 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384046 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-host-var-lib-cni-multus\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.384183 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384065 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b239754b-8d38-41b0-9290-744afb39226a-multus-daemon-config\") pod \"multus-5kw85\" (UID: 
\"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.384183 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384087 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-etc-openvswitch\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.384183 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384110 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3d7be993-4ba8-4b01-8fd3-d04162534cc5-ovnkube-config\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.384183 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384130 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-host-var-lib-cni-bin\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.384183 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384151 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-multus-conf-dir\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.384183 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384185 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzwmm\" (UniqueName: \"kubernetes.io/projected/b239754b-8d38-41b0-9290-744afb39226a-kube-api-access-fzwmm\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.384872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384210 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-host-slash\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.384872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384235 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.384872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384276 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3d7be993-4ba8-4b01-8fd3-d04162534cc5-ovnkube-script-lib\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.384872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384302 
May 11 20:50:24.384872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384326 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-cnibin\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85"
May 11 20:50:24.384872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384347 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-host-run-multus-certs\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85"
May 11 20:50:24.384872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384371 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-host-run-netns\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh"
May 11 20:50:24.384872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384393 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3d7be993-4ba8-4b01-8fd3-d04162534cc5-env-overrides\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh"
May 11 20:50:24.384872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384416 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-system-cni-dir\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85"
May 11 20:50:24.384872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384443 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-host-run-netns\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85"
May 11 20:50:24.384872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384464 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-etc-kubernetes\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85"
May 11 20:50:24.384872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384511 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbq6v\" (UniqueName: \"kubernetes.io/projected/3a2d13ea-d235-437e-9668-e21aca93682a-kube-api-access-cbq6v\") pod \"network-check-target-m9tgf\" (UID: \"3a2d13ea-d235-437e-9668-e21aca93682a\") " pod="openshift-network-diagnostics/network-check-target-m9tgf"
" pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:24.384872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384535 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-node-log\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.384872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384596 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dc7028e1-034b-4393-88d2-1dbb1e82cfe7-agent-certs\") pod \"konnectivity-agent-k98kb\" (UID: \"dc7028e1-034b-4393-88d2-1dbb1e82cfe7\") " pod="kube-system/konnectivity-agent-k98kb" May 11 20:50:24.384872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384646 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-multus-cni-dir\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.384872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384673 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-multus-socket-dir-parent\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.384872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384707 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-run-systemd\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.385699 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384741 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-run-openvswitch\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.385699 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384777 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-log-socket\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.385699 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384807 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dc7028e1-034b-4393-88d2-1dbb1e82cfe7-konnectivity-ca\") pod \"konnectivity-agent-k98kb\" (UID: \"dc7028e1-034b-4393-88d2-1dbb1e82cfe7\") " pod="kube-system/konnectivity-agent-k98kb" May 11 20:50:24.385699 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384831 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/421ac0f8-3310-4cc2-a9bf-159e3293219a-socket-dir\") pod \"aws-ebs-csi-driver-node-khp88\" (UID: \"421ac0f8-3310-4cc2-a9bf-159e3293219a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" May 11 20:50:24.385699 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.384857 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p4t8\" (UniqueName: \"kubernetes.io/projected/dea6f3bb-bb99-4e25-8cf5-1aca4ea1ed96-kube-api-access-9p4t8\") pod \"node-ca-cv7k6\" (UID: \"dea6f3bb-bb99-4e25-8cf5-1aca4ea1ed96\") " pod="openshift-image-registry/node-ca-cv7k6" May 11 20:50:24.424320 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.424295 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-05-10 20:45:23 +0000 UTC" deadline="2027-10-31 01:51:26.525914636 +0000 UTC" May 11 20:50:24.424320 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.424319 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12893h1m2.101597745s" May 11 20:50:24.475786 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.475759 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" May 11 20:50:24.485355 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485322 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-host-var-lib-kubelet\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.485462 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485357 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-hostroot\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.485462 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485381 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-var-lib-openvswitch\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.485462 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485406 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-host\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.485462 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485429 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3eb3a067-139c-450e-b053-3f1a84abc363-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l8r5d\" (UID: \"3eb3a067-139c-450e-b053-3f1a84abc363\") " pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.485462 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485435 2567 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-hostroot\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.485462 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485456 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/421ac0f8-3310-4cc2-a9bf-159e3293219a-registration-dir\") pod \"aws-ebs-csi-driver-node-khp88\" (UID: \"421ac0f8-3310-4cc2-a9bf-159e3293219a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" May 11 20:50:24.485462 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485439 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-var-lib-openvswitch\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.485819 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485484 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/421ac0f8-3310-4cc2-a9bf-159e3293219a-device-dir\") pod \"aws-ebs-csi-driver-node-khp88\" (UID: \"421ac0f8-3310-4cc2-a9bf-159e3293219a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" May 11 20:50:24.485819 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485444 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-host-var-lib-kubelet\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.485819 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485517 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/421ac0f8-3310-4cc2-a9bf-159e3293219a-sys-fs\") pod \"aws-ebs-csi-driver-node-khp88\" (UID: \"421ac0f8-3310-4cc2-a9bf-159e3293219a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" May 11 20:50:24.485819 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485542 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mwh9l\" (UniqueName: \"kubernetes.io/projected/421ac0f8-3310-4cc2-a9bf-159e3293219a-kube-api-access-mwh9l\") pod \"aws-ebs-csi-driver-node-khp88\" (UID: \"421ac0f8-3310-4cc2-a9bf-159e3293219a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" May 11 20:50:24.485819 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485566 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-host-kubelet\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.485819 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485579 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/421ac0f8-3310-4cc2-a9bf-159e3293219a-device-dir\") pod \"aws-ebs-csi-driver-node-khp88\" (UID: \"421ac0f8-3310-4cc2-a9bf-159e3293219a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" May 11 
May 11 20:50:24.485819 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485589 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-run-ovn\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh"
May 11 20:50:24.485819 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485579 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/421ac0f8-3310-4cc2-a9bf-159e3293219a-registration-dir\") pod \"aws-ebs-csi-driver-node-khp88\" (UID: \"421ac0f8-3310-4cc2-a9bf-159e3293219a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88"
May 11 20:50:24.485819 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485603 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/421ac0f8-3310-4cc2-a9bf-159e3293219a-sys-fs\") pod \"aws-ebs-csi-driver-node-khp88\" (UID: \"421ac0f8-3310-4cc2-a9bf-159e3293219a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88"
May 11 20:50:24.485819 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485613 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-host-cni-bin\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh"
May 11 20:50:24.485819 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485638 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3d7be993-4ba8-4b01-8fd3-d04162534cc5-ovnkube-config\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh"
May 11 20:50:24.485819 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485641 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-run-ovn\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh"
May 11 20:50:24.485819 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485661 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-host-cni-bin\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh"
May 11 20:50:24.485819 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485683 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/421ac0f8-3310-4cc2-a9bf-159e3293219a-etc-selinux\") pod \"aws-ebs-csi-driver-node-khp88\" (UID: \"421ac0f8-3310-4cc2-a9bf-159e3293219a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88"
May 11 20:50:24.485819 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485678 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-host-kubelet\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh"
pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.485819 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485739 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/421ac0f8-3310-4cc2-a9bf-159e3293219a-etc-selinux\") pod \"aws-ebs-csi-driver-node-khp88\" (UID: \"421ac0f8-3310-4cc2-a9bf-159e3293219a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" May 11 20:50:24.485819 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485739 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-host-var-lib-cni-multus\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.486386 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485783 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-run\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.486386 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485807 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-lib-modules\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.486386 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485814 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-host-var-lib-cni-multus\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.486386 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485832 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3eb3a067-139c-450e-b053-3f1a84abc363-system-cni-dir\") pod \"multus-additional-cni-plugins-l8r5d\" (UID: \"3eb3a067-139c-450e-b053-3f1a84abc363\") " pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.486386 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485882 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3eb3a067-139c-450e-b053-3f1a84abc363-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l8r5d\" (UID: \"3eb3a067-139c-450e-b053-3f1a84abc363\") " pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.486386 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485912 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3eb3a067-139c-450e-b053-3f1a84abc363-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-l8r5d\" (UID: \"3eb3a067-139c-450e-b053-3f1a84abc363\") " pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.486386 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485938 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3800edc1-af00-418d-a5b8-d832cbe20fbf-metrics-certs\") pod \"network-metrics-daemon-v9s7z\" (UID: \"3800edc1-af00-418d-a5b8-d832cbe20fbf\") " pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:24.486386 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.485991 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-host-var-lib-cni-bin\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.486386 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486016 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-host-slash\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.486386 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486045 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.486386 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486073 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-etc-modprobe-d\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.486386 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486097 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/dc57064a-f851-4169-a8e2-cf56733a9587-etc-tuned\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.486386 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486122 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg4p4\" (UniqueName: \"kubernetes.io/projected/3800edc1-af00-418d-a5b8-d832cbe20fbf-kube-api-access-mg4p4\") pod \"network-metrics-daemon-v9s7z\" (UID: \"3800edc1-af00-418d-a5b8-d832cbe20fbf\") " pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:24.486386 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486184 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3d7be993-4ba8-4b01-8fd3-d04162534cc5-ovnkube-config\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.486386 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486190 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-cnibin\") pod \"multus-5kw85\" (UID: 
\"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.486386 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486215 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-host-run-multus-certs\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.486386 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486240 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3d7be993-4ba8-4b01-8fd3-d04162534cc5-env-overrides\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.487071 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486246 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.487071 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486307 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-var-lib-kubelet\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.487071 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486334 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-host-slash\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.487071 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486343 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3eb3a067-139c-450e-b053-3f1a84abc363-cnibin\") pod \"multus-additional-cni-plugins-l8r5d\" (UID: \"3eb3a067-139c-450e-b053-3f1a84abc363\") " pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.487071 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486333 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-host-run-multus-certs\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.487071 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486366 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-cnibin\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.487071 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486381 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-host-var-lib-cni-bin\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.487071 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486428 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-system-cni-dir\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.487071 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486463 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-host-run-netns\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.487071 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486474 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-system-cni-dir\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.487071 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486506 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbq6v\" (UniqueName: \"kubernetes.io/projected/3a2d13ea-d235-437e-9668-e21aca93682a-kube-api-access-cbq6v\") pod \"network-check-target-m9tgf\" (UID: \"3a2d13ea-d235-437e-9668-e21aca93682a\") " pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:24.487071 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486518 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-host-run-netns\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.487071 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486553 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-node-log\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.487071 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486584 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-node-log\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.487071 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486595 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3d7be993-4ba8-4b01-8fd3-d04162534cc5-env-overrides\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.487071 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486595 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-etc-systemd\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.487071 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486633 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-sys\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.487071 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486660 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-multus-socket-dir-parent\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.487872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486697 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-multus-socket-dir-parent\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.487872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486702 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-log-socket\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.487872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486755 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sktdj\" (UniqueName: \"kubernetes.io/projected/3d7be993-4ba8-4b01-8fd3-d04162534cc5-kube-api-access-sktdj\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.487872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486765 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-log-socket\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.487872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486773 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3270b7f8-5595-41cf-ba47-4115ee413da0-host-slash\") pod \"iptables-alerter-4qb4b\" (UID: \"3270b7f8-5595-41cf-ba47-4115ee413da0\") " pod="openshift-network-operator/iptables-alerter-4qb4b" May 11 20:50:24.487872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486789 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dc7028e1-034b-4393-88d2-1dbb1e82cfe7-konnectivity-ca\") pod \"konnectivity-agent-k98kb\" (UID: \"dc7028e1-034b-4393-88d2-1dbb1e82cfe7\") " pod="kube-system/konnectivity-agent-k98kb" May 11 20:50:24.487872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486812 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-os-release\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.487872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486827 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/421ac0f8-3310-4cc2-a9bf-159e3293219a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-khp88\" (UID: \"421ac0f8-3310-4cc2-a9bf-159e3293219a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" May 11 20:50:24.487872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486843 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b239754b-8d38-41b0-9290-744afb39226a-cni-binary-copy\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.487872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486856 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-systemd-units\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.487872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486871 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-host-run-ovn-kubernetes\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.487872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486894 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-host-cni-netd\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.487872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486872 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-os-release\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.487872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486925 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-host-run-ovn-kubernetes\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.487872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486931 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-host-cni-netd\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.487872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486939 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/421ac0f8-3310-4cc2-a9bf-159e3293219a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-khp88\" (UID: \"421ac0f8-3310-4cc2-a9bf-159e3293219a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" May 11 20:50:24.487872 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486946 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-systemd-units\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.488575 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.486975 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dc57064a-f851-4169-a8e2-cf56733a9587-tmp\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.488575 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487006 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm6z4\" (UniqueName: \"kubernetes.io/projected/dc57064a-f851-4169-a8e2-cf56733a9587-kube-api-access-fm6z4\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.488575 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487042 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c46dj\" (UniqueName: \"kubernetes.io/projected/3eb3a067-139c-450e-b053-3f1a84abc363-kube-api-access-c46dj\") pod \"multus-additional-cni-plugins-l8r5d\" (UID: \"3eb3a067-139c-450e-b053-3f1a84abc363\") " pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.488575 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487077 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-host-run-k8s-cni-cncf-io\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.488575 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487096 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-etc-sysctl-d\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.488575 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487115 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3eb3a067-139c-450e-b053-3f1a84abc363-os-release\") pod \"multus-additional-cni-plugins-l8r5d\" (UID: \"3eb3a067-139c-450e-b053-3f1a84abc363\") " pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.488575 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487151 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dea6f3bb-bb99-4e25-8cf5-1aca4ea1ed96-host\") pod \"node-ca-cv7k6\" 
(UID: \"dea6f3bb-bb99-4e25-8cf5-1aca4ea1ed96\") " pod="openshift-image-registry/node-ca-cv7k6" May 11 20:50:24.488575 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487163 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-host-run-k8s-cni-cncf-io\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.488575 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487208 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b239754b-8d38-41b0-9290-744afb39226a-multus-daemon-config\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.488575 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487241 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dea6f3bb-bb99-4e25-8cf5-1aca4ea1ed96-host\") pod \"node-ca-cv7k6\" (UID: \"dea6f3bb-bb99-4e25-8cf5-1aca4ea1ed96\") " pod="openshift-image-registry/node-ca-cv7k6" May 11 20:50:24.488575 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487281 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-etc-openvswitch\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.488575 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487312 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dc7028e1-034b-4393-88d2-1dbb1e82cfe7-konnectivity-ca\") pod \"konnectivity-agent-k98kb\" (UID: \"dc7028e1-034b-4393-88d2-1dbb1e82cfe7\") " pod="kube-system/konnectivity-agent-k98kb" May 11 20:50:24.488575 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487320 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-etc-openvswitch\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.488575 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487372 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-multus-conf-dir\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.488575 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487387 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b239754b-8d38-41b0-9290-744afb39226a-cni-binary-copy\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.488575 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487401 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fzwmm\" (UniqueName: \"kubernetes.io/projected/b239754b-8d38-41b0-9290-744afb39226a-kube-api-access-fzwmm\") pod \"multus-5kw85\" (UID: 
\"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.488575 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487425 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3d7be993-4ba8-4b01-8fd3-d04162534cc5-ovnkube-script-lib\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.489144 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487435 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-multus-conf-dir\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.489144 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487448 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dea6f3bb-bb99-4e25-8cf5-1aca4ea1ed96-serviceca\") pod \"node-ca-cv7k6\" (UID: \"dea6f3bb-bb99-4e25-8cf5-1aca4ea1ed96\") " pod="openshift-image-registry/node-ca-cv7k6" May 11 20:50:24.489144 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487470 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-host-run-netns\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.489144 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487493 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-etc-kubernetes\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.489144 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487519 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-etc-kubernetes\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.489144 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487544 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-etc-sysctl-conf\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.489144 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487569 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3eb3a067-139c-450e-b053-3f1a84abc363-cni-binary-copy\") pod \"multus-additional-cni-plugins-l8r5d\" (UID: \"3eb3a067-139c-450e-b053-3f1a84abc363\") " pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.489144 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487595 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/dc7028e1-034b-4393-88d2-1dbb1e82cfe7-agent-certs\") pod \"konnectivity-agent-k98kb\" (UID: \"dc7028e1-034b-4393-88d2-1dbb1e82cfe7\") " pod="kube-system/konnectivity-agent-k98kb" May 11 20:50:24.489144 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487628 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-etc-kubernetes\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.489144 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487658 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-multus-cni-dir\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.489144 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487684 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-run-systemd\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.489144 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487695 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-host-run-netns\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.489144 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487708 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-run-openvswitch\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.489144 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487734 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-etc-sysconfig\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.489144 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487797 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3270b7f8-5595-41cf-ba47-4115ee413da0-iptables-alerter-script\") pod \"iptables-alerter-4qb4b\" (UID: \"3270b7f8-5595-41cf-ba47-4115ee413da0\") " pod="openshift-network-operator/iptables-alerter-4qb4b" May 11 20:50:24.489144 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487815 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b239754b-8d38-41b0-9290-744afb39226a-multus-daemon-config\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.489144 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487848 2567 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/421ac0f8-3310-4cc2-a9bf-159e3293219a-socket-dir\") pod \"aws-ebs-csi-driver-node-khp88\" (UID: \"421ac0f8-3310-4cc2-a9bf-159e3293219a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" May 11 20:50:24.489703 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487883 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9p4t8\" (UniqueName: \"kubernetes.io/projected/dea6f3bb-bb99-4e25-8cf5-1aca4ea1ed96-kube-api-access-9p4t8\") pod \"node-ca-cv7k6\" (UID: \"dea6f3bb-bb99-4e25-8cf5-1aca4ea1ed96\") " pod="openshift-image-registry/node-ca-cv7k6" May 11 20:50:24.489703 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487914 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3d7be993-4ba8-4b01-8fd3-d04162534cc5-ovn-node-metrics-cert\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.489703 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487950 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/421ac0f8-3310-4cc2-a9bf-159e3293219a-socket-dir\") pod \"aws-ebs-csi-driver-node-khp88\" (UID: \"421ac0f8-3310-4cc2-a9bf-159e3293219a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" May 11 20:50:24.489703 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487947 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpj6b\" (UniqueName: \"kubernetes.io/projected/3270b7f8-5595-41cf-ba47-4115ee413da0-kube-api-access-xpj6b\") pod \"iptables-alerter-4qb4b\" (UID: \"3270b7f8-5595-41cf-ba47-4115ee413da0\") " pod="openshift-network-operator/iptables-alerter-4qb4b" May 11 20:50:24.489703 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.487998 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3d7be993-4ba8-4b01-8fd3-d04162534cc5-ovnkube-script-lib\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.489703 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.488033 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b239754b-8d38-41b0-9290-744afb39226a-multus-cni-dir\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.489703 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.488022 2567 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" May 11 20:50:24.489703 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.488055 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dea6f3bb-bb99-4e25-8cf5-1aca4ea1ed96-serviceca\") pod \"node-ca-cv7k6\" (UID: \"dea6f3bb-bb99-4e25-8cf5-1aca4ea1ed96\") " pod="openshift-image-registry/node-ca-cv7k6" May 11 20:50:24.489703 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.488168 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-run-openvswitch\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.489703 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.488237 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3d7be993-4ba8-4b01-8fd3-d04162534cc5-run-systemd\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.490863 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.490838 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3d7be993-4ba8-4b01-8fd3-d04162534cc5-ovn-node-metrics-cert\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.490950 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.490924 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dc7028e1-034b-4393-88d2-1dbb1e82cfe7-agent-certs\") pod \"konnectivity-agent-k98kb\" (UID: \"dc7028e1-034b-4393-88d2-1dbb1e82cfe7\") " pod="kube-system/konnectivity-agent-k98kb" May 11 20:50:24.492008 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:24.491989 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 11 20:50:24.492008 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:24.492010 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 11 20:50:24.492168 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:24.492020 2567 projected.go:194] Error preparing data for projected volume kube-api-access-cbq6v for pod openshift-network-diagnostics/network-check-target-m9tgf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:24.492168 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:24.492087 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a2d13ea-d235-437e-9668-e21aca93682a-kube-api-access-cbq6v podName:3a2d13ea-d235-437e-9668-e21aca93682a nodeName:}" failed. No retries permitted until 2026-05-11 20:50:24.992058024 +0000 UTC m=+3.101838469 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cbq6v" (UniqueName: "kubernetes.io/projected/3a2d13ea-d235-437e-9668-e21aca93682a-kube-api-access-cbq6v") pod "network-check-target-m9tgf" (UID: "3a2d13ea-d235-437e-9668-e21aca93682a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:24.494521 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.494499 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sktdj\" (UniqueName: \"kubernetes.io/projected/3d7be993-4ba8-4b01-8fd3-d04162534cc5-kube-api-access-sktdj\") pod \"ovnkube-node-svtmh\" (UID: \"3d7be993-4ba8-4b01-8fd3-d04162534cc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.494939 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.494902 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwh9l\" (UniqueName: \"kubernetes.io/projected/421ac0f8-3310-4cc2-a9bf-159e3293219a-kube-api-access-mwh9l\") pod \"aws-ebs-csi-driver-node-khp88\" (UID: \"421ac0f8-3310-4cc2-a9bf-159e3293219a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" May 11 20:50:24.495346 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.495322 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p4t8\" (UniqueName: \"kubernetes.io/projected/dea6f3bb-bb99-4e25-8cf5-1aca4ea1ed96-kube-api-access-9p4t8\") pod \"node-ca-cv7k6\" (UID: \"dea6f3bb-bb99-4e25-8cf5-1aca4ea1ed96\") " pod="openshift-image-registry/node-ca-cv7k6" May 11 20:50:24.496044 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.496026 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzwmm\" (UniqueName: \"kubernetes.io/projected/b239754b-8d38-41b0-9290-744afb39226a-kube-api-access-fzwmm\") pod \"multus-5kw85\" (UID: \"b239754b-8d38-41b0-9290-744afb39226a\") " pod="openshift-multus/multus-5kw85" May 11 20:50:24.588991 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.588900 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-var-lib-kubelet\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.588991 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.588943 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3eb3a067-139c-450e-b053-3f1a84abc363-cnibin\") pod \"multus-additional-cni-plugins-l8r5d\" (UID: \"3eb3a067-139c-450e-b053-3f1a84abc363\") " pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.589197 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.588998 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-etc-systemd\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.589197 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589026 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-sys\") pod \"tuned-x7k87\" (UID: 
\"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.589197 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589035 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-var-lib-kubelet\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.589197 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589051 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3270b7f8-5595-41cf-ba47-4115ee413da0-host-slash\") pod \"iptables-alerter-4qb4b\" (UID: \"3270b7f8-5595-41cf-ba47-4115ee413da0\") " pod="openshift-network-operator/iptables-alerter-4qb4b" May 11 20:50:24.589197 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589104 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-sys\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.589197 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589122 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dc57064a-f851-4169-a8e2-cf56733a9587-tmp\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.589197 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589130 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3eb3a067-139c-450e-b053-3f1a84abc363-cnibin\") pod \"multus-additional-cni-plugins-l8r5d\" (UID: \"3eb3a067-139c-450e-b053-3f1a84abc363\") " pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.589197 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589145 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fm6z4\" (UniqueName: \"kubernetes.io/projected/dc57064a-f851-4169-a8e2-cf56733a9587-kube-api-access-fm6z4\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.589197 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589130 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-etc-systemd\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.589197 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589171 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3270b7f8-5595-41cf-ba47-4115ee413da0-host-slash\") pod \"iptables-alerter-4qb4b\" (UID: \"3270b7f8-5595-41cf-ba47-4115ee413da0\") " pod="openshift-network-operator/iptables-alerter-4qb4b" May 11 20:50:24.589197 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589172 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c46dj\" (UniqueName: \"kubernetes.io/projected/3eb3a067-139c-450e-b053-3f1a84abc363-kube-api-access-c46dj\") pod 
\"multus-additional-cni-plugins-l8r5d\" (UID: \"3eb3a067-139c-450e-b053-3f1a84abc363\") " pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.589701 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589208 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-etc-sysctl-d\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.589701 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589231 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3eb3a067-139c-450e-b053-3f1a84abc363-os-release\") pod \"multus-additional-cni-plugins-l8r5d\" (UID: \"3eb3a067-139c-450e-b053-3f1a84abc363\") " pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.589701 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589263 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-etc-kubernetes\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.589701 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589283 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-etc-sysctl-conf\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.589701 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589337 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3eb3a067-139c-450e-b053-3f1a84abc363-cni-binary-copy\") pod \"multus-additional-cni-plugins-l8r5d\" (UID: \"3eb3a067-139c-450e-b053-3f1a84abc363\") " pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.589701 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589367 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-etc-sysconfig\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.589701 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589389 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3270b7f8-5595-41cf-ba47-4115ee413da0-iptables-alerter-script\") pod \"iptables-alerter-4qb4b\" (UID: \"3270b7f8-5595-41cf-ba47-4115ee413da0\") " pod="openshift-network-operator/iptables-alerter-4qb4b" May 11 20:50:24.589701 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589428 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3eb3a067-139c-450e-b053-3f1a84abc363-os-release\") pod \"multus-additional-cni-plugins-l8r5d\" (UID: \"3eb3a067-139c-450e-b053-3f1a84abc363\") " pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.589701 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589563 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-etc-sysctl-conf\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.589701 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589565 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-etc-sysctl-d\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.589701 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589613 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-etc-kubernetes\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.589701 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589622 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpj6b\" (UniqueName: \"kubernetes.io/projected/3270b7f8-5595-41cf-ba47-4115ee413da0-kube-api-access-xpj6b\") pod \"iptables-alerter-4qb4b\" (UID: \"3270b7f8-5595-41cf-ba47-4115ee413da0\") " pod="openshift-network-operator/iptables-alerter-4qb4b" May 11 20:50:24.589701 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589655 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-host\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.589701 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589661 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-etc-sysconfig\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.589701 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589680 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3eb3a067-139c-450e-b053-3f1a84abc363-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l8r5d\" (UID: \"3eb3a067-139c-450e-b053-3f1a84abc363\") " pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.590509 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589718 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-run\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.590509 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589741 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-lib-modules\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.590509 ip-10-0-128-58 
kubenswrapper[2567]: I0511 20:50:24.589765 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3eb3a067-139c-450e-b053-3f1a84abc363-system-cni-dir\") pod \"multus-additional-cni-plugins-l8r5d\" (UID: \"3eb3a067-139c-450e-b053-3f1a84abc363\") " pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.590509 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589788 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3eb3a067-139c-450e-b053-3f1a84abc363-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l8r5d\" (UID: \"3eb3a067-139c-450e-b053-3f1a84abc363\") " pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.590509 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589814 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3eb3a067-139c-450e-b053-3f1a84abc363-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-l8r5d\" (UID: \"3eb3a067-139c-450e-b053-3f1a84abc363\") " pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.590509 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589841 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3800edc1-af00-418d-a5b8-d832cbe20fbf-metrics-certs\") pod \"network-metrics-daemon-v9s7z\" (UID: \"3800edc1-af00-418d-a5b8-d832cbe20fbf\") " pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:24.590509 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589866 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-etc-modprobe-d\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.590509 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589889 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3eb3a067-139c-450e-b053-3f1a84abc363-system-cni-dir\") pod \"multus-additional-cni-plugins-l8r5d\" (UID: \"3eb3a067-139c-450e-b053-3f1a84abc363\") " pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.590509 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589922 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3eb3a067-139c-450e-b053-3f1a84abc363-cni-binary-copy\") pod \"multus-additional-cni-plugins-l8r5d\" (UID: \"3eb3a067-139c-450e-b053-3f1a84abc363\") " pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.590509 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589930 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/dc57064a-f851-4169-a8e2-cf56733a9587-etc-tuned\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.590509 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589922 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/3270b7f8-5595-41cf-ba47-4115ee413da0-iptables-alerter-script\") pod \"iptables-alerter-4qb4b\" (UID: \"3270b7f8-5595-41cf-ba47-4115ee413da0\") " pod="openshift-network-operator/iptables-alerter-4qb4b" May 11 20:50:24.590509 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589946 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-run\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.590509 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.589985 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mg4p4\" (UniqueName: \"kubernetes.io/projected/3800edc1-af00-418d-a5b8-d832cbe20fbf-kube-api-access-mg4p4\") pod \"network-metrics-daemon-v9s7z\" (UID: \"3800edc1-af00-418d-a5b8-d832cbe20fbf\") " pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:24.590509 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.590046 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-lib-modules\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.590509 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.590098 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3eb3a067-139c-450e-b053-3f1a84abc363-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l8r5d\" (UID: \"3eb3a067-139c-450e-b053-3f1a84abc363\") " pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.590509 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:24.590181 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:24.590509 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.590222 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3eb3a067-139c-450e-b053-3f1a84abc363-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l8r5d\" (UID: \"3eb3a067-139c-450e-b053-3f1a84abc363\") " pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.591299 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:24.590253 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3800edc1-af00-418d-a5b8-d832cbe20fbf-metrics-certs podName:3800edc1-af00-418d-a5b8-d832cbe20fbf nodeName:}" failed. No retries permitted until 2026-05-11 20:50:25.090233512 +0000 UTC m=+3.200013953 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3800edc1-af00-418d-a5b8-d832cbe20fbf-metrics-certs") pod "network-metrics-daemon-v9s7z" (UID: "3800edc1-af00-418d-a5b8-d832cbe20fbf") : object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:24.591299 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.590287 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-host\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.591299 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.590345 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/dc57064a-f851-4169-a8e2-cf56733a9587-etc-modprobe-d\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.591299 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.590402 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3eb3a067-139c-450e-b053-3f1a84abc363-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-l8r5d\" (UID: \"3eb3a067-139c-450e-b053-3f1a84abc363\") " pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.592527 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.592504 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dc57064a-f851-4169-a8e2-cf56733a9587-tmp\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.592865 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.592843 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/dc57064a-f851-4169-a8e2-cf56733a9587-etc-tuned\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.597301 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.597232 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpj6b\" (UniqueName: \"kubernetes.io/projected/3270b7f8-5595-41cf-ba47-4115ee413da0-kube-api-access-xpj6b\") pod \"iptables-alerter-4qb4b\" (UID: \"3270b7f8-5595-41cf-ba47-4115ee413da0\") " pod="openshift-network-operator/iptables-alerter-4qb4b" May 11 20:50:24.599245 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.599222 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm6z4\" (UniqueName: \"kubernetes.io/projected/dc57064a-f851-4169-a8e2-cf56733a9587-kube-api-access-fm6z4\") pod \"tuned-x7k87\" (UID: \"dc57064a-f851-4169-a8e2-cf56733a9587\") " pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.599340 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.599322 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg4p4\" (UniqueName: \"kubernetes.io/projected/3800edc1-af00-418d-a5b8-d832cbe20fbf-kube-api-access-mg4p4\") pod \"network-metrics-daemon-v9s7z\" (UID: \"3800edc1-af00-418d-a5b8-d832cbe20fbf\") " pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:24.599383 ip-10-0-128-58 
kubenswrapper[2567]: I0511 20:50:24.599371 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c46dj\" (UniqueName: \"kubernetes.io/projected/3eb3a067-139c-450e-b053-3f1a84abc363-kube-api-access-c46dj\") pod \"multus-additional-cni-plugins-l8r5d\" (UID: \"3eb3a067-139c-450e-b053-3f1a84abc363\") " pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.677020 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.676994 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" May 11 20:50:24.685602 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.685581 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cv7k6" May 11 20:50:24.693316 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.693296 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5kw85" May 11 20:50:24.698992 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.698973 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:24.707245 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.707228 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-k98kb" May 11 20:50:24.713792 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.713777 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-x7k87" May 11 20:50:24.721325 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.721308 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-l8r5d" May 11 20:50:24.727797 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.727780 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4qb4b" May 11 20:50:24.786888 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.786865 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" May 11 20:50:24.993220 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:24.993192 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbq6v\" (UniqueName: \"kubernetes.io/projected/3a2d13ea-d235-437e-9668-e21aca93682a-kube-api-access-cbq6v\") pod \"network-check-target-m9tgf\" (UID: \"3a2d13ea-d235-437e-9668-e21aca93682a\") " pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:24.993348 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:24.993332 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 11 20:50:24.993402 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:24.993354 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 11 20:50:24.993402 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:24.993364 2567 projected.go:194] Error preparing data for projected volume kube-api-access-cbq6v for pod openshift-network-diagnostics/network-check-target-m9tgf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:24.993500 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:24.993410 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a2d13ea-d235-437e-9668-e21aca93682a-kube-api-access-cbq6v podName:3a2d13ea-d235-437e-9668-e21aca93682a nodeName:}" failed. No retries permitted until 2026-05-11 20:50:25.993397394 +0000 UTC m=+4.103177817 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cbq6v" (UniqueName: "kubernetes.io/projected/3a2d13ea-d235-437e-9668-e21aca93682a-kube-api-access-cbq6v") pod "network-check-target-m9tgf" (UID: "3a2d13ea-d235-437e-9668-e21aca93682a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:24.996112 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:24.996081 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3eb3a067_139c_450e_b053_3f1a84abc363.slice/crio-c3afa92e39341d18c9606ab3a19320a21f5919fdbb0c610a80d9fb3a999d9ac4 WatchSource:0}: Error finding container c3afa92e39341d18c9606ab3a19320a21f5919fdbb0c610a80d9fb3a999d9ac4: Status 404 returned error can't find the container with id c3afa92e39341d18c9606ab3a19320a21f5919fdbb0c610a80d9fb3a999d9ac4 May 11 20:50:24.997346 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:24.997291 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddea6f3bb_bb99_4e25_8cf5_1aca4ea1ed96.slice/crio-941cfd56cb6b4b0445cc1ce460053ea0671a64a31d154902bdabd075a0524874 WatchSource:0}: Error finding container 941cfd56cb6b4b0445cc1ce460053ea0671a64a31d154902bdabd075a0524874: Status 404 returned error can't find the container with id 941cfd56cb6b4b0445cc1ce460053ea0671a64a31d154902bdabd075a0524874 May 11 20:50:25.000920 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:25.000902 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc7028e1_034b_4393_88d2_1dbb1e82cfe7.slice/crio-ace6895ef432e1fdf46842e73bdb9ade113de92b0a5676918da3a09cc84e696d WatchSource:0}: Error finding container ace6895ef432e1fdf46842e73bdb9ade113de92b0a5676918da3a09cc84e696d: Status 404 returned error can't find the container with id ace6895ef432e1fdf46842e73bdb9ade113de92b0a5676918da3a09cc84e696d May 11 20:50:25.001176 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:25.001163 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d7be993_4ba8_4b01_8fd3_d04162534cc5.slice/crio-f4ac5a2fdedbcd079fb2e76aed5b58d6c9e13142703ee9c4b5522b80d0c0d50f WatchSource:0}: Error finding container f4ac5a2fdedbcd079fb2e76aed5b58d6c9e13142703ee9c4b5522b80d0c0d50f: Status 404 returned error can't find the container with id f4ac5a2fdedbcd079fb2e76aed5b58d6c9e13142703ee9c4b5522b80d0c0d50f May 11 20:50:25.001993 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:25.001921 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb239754b_8d38_41b0_9290_744afb39226a.slice/crio-eea8a825c9466437aebb2a161f9fb867c8950f185155048f30390bdb8f8ff18c WatchSource:0}: Error finding container eea8a825c9466437aebb2a161f9fb867c8950f185155048f30390bdb8f8ff18c: Status 404 returned error can't find the container with id eea8a825c9466437aebb2a161f9fb867c8950f185155048f30390bdb8f8ff18c May 11 20:50:25.003778 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:25.003761 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc57064a_f851_4169_a8e2_cf56733a9587.slice/crio-cce616dda5b1b4ad89bc0d88798d500a87f0e10e01fd106f5f92d0215d67d522 WatchSource:0}: Error finding 
container cce616dda5b1b4ad89bc0d88798d500a87f0e10e01fd106f5f92d0215d67d522: Status 404 returned error can't find the container with id cce616dda5b1b4ad89bc0d88798d500a87f0e10e01fd106f5f92d0215d67d522 May 11 20:50:25.094125 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.094102 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3800edc1-af00-418d-a5b8-d832cbe20fbf-metrics-certs\") pod \"network-metrics-daemon-v9s7z\" (UID: \"3800edc1-af00-418d-a5b8-d832cbe20fbf\") " pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:25.094238 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:25.094220 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:25.094286 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:25.094274 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3800edc1-af00-418d-a5b8-d832cbe20fbf-metrics-certs podName:3800edc1-af00-418d-a5b8-d832cbe20fbf nodeName:}" failed. No retries permitted until 2026-05-11 20:50:26.094258177 +0000 UTC m=+4.204038600 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3800edc1-af00-418d-a5b8-d832cbe20fbf-metrics-certs") pod "network-metrics-daemon-v9s7z" (UID: "3800edc1-af00-418d-a5b8-d832cbe20fbf") : object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:25.309501 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.309476 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-wdhpf"] May 11 20:50:25.311273 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.311249 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-wdhpf" May 11 20:50:25.314015 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.313993 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qvmvl\"" May 11 20:50:25.314766 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.314745 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" May 11 20:50:25.314854 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.314786 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" May 11 20:50:25.395851 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.395650 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82bd4\" (UniqueName: \"kubernetes.io/projected/8981c6f1-07ce-4ebe-9071-6caf7218306a-kube-api-access-82bd4\") pod \"node-resolver-wdhpf\" (UID: \"8981c6f1-07ce-4ebe-9071-6caf7218306a\") " pod="openshift-dns/node-resolver-wdhpf" May 11 20:50:25.395851 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.395695 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8981c6f1-07ce-4ebe-9071-6caf7218306a-tmp-dir\") pod \"node-resolver-wdhpf\" (UID: \"8981c6f1-07ce-4ebe-9071-6caf7218306a\") " pod="openshift-dns/node-resolver-wdhpf" May 11 20:50:25.395851 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.395734 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8981c6f1-07ce-4ebe-9071-6caf7218306a-hosts-file\") pod \"node-resolver-wdhpf\" (UID: \"8981c6f1-07ce-4ebe-9071-6caf7218306a\") " pod="openshift-dns/node-resolver-wdhpf" May 11 20:50:25.425122 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.425058 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-05-10 20:45:23 +0000 UTC" deadline="2027-11-24 23:04:01.710260291 +0000 UTC" May 11 20:50:25.425122 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.425090 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13490h13m36.285173684s" May 11 20:50:25.484139 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.483466 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-58.ec2.internal" event={"ID":"2c5eff5275aeb9307128a4ad3171d6f0","Type":"ContainerStarted","Data":"8ca58e741899869422d97c6499151d9dfce0ea754ff5f7a6627363098614bf45"} May 11 20:50:25.492753 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.492174 2567 generic.go:358] "Generic (PLEG): container finished" podID="617b24297d13d590f6c983d42c59bc7e" containerID="18da2f10522064dbf8c9de4f7078d45294217474ab4b8d9135049d51b3fbe64e" exitCode=0 May 11 20:50:25.492753 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.492252 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-58.ec2.internal" event={"ID":"617b24297d13d590f6c983d42c59bc7e","Type":"ContainerDied","Data":"18da2f10522064dbf8c9de4f7078d45294217474ab4b8d9135049d51b3fbe64e"} May 11 20:50:25.496774 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.496699 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8981c6f1-07ce-4ebe-9071-6caf7218306a-hosts-file\") pod \"node-resolver-wdhpf\" (UID: \"8981c6f1-07ce-4ebe-9071-6caf7218306a\") " pod="openshift-dns/node-resolver-wdhpf" May 11 20:50:25.496878 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.496789 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82bd4\" (UniqueName: \"kubernetes.io/projected/8981c6f1-07ce-4ebe-9071-6caf7218306a-kube-api-access-82bd4\") pod \"node-resolver-wdhpf\" (UID: \"8981c6f1-07ce-4ebe-9071-6caf7218306a\") " pod="openshift-dns/node-resolver-wdhpf" May 11 20:50:25.496878 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.496820 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8981c6f1-07ce-4ebe-9071-6caf7218306a-tmp-dir\") pod \"node-resolver-wdhpf\" (UID: \"8981c6f1-07ce-4ebe-9071-6caf7218306a\") " pod="openshift-dns/node-resolver-wdhpf" May 11 20:50:25.497002 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.496980 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8981c6f1-07ce-4ebe-9071-6caf7218306a-hosts-file\") pod \"node-resolver-wdhpf\" (UID: \"8981c6f1-07ce-4ebe-9071-6caf7218306a\") " pod="openshift-dns/node-resolver-wdhpf" May 11 20:50:25.497334 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.497314 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8981c6f1-07ce-4ebe-9071-6caf7218306a-tmp-dir\") pod \"node-resolver-wdhpf\" (UID: \"8981c6f1-07ce-4ebe-9071-6caf7218306a\") " pod="openshift-dns/node-resolver-wdhpf" May 11 20:50:25.503804 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.503763 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-x7k87" event={"ID":"dc57064a-f851-4169-a8e2-cf56733a9587","Type":"ContainerStarted","Data":"cce616dda5b1b4ad89bc0d88798d500a87f0e10e01fd106f5f92d0215d67d522"} May 11 20:50:25.507467 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.507393 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-58.ec2.internal" podStartSLOduration=2.507376947 podStartE2EDuration="2.507376947s" podCreationTimestamp="2026-05-11 20:50:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 20:50:25.496062076 +0000 UTC m=+3.605842524" watchObservedRunningTime="2026-05-11 20:50:25.507376947 +0000 UTC m=+3.617157393" May 11 20:50:25.509620 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.509568 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" event={"ID":"3d7be993-4ba8-4b01-8fd3-d04162534cc5","Type":"ContainerStarted","Data":"f4ac5a2fdedbcd079fb2e76aed5b58d6c9e13142703ee9c4b5522b80d0c0d50f"} May 11 20:50:25.510929 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.510885 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82bd4\" (UniqueName: \"kubernetes.io/projected/8981c6f1-07ce-4ebe-9071-6caf7218306a-kube-api-access-82bd4\") pod \"node-resolver-wdhpf\" (UID: \"8981c6f1-07ce-4ebe-9071-6caf7218306a\") " pod="openshift-dns/node-resolver-wdhpf" May 11 20:50:25.512277 ip-10-0-128-58 kubenswrapper[2567]: 
I0511 20:50:25.512219 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-k98kb" event={"ID":"dc7028e1-034b-4393-88d2-1dbb1e82cfe7","Type":"ContainerStarted","Data":"ace6895ef432e1fdf46842e73bdb9ade113de92b0a5676918da3a09cc84e696d"} May 11 20:50:25.523531 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.523454 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cv7k6" event={"ID":"dea6f3bb-bb99-4e25-8cf5-1aca4ea1ed96","Type":"ContainerStarted","Data":"941cfd56cb6b4b0445cc1ce460053ea0671a64a31d154902bdabd075a0524874"} May 11 20:50:25.527903 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.527860 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l8r5d" event={"ID":"3eb3a067-139c-450e-b053-3f1a84abc363","Type":"ContainerStarted","Data":"c3afa92e39341d18c9606ab3a19320a21f5919fdbb0c610a80d9fb3a999d9ac4"} May 11 20:50:25.529611 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.529561 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4qb4b" event={"ID":"3270b7f8-5595-41cf-ba47-4115ee413da0","Type":"ContainerStarted","Data":"81043b3f2a6a23f64c34d5fa6dfc7a3e7f95254e44edf6907969222847b4b502"} May 11 20:50:25.533241 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.533179 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" event={"ID":"421ac0f8-3310-4cc2-a9bf-159e3293219a","Type":"ContainerStarted","Data":"fba536d49e90f41472ae2d6e96f79a481f19b8c75496884c899ce4bb7f501507"} May 11 20:50:25.536555 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.536524 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5kw85" event={"ID":"b239754b-8d38-41b0-9290-744afb39226a","Type":"ContainerStarted","Data":"eea8a825c9466437aebb2a161f9fb867c8950f185155048f30390bdb8f8ff18c"} May 11 20:50:25.633698 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:25.633663 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-wdhpf" May 11 20:50:26.000829 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:26.000793 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbq6v\" (UniqueName: \"kubernetes.io/projected/3a2d13ea-d235-437e-9668-e21aca93682a-kube-api-access-cbq6v\") pod \"network-check-target-m9tgf\" (UID: \"3a2d13ea-d235-437e-9668-e21aca93682a\") " pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:26.001038 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:26.000946 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 11 20:50:26.001038 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:26.000992 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 11 20:50:26.001038 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:26.001007 2567 projected.go:194] Error preparing data for projected volume kube-api-access-cbq6v for pod openshift-network-diagnostics/network-check-target-m9tgf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:26.001211 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:26.001071 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a2d13ea-d235-437e-9668-e21aca93682a-kube-api-access-cbq6v podName:3a2d13ea-d235-437e-9668-e21aca93682a nodeName:}" failed. No retries permitted until 2026-05-11 20:50:28.001051704 +0000 UTC m=+6.110832140 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cbq6v" (UniqueName: "kubernetes.io/projected/3a2d13ea-d235-437e-9668-e21aca93682a-kube-api-access-cbq6v") pod "network-check-target-m9tgf" (UID: "3a2d13ea-d235-437e-9668-e21aca93682a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:26.101394 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:26.101359 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3800edc1-af00-418d-a5b8-d832cbe20fbf-metrics-certs\") pod \"network-metrics-daemon-v9s7z\" (UID: \"3800edc1-af00-418d-a5b8-d832cbe20fbf\") " pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:26.101593 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:26.101494 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:26.101593 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:26.101556 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3800edc1-af00-418d-a5b8-d832cbe20fbf-metrics-certs podName:3800edc1-af00-418d-a5b8-d832cbe20fbf nodeName:}" failed. No retries permitted until 2026-05-11 20:50:28.101539365 +0000 UTC m=+6.211319791 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3800edc1-af00-418d-a5b8-d832cbe20fbf-metrics-certs") pod "network-metrics-daemon-v9s7z" (UID: "3800edc1-af00-418d-a5b8-d832cbe20fbf") : object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:26.475595 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:26.475562 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:26.476073 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:26.475703 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9s7z" podUID="3800edc1-af00-418d-a5b8-d832cbe20fbf" May 11 20:50:26.476146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:26.476075 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:26.476210 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:26.476165 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9tgf" podUID="3a2d13ea-d235-437e-9668-e21aca93682a" May 11 20:50:26.565336 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:26.565300 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-58.ec2.internal" event={"ID":"617b24297d13d590f6c983d42c59bc7e","Type":"ContainerStarted","Data":"54bd79269b0d9311c65e98acc36f1165517893ca520394740860d00ad750282c"} May 11 20:50:26.578927 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:26.578877 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-58.ec2.internal" podStartSLOduration=3.578861204 podStartE2EDuration="3.578861204s" podCreationTimestamp="2026-05-11 20:50:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 20:50:26.577097493 +0000 UTC m=+4.686877961" watchObservedRunningTime="2026-05-11 20:50:26.578861204 +0000 UTC m=+4.688641652" May 11 20:50:26.581276 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:26.581250 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wdhpf" event={"ID":"8981c6f1-07ce-4ebe-9071-6caf7218306a","Type":"ContainerStarted","Data":"f356027596190f8eeb1df605531e976c08afbf3628a7ce2ae8edbfba693f442f"} May 11 20:50:28.024630 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:28.024586 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbq6v\" (UniqueName: \"kubernetes.io/projected/3a2d13ea-d235-437e-9668-e21aca93682a-kube-api-access-cbq6v\") pod \"network-check-target-m9tgf\" (UID: \"3a2d13ea-d235-437e-9668-e21aca93682a\") " pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:28.025123 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:28.024753 2567 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 11 20:50:28.025123 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:28.024773 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 11 20:50:28.025123 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:28.024785 2567 projected.go:194] Error preparing data for projected volume kube-api-access-cbq6v for pod openshift-network-diagnostics/network-check-target-m9tgf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:28.025123 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:28.024843 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a2d13ea-d235-437e-9668-e21aca93682a-kube-api-access-cbq6v podName:3a2d13ea-d235-437e-9668-e21aca93682a nodeName:}" failed. No retries permitted until 2026-05-11 20:50:32.024823401 +0000 UTC m=+10.134603906 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cbq6v" (UniqueName: "kubernetes.io/projected/3a2d13ea-d235-437e-9668-e21aca93682a-kube-api-access-cbq6v") pod "network-check-target-m9tgf" (UID: "3a2d13ea-d235-437e-9668-e21aca93682a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:28.125955 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:28.125339 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3800edc1-af00-418d-a5b8-d832cbe20fbf-metrics-certs\") pod \"network-metrics-daemon-v9s7z\" (UID: \"3800edc1-af00-418d-a5b8-d832cbe20fbf\") " pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:28.125955 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:28.125560 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:28.125955 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:28.125623 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3800edc1-af00-418d-a5b8-d832cbe20fbf-metrics-certs podName:3800edc1-af00-418d-a5b8-d832cbe20fbf nodeName:}" failed. No retries permitted until 2026-05-11 20:50:32.125604556 +0000 UTC m=+10.235384986 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3800edc1-af00-418d-a5b8-d832cbe20fbf-metrics-certs") pod "network-metrics-daemon-v9s7z" (UID: "3800edc1-af00-418d-a5b8-d832cbe20fbf") : object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:28.475918 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:28.475302 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:28.475918 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:28.475444 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v9s7z" podUID="3800edc1-af00-418d-a5b8-d832cbe20fbf" May 11 20:50:28.475918 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:28.475792 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:28.475918 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:28.475884 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9tgf" podUID="3a2d13ea-d235-437e-9668-e21aca93682a" May 11 20:50:30.475469 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:30.474861 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:30.475469 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:30.474916 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:30.475469 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:30.475024 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9s7z" podUID="3800edc1-af00-418d-a5b8-d832cbe20fbf" May 11 20:50:30.475469 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:30.475426 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-m9tgf" podUID="3a2d13ea-d235-437e-9668-e21aca93682a" May 11 20:50:32.057809 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:32.057769 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbq6v\" (UniqueName: \"kubernetes.io/projected/3a2d13ea-d235-437e-9668-e21aca93682a-kube-api-access-cbq6v\") pod \"network-check-target-m9tgf\" (UID: \"3a2d13ea-d235-437e-9668-e21aca93682a\") " pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:32.058269 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:32.057943 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 11 20:50:32.058269 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:32.057984 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 11 20:50:32.058269 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:32.057998 2567 projected.go:194] Error preparing data for projected volume kube-api-access-cbq6v for pod openshift-network-diagnostics/network-check-target-m9tgf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:32.058269 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:32.058056 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a2d13ea-d235-437e-9668-e21aca93682a-kube-api-access-cbq6v podName:3a2d13ea-d235-437e-9668-e21aca93682a nodeName:}" failed. No retries permitted until 2026-05-11 20:50:40.058037735 +0000 UTC m=+18.167818161 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cbq6v" (UniqueName: "kubernetes.io/projected/3a2d13ea-d235-437e-9668-e21aca93682a-kube-api-access-cbq6v") pod "network-check-target-m9tgf" (UID: "3a2d13ea-d235-437e-9668-e21aca93682a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:32.158337 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:32.158299 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3800edc1-af00-418d-a5b8-d832cbe20fbf-metrics-certs\") pod \"network-metrics-daemon-v9s7z\" (UID: \"3800edc1-af00-418d-a5b8-d832cbe20fbf\") " pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:32.158504 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:32.158468 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:32.158558 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:32.158548 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3800edc1-af00-418d-a5b8-d832cbe20fbf-metrics-certs podName:3800edc1-af00-418d-a5b8-d832cbe20fbf nodeName:}" failed. No retries permitted until 2026-05-11 20:50:40.158526173 +0000 UTC m=+18.268306597 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3800edc1-af00-418d-a5b8-d832cbe20fbf-metrics-certs") pod "network-metrics-daemon-v9s7z" (UID: "3800edc1-af00-418d-a5b8-d832cbe20fbf") : object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:32.476103 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:32.475423 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:32.476103 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:32.475527 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9tgf" podUID="3a2d13ea-d235-437e-9668-e21aca93682a" May 11 20:50:32.476103 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:32.475888 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:32.476103 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:32.476005 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9s7z" podUID="3800edc1-af00-418d-a5b8-d832cbe20fbf" May 11 20:50:34.475443 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:34.475409 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:34.475911 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:34.475419 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:34.475911 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:34.475535 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9tgf" podUID="3a2d13ea-d235-437e-9668-e21aca93682a" May 11 20:50:34.475911 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:34.475618 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9s7z" podUID="3800edc1-af00-418d-a5b8-d832cbe20fbf" May 11 20:50:36.474584 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:36.474550 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:36.475033 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:36.474560 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:36.475033 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:36.474670 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9tgf" podUID="3a2d13ea-d235-437e-9668-e21aca93682a" May 11 20:50:36.475033 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:36.474766 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9s7z" podUID="3800edc1-af00-418d-a5b8-d832cbe20fbf" May 11 20:50:38.474814 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:38.474582 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:38.475212 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:38.474917 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9tgf" podUID="3a2d13ea-d235-437e-9668-e21aca93682a" May 11 20:50:38.475212 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:38.474635 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:38.475212 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:38.475105 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v9s7z" podUID="3800edc1-af00-418d-a5b8-d832cbe20fbf" May 11 20:50:40.116368 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:40.116316 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbq6v\" (UniqueName: \"kubernetes.io/projected/3a2d13ea-d235-437e-9668-e21aca93682a-kube-api-access-cbq6v\") pod \"network-check-target-m9tgf\" (UID: \"3a2d13ea-d235-437e-9668-e21aca93682a\") " pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:40.116874 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:40.116493 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 11 20:50:40.116874 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:40.116519 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 11 20:50:40.116874 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:40.116531 2567 projected.go:194] Error preparing data for projected volume kube-api-access-cbq6v for pod openshift-network-diagnostics/network-check-target-m9tgf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:40.116874 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:40.116581 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a2d13ea-d235-437e-9668-e21aca93682a-kube-api-access-cbq6v podName:3a2d13ea-d235-437e-9668-e21aca93682a nodeName:}" failed. No retries permitted until 2026-05-11 20:50:56.116567153 +0000 UTC m=+34.226347576 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cbq6v" (UniqueName: "kubernetes.io/projected/3a2d13ea-d235-437e-9668-e21aca93682a-kube-api-access-cbq6v") pod "network-check-target-m9tgf" (UID: "3a2d13ea-d235-437e-9668-e21aca93682a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:40.216876 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:40.216826 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3800edc1-af00-418d-a5b8-d832cbe20fbf-metrics-certs\") pod \"network-metrics-daemon-v9s7z\" (UID: \"3800edc1-af00-418d-a5b8-d832cbe20fbf\") " pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:40.217046 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:40.216991 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:40.217103 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:40.217083 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3800edc1-af00-418d-a5b8-d832cbe20fbf-metrics-certs podName:3800edc1-af00-418d-a5b8-d832cbe20fbf nodeName:}" failed. No retries permitted until 2026-05-11 20:50:56.217066028 +0000 UTC m=+34.326846454 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3800edc1-af00-418d-a5b8-d832cbe20fbf-metrics-certs") pod "network-metrics-daemon-v9s7z" (UID: "3800edc1-af00-418d-a5b8-d832cbe20fbf") : object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:40.475370 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:40.475282 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:40.475370 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:40.475329 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:40.475574 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:40.475399 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9s7z" podUID="3800edc1-af00-418d-a5b8-d832cbe20fbf" May 11 20:50:40.475574 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:40.475522 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9tgf" podUID="3a2d13ea-d235-437e-9668-e21aca93682a" May 11 20:50:42.475857 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:42.475827 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:42.476237 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:42.475904 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9tgf" podUID="3a2d13ea-d235-437e-9668-e21aca93682a" May 11 20:50:42.476237 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:42.475999 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:42.476237 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:42.476092 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v9s7z" podUID="3800edc1-af00-418d-a5b8-d832cbe20fbf" May 11 20:50:43.612699 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:43.612665 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-x7k87" event={"ID":"dc57064a-f851-4169-a8e2-cf56733a9587","Type":"ContainerStarted","Data":"f49e0e7a0eedf5b3201c36aa80f6def5a8737c313095b754719381ffa088f16c"} May 11 20:50:43.614806 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:43.614783 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/ovn-acl-logging/0.log" May 11 20:50:43.615151 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:43.615128 2567 generic.go:358] "Generic (PLEG): container finished" podID="3d7be993-4ba8-4b01-8fd3-d04162534cc5" containerID="f6580ac2f69b72a0237910eae5666bf275cdc18ad119c624145bbb52cebe8b55" exitCode=1 May 11 20:50:43.615235 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:43.615194 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" event={"ID":"3d7be993-4ba8-4b01-8fd3-d04162534cc5","Type":"ContainerStarted","Data":"32f4a26be5f85605bc174a45ba0d64a32fe5fc5a6bf79bb2b55a19d6b7909ea4"} May 11 20:50:43.615235 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:43.615212 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" event={"ID":"3d7be993-4ba8-4b01-8fd3-d04162534cc5","Type":"ContainerStarted","Data":"6c43909f32d71ec14531ce7941b82d03cf54e22eaf98b4c0079e8c75ea69a6c4"} May 11 20:50:43.615235 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:43.615220 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" event={"ID":"3d7be993-4ba8-4b01-8fd3-d04162534cc5","Type":"ContainerDied","Data":"f6580ac2f69b72a0237910eae5666bf275cdc18ad119c624145bbb52cebe8b55"} May 11 20:50:43.615235 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:43.615231 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" event={"ID":"3d7be993-4ba8-4b01-8fd3-d04162534cc5","Type":"ContainerStarted","Data":"f482429094c9cd648f25570121ed687b08d0fd380c9a05c96f4a691b051873b7"} May 11 20:50:43.616624 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:43.616605 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-k98kb" event={"ID":"dc7028e1-034b-4393-88d2-1dbb1e82cfe7","Type":"ContainerStarted","Data":"461083c16737b32997f05ff9a5e1e9a0c7c034649adfbbe8f710114232e54fe6"} May 11 20:50:43.618064 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:43.618030 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cv7k6" event={"ID":"dea6f3bb-bb99-4e25-8cf5-1aca4ea1ed96","Type":"ContainerStarted","Data":"9678a07b485a31e4157ff63972ca509f403dfb64d927ffc6786f682154238499"} May 11 20:50:43.619446 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:43.619425 2567 generic.go:358] "Generic (PLEG): container finished" podID="3eb3a067-139c-450e-b053-3f1a84abc363" containerID="29a53796bffd680026e9a239b2d61a93af9a692a9bc65e38517c70507314cd66" exitCode=0 May 11 20:50:43.619541 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:43.619482 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l8r5d" 
event={"ID":"3eb3a067-139c-450e-b053-3f1a84abc363","Type":"ContainerDied","Data":"29a53796bffd680026e9a239b2d61a93af9a692a9bc65e38517c70507314cd66"} May 11 20:50:43.621153 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:43.621127 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" event={"ID":"421ac0f8-3310-4cc2-a9bf-159e3293219a","Type":"ContainerStarted","Data":"37e557024a812de1a0037ed0b6ff392fb69e421ae9c1353e45d6c2dd60229ed9"} May 11 20:50:43.622559 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:43.622514 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5kw85" event={"ID":"b239754b-8d38-41b0-9290-744afb39226a","Type":"ContainerStarted","Data":"b848e975137758358313cf2ba010e49b509bbf32f762669f9bef6974b254548b"} May 11 20:50:43.624189 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:43.624166 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wdhpf" event={"ID":"8981c6f1-07ce-4ebe-9071-6caf7218306a","Type":"ContainerStarted","Data":"fff638077449cdb933cba0fc8c4de13133de5008338a4bf72b138beb375bcd6a"} May 11 20:50:43.628651 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:43.628613 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-x7k87" podStartSLOduration=3.741169376 podStartE2EDuration="21.628599989s" podCreationTimestamp="2026-05-11 20:50:22 +0000 UTC" firstStartedPulling="2026-05-11 20:50:25.008740965 +0000 UTC m=+3.118521391" lastFinishedPulling="2026-05-11 20:50:42.896171576 +0000 UTC m=+21.005952004" observedRunningTime="2026-05-11 20:50:43.627993238 +0000 UTC m=+21.737773684" watchObservedRunningTime="2026-05-11 20:50:43.628599989 +0000 UTC m=+21.738380435" May 11 20:50:43.645366 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:43.645324 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5kw85" podStartSLOduration=3.719443736 podStartE2EDuration="21.645310073s" podCreationTimestamp="2026-05-11 20:50:22 +0000 UTC" firstStartedPulling="2026-05-11 20:50:25.007490367 +0000 UTC m=+3.117270791" lastFinishedPulling="2026-05-11 20:50:42.933356705 +0000 UTC m=+21.043137128" observedRunningTime="2026-05-11 20:50:43.644375221 +0000 UTC m=+21.754155665" watchObservedRunningTime="2026-05-11 20:50:43.645310073 +0000 UTC m=+21.755090520" May 11 20:50:43.665651 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:43.665616 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-k98kb" podStartSLOduration=7.734983762 podStartE2EDuration="21.665606462s" podCreationTimestamp="2026-05-11 20:50:22 +0000 UTC" firstStartedPulling="2026-05-11 20:50:25.002243397 +0000 UTC m=+3.112023820" lastFinishedPulling="2026-05-11 20:50:38.932866096 +0000 UTC m=+17.042646520" observedRunningTime="2026-05-11 20:50:43.66542613 +0000 UTC m=+21.775206575" watchObservedRunningTime="2026-05-11 20:50:43.665606462 +0000 UTC m=+21.775386907" May 11 20:50:43.701249 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:43.701207 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-wdhpf" podStartSLOduration=5.454945692 podStartE2EDuration="18.701191502s" podCreationTimestamp="2026-05-11 20:50:25 +0000 UTC" firstStartedPulling="2026-05-11 20:50:25.686619327 +0000 UTC m=+3.796399757" lastFinishedPulling="2026-05-11 20:50:38.932865144 +0000 UTC m=+17.042645567" 
observedRunningTime="2026-05-11 20:50:43.700691966 +0000 UTC m=+21.810472437" watchObservedRunningTime="2026-05-11 20:50:43.701191502 +0000 UTC m=+21.810971950" May 11 20:50:43.714573 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:43.714530 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-cv7k6" podStartSLOduration=3.819152938 podStartE2EDuration="21.714519328s" podCreationTimestamp="2026-05-11 20:50:22 +0000 UTC" firstStartedPulling="2026-05-11 20:50:24.999123777 +0000 UTC m=+3.108904203" lastFinishedPulling="2026-05-11 20:50:42.894490163 +0000 UTC m=+21.004270593" observedRunningTime="2026-05-11 20:50:43.7142446 +0000 UTC m=+21.824025250" watchObservedRunningTime="2026-05-11 20:50:43.714519328 +0000 UTC m=+21.824299773" May 11 20:50:44.135048 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:44.134847 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" May 11 20:50:44.441647 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:44.441540 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-05-11T20:50:44.135042485Z","UUID":"b824d27e-bffa-43ae-9e11-7a4543b8c710","Handler":null,"Name":"","Endpoint":""} May 11 20:50:44.443190 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:44.443167 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 May 11 20:50:44.443322 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:44.443219 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock May 11 20:50:44.474768 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:44.474745 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:44.474898 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:44.474856 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9tgf" podUID="3a2d13ea-d235-437e-9668-e21aca93682a" May 11 20:50:44.475109 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:44.475087 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:44.475195 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:44.475175 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v9s7z" podUID="3800edc1-af00-418d-a5b8-d832cbe20fbf" May 11 20:50:44.630023 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:44.629993 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/ovn-acl-logging/0.log" May 11 20:50:44.630573 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:44.630423 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" event={"ID":"3d7be993-4ba8-4b01-8fd3-d04162534cc5","Type":"ContainerStarted","Data":"093ee1e600dfd043e7812b9ebdd8664f7d16e2566755be83a2874f290e0c9953"} May 11 20:50:44.630573 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:44.630453 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" event={"ID":"3d7be993-4ba8-4b01-8fd3-d04162534cc5","Type":"ContainerStarted","Data":"0ff3d905000e5d81838deff259fa971b7dafad710059a7b574b151c8a9514be8"} May 11 20:50:44.632294 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:44.632244 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4qb4b" event={"ID":"3270b7f8-5595-41cf-ba47-4115ee413da0","Type":"ContainerStarted","Data":"bbb693044bae81ab4f694283d9f66dd6ebd892ab46f73d6a8947fb58f528f700"} May 11 20:50:44.634296 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:44.634221 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" event={"ID":"421ac0f8-3310-4cc2-a9bf-159e3293219a","Type":"ContainerStarted","Data":"404115a78de0039344f80a6723fa86f10f1fd5ed7c088486b4bbaf33397a6093"} May 11 20:50:44.669764 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:44.669717 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-4qb4b" podStartSLOduration=5.181912766 podStartE2EDuration="22.669702654s" podCreationTimestamp="2026-05-11 20:50:22 +0000 UTC" firstStartedPulling="2026-05-11 20:50:25.008626429 +0000 UTC m=+3.118406855" lastFinishedPulling="2026-05-11 20:50:42.496416315 +0000 UTC m=+20.606196743" observedRunningTime="2026-05-11 20:50:44.669480608 +0000 UTC m=+22.779261052" watchObservedRunningTime="2026-05-11 20:50:44.669702654 +0000 UTC m=+22.779483102" May 11 20:50:45.637619 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:45.637583 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" event={"ID":"421ac0f8-3310-4cc2-a9bf-159e3293219a","Type":"ContainerStarted","Data":"13b88b74d7dfd23145baea4c60254e7f5fdbf9aec6fec5d4902993c3aef004ec"} May 11 20:50:45.654697 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:45.654648 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khp88" podStartSLOduration=3.389287794 podStartE2EDuration="23.654635289s" podCreationTimestamp="2026-05-11 20:50:22 +0000 UTC" firstStartedPulling="2026-05-11 20:50:25.008757644 +0000 UTC m=+3.118538078" lastFinishedPulling="2026-05-11 20:50:45.27410514 +0000 UTC m=+23.383885573" observedRunningTime="2026-05-11 20:50:45.65448278 +0000 UTC m=+23.764263222" watchObservedRunningTime="2026-05-11 20:50:45.654635289 +0000 UTC m=+23.764415735" May 11 20:50:46.475159 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:46.475119 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:46.475333 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:46.475119 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:46.475333 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:46.475255 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9s7z" podUID="3800edc1-af00-418d-a5b8-d832cbe20fbf" May 11 20:50:46.475333 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:46.475313 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9tgf" podUID="3a2d13ea-d235-437e-9668-e21aca93682a" May 11 20:50:46.643274 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:46.643243 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/ovn-acl-logging/0.log" May 11 20:50:46.643805 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:46.643626 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" event={"ID":"3d7be993-4ba8-4b01-8fd3-d04162534cc5","Type":"ContainerStarted","Data":"d5cf7457ff3ce4a731c1eca573f8fa340bdb6dfe750117e47e96fb58c5dcc307"} May 11 20:50:48.173122 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:48.172848 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-k98kb" May 11 20:50:48.173655 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:48.173636 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-k98kb" May 11 20:50:48.474742 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:48.474718 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:48.474889 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:48.474717 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:48.474889 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:48.474813 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9s7z" podUID="3800edc1-af00-418d-a5b8-d832cbe20fbf" May 11 20:50:48.474889 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:48.474877 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-m9tgf" podUID="3a2d13ea-d235-437e-9668-e21aca93682a" May 11 20:50:48.648259 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:48.648230 2567 generic.go:358] "Generic (PLEG): container finished" podID="3eb3a067-139c-450e-b053-3f1a84abc363" containerID="4c4ff53f60fee0c07876de510e7676be526b223f6f3c065bf99e0fd1eda6296a" exitCode=0 May 11 20:50:48.648422 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:48.648321 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l8r5d" event={"ID":"3eb3a067-139c-450e-b053-3f1a84abc363","Type":"ContainerDied","Data":"4c4ff53f60fee0c07876de510e7676be526b223f6f3c065bf99e0fd1eda6296a"} May 11 20:50:48.651268 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:48.651249 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/ovn-acl-logging/0.log" May 11 20:50:48.651613 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:48.651591 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" event={"ID":"3d7be993-4ba8-4b01-8fd3-d04162534cc5","Type":"ContainerStarted","Data":"88c09028b327de9a20ea4e70eda652c6aaceed6430c5cec28cdfb9df8ca1395e"} May 11 20:50:48.651838 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:48.651818 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:48.651981 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:48.651950 2567 scope.go:117] "RemoveContainer" containerID="f6580ac2f69b72a0237910eae5666bf275cdc18ad119c624145bbb52cebe8b55" May 11 20:50:48.667199 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:48.667180 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:49.210030 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:49.209997 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-k98kb" May 11 20:50:49.210590 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:49.210566 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-k98kb" May 11 20:50:49.657236 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:49.657209 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/ovn-acl-logging/0.log" May 11 20:50:49.657618 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:49.657589 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" event={"ID":"3d7be993-4ba8-4b01-8fd3-d04162534cc5","Type":"ContainerStarted","Data":"3197d17758ef35bd3d8cae62c7348fa0ca9ab9755f167bbdf34e2f88618bdeb9"} May 11 20:50:49.657840 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:49.657826 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 11 20:50:49.658087 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:49.658068 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:49.672784 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:49.672764 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:49.684791 ip-10-0-128-58 kubenswrapper[2567]: I0511 
20:50:49.684755 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" podStartSLOduration=9.764166459 podStartE2EDuration="27.684743934s" podCreationTimestamp="2026-05-11 20:50:22 +0000 UTC" firstStartedPulling="2026-05-11 20:50:25.003977508 +0000 UTC m=+3.113757932" lastFinishedPulling="2026-05-11 20:50:42.924554984 +0000 UTC m=+21.034335407" observedRunningTime="2026-05-11 20:50:49.684360545 +0000 UTC m=+27.794140992" watchObservedRunningTime="2026-05-11 20:50:49.684743934 +0000 UTC m=+27.794524379" May 11 20:50:50.016955 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:50.016752 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-m9tgf"] May 11 20:50:50.017111 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:50.017012 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:50.017111 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:50.017087 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9tgf" podUID="3a2d13ea-d235-437e-9668-e21aca93682a" May 11 20:50:50.019779 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:50.019759 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-v9s7z"] May 11 20:50:50.019888 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:50.019844 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:50.019953 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:50.019915 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9s7z" podUID="3800edc1-af00-418d-a5b8-d832cbe20fbf" May 11 20:50:50.505674 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:50.505646 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:50:50.660763 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:50.660730 2567 generic.go:358] "Generic (PLEG): container finished" podID="3eb3a067-139c-450e-b053-3f1a84abc363" containerID="547ae4601342b3edfd548a64cc76e307cce61c8a4b1486ce6eadbaa0ff10050b" exitCode=0 May 11 20:50:50.660915 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:50.660815 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l8r5d" event={"ID":"3eb3a067-139c-450e-b053-3f1a84abc363","Type":"ContainerDied","Data":"547ae4601342b3edfd548a64cc76e307cce61c8a4b1486ce6eadbaa0ff10050b"} May 11 20:50:51.475563 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:51.475534 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:51.475743 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:51.475534 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:51.475743 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:51.475650 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9tgf" podUID="3a2d13ea-d235-437e-9668-e21aca93682a" May 11 20:50:51.475743 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:51.475702 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9s7z" podUID="3800edc1-af00-418d-a5b8-d832cbe20fbf" May 11 20:50:52.668000 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:52.667951 2567 generic.go:358] "Generic (PLEG): container finished" podID="3eb3a067-139c-450e-b053-3f1a84abc363" containerID="be61e652d4184ab5dbbf18ba20de20f213f7eb19cefd386ebe03991a16634e74" exitCode=0 May 11 20:50:52.668000 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:52.667995 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l8r5d" event={"ID":"3eb3a067-139c-450e-b053-3f1a84abc363","Type":"ContainerDied","Data":"be61e652d4184ab5dbbf18ba20de20f213f7eb19cefd386ebe03991a16634e74"} May 11 20:50:53.475219 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:53.474951 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:53.475385 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:53.475011 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:53.475385 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:53.475319 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9tgf" podUID="3a2d13ea-d235-437e-9668-e21aca93682a" May 11 20:50:53.475514 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:53.475423 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9s7z" podUID="3800edc1-af00-418d-a5b8-d832cbe20fbf" May 11 20:50:55.474571 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.474541 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:55.474956 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.474540 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:55.474956 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:55.474651 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v9s7z" podUID="3800edc1-af00-418d-a5b8-d832cbe20fbf" May 11 20:50:55.474956 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:55.474740 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m9tgf" podUID="3a2d13ea-d235-437e-9668-e21aca93682a" May 11 20:50:55.648351 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.648320 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-58.ec2.internal" event="NodeReady" May 11 20:50:55.648571 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.648556 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready" May 11 20:50:55.700151 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.700115 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-695f55c9c8-p9zz5"] May 11 20:50:55.726289 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.726193 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-695f55c9c8-p9zz5"] May 11 20:50:55.726425 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.726371 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:55.728981 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.728934 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" May 11 20:50:55.729190 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.729084 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6vnrf\"" May 11 20:50:55.729190 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.729103 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" May 11 20:50:55.729527 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.729500 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" May 11 20:50:55.747791 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.747095 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" May 11 20:50:55.751227 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.751203 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-l8f8d"] May 11 20:50:55.770245 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.770222 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-l8f8d"] May 11 20:50:55.770356 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.770342 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l8f8d" May 11 20:50:55.772925 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.772898 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" May 11 20:50:55.788020 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.788000 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6l7gb"] May 11 20:50:55.832174 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.832130 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6l7gb"] May 11 20:50:55.832174 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.832165 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-r2pks"] May 11 20:50:55.832379 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.832292 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-6l7gb" May 11 20:50:55.835215 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.835196 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" May 11 20:50:55.835323 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.835195 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" May 11 20:50:55.835834 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.835791 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wsq67\"" May 11 20:50:55.846248 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.846230 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r2pks"] May 11 20:50:55.846339 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.846327 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r2pks" May 11 20:50:55.848476 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.848456 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-tls\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:55.848578 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.848500 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-certificates\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:55.848638 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.848574 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-trusted-ca\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:55.848638 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.848605 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-image-registry-private-configuration\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:55.848719 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.848634 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-installation-pull-secrets\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:55.848719 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.848707 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-ca-trust-extracted\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:55.848806 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.848740 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpfnq\" (UniqueName: \"kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-kube-api-access-hpfnq\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:55.848806 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.848771 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-bound-sa-token\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:55.849081 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.849065 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" May 11 20:50:55.849195 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.849093 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" May 11 20:50:55.849195 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.849127 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qhtb5\"" May 11 20:50:55.849334 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.849290 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" May 11 20:50:55.949676 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.949649 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-bound-sa-token\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:55.949862 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.949688 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f1b8e168-86e5-4ad6-b105-311a1c00b2ea-dbus\") pod \"global-pull-secret-syncer-l8f8d\" (UID: \"f1b8e168-86e5-4ad6-b105-311a1c00b2ea\") " pod="kube-system/global-pull-secret-syncer-l8f8d" May 11 20:50:55.949862 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.949710 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-tls\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:55.949862 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.949734 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/f1b8e168-86e5-4ad6-b105-311a1c00b2ea-original-pull-secret\") pod \"global-pull-secret-syncer-l8f8d\" (UID: \"f1b8e168-86e5-4ad6-b105-311a1c00b2ea\") " pod="kube-system/global-pull-secret-syncer-l8f8d" May 11 20:50:55.949862 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.949766 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f1b8e168-86e5-4ad6-b105-311a1c00b2ea-kubelet-config\") pod \"global-pull-secret-syncer-l8f8d\" (UID: \"f1b8e168-86e5-4ad6-b105-311a1c00b2ea\") " pod="kube-system/global-pull-secret-syncer-l8f8d" May 11 20:50:55.949862 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.949796 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-certificates\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:55.949862 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.949817 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b7kj\" (UniqueName: \"kubernetes.io/projected/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-kube-api-access-8b7kj\") pod \"dns-default-6l7gb\" (UID: \"c72e4a76-101f-44bb-abd9-0c5f9b123dfc\") " pod="openshift-dns/dns-default-6l7gb" May 11 20:50:55.949862 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:55.949845 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found May 11 20:50:55.949862 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:55.949866 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-695f55c9c8-p9zz5: secret "image-registry-tls" not found May 11 20:50:55.950296 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.949929 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-trusted-ca\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:55.950296 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:55.949937 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-tls podName:bf01ba23-7a3f-4b4d-8233-4e0819e6bb94 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:56.449916083 +0000 UTC m=+34.559696508 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-tls") pod "image-registry-695f55c9c8-p9zz5" (UID: "bf01ba23-7a3f-4b4d-8233-4e0819e6bb94") : secret "image-registry-tls" not found May 11 20:50:55.950296 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.949990 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-image-registry-private-configuration\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:55.950296 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.950021 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-tmp-dir\") pod \"dns-default-6l7gb\" (UID: \"c72e4a76-101f-44bb-abd9-0c5f9b123dfc\") " pod="openshift-dns/dns-default-6l7gb" May 11 20:50:55.950296 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.950050 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-installation-pull-secrets\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:55.950296 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.950078 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/979c2460-155e-4ca9-97a7-69b6b59a3dcb-cert\") pod \"ingress-canary-r2pks\" (UID: \"979c2460-155e-4ca9-97a7-69b6b59a3dcb\") " pod="openshift-ingress-canary/ingress-canary-r2pks" May 11 20:50:55.950296 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.950102 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-config-volume\") pod \"dns-default-6l7gb\" (UID: \"c72e4a76-101f-44bb-abd9-0c5f9b123dfc\") " pod="openshift-dns/dns-default-6l7gb" May 11 20:50:55.950296 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.950140 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-ca-trust-extracted\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:55.950296 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.950169 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpfnq\" (UniqueName: \"kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-kube-api-access-hpfnq\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:55.950296 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.950198 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwqc8\" (UniqueName: 
\"kubernetes.io/projected/979c2460-155e-4ca9-97a7-69b6b59a3dcb-kube-api-access-qwqc8\") pod \"ingress-canary-r2pks\" (UID: \"979c2460-155e-4ca9-97a7-69b6b59a3dcb\") " pod="openshift-ingress-canary/ingress-canary-r2pks" May 11 20:50:55.950296 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.950221 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-metrics-tls\") pod \"dns-default-6l7gb\" (UID: \"c72e4a76-101f-44bb-abd9-0c5f9b123dfc\") " pod="openshift-dns/dns-default-6l7gb" May 11 20:50:55.950822 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.950426 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-certificates\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:55.950822 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.950601 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-ca-trust-extracted\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:55.950822 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.950670 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-trusted-ca\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:55.955258 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.955239 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-installation-pull-secrets\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:55.955258 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.955249 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-image-registry-private-configuration\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:55.958785 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.958763 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-bound-sa-token\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:55.958892 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:55.958873 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpfnq\" (UniqueName: \"kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-kube-api-access-hpfnq\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: 
\"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:56.050980 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:56.050932 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f1b8e168-86e5-4ad6-b105-311a1c00b2ea-dbus\") pod \"global-pull-secret-syncer-l8f8d\" (UID: \"f1b8e168-86e5-4ad6-b105-311a1c00b2ea\") " pod="kube-system/global-pull-secret-syncer-l8f8d" May 11 20:50:56.051129 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:56.051013 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1b8e168-86e5-4ad6-b105-311a1c00b2ea-original-pull-secret\") pod \"global-pull-secret-syncer-l8f8d\" (UID: \"f1b8e168-86e5-4ad6-b105-311a1c00b2ea\") " pod="kube-system/global-pull-secret-syncer-l8f8d" May 11 20:50:56.051129 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:56.051040 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f1b8e168-86e5-4ad6-b105-311a1c00b2ea-kubelet-config\") pod \"global-pull-secret-syncer-l8f8d\" (UID: \"f1b8e168-86e5-4ad6-b105-311a1c00b2ea\") " pod="kube-system/global-pull-secret-syncer-l8f8d" May 11 20:50:56.051129 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:56.051073 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8b7kj\" (UniqueName: \"kubernetes.io/projected/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-kube-api-access-8b7kj\") pod \"dns-default-6l7gb\" (UID: \"c72e4a76-101f-44bb-abd9-0c5f9b123dfc\") " pod="openshift-dns/dns-default-6l7gb" May 11 20:50:56.051284 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:56.051135 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f1b8e168-86e5-4ad6-b105-311a1c00b2ea-dbus\") pod \"global-pull-secret-syncer-l8f8d\" (UID: \"f1b8e168-86e5-4ad6-b105-311a1c00b2ea\") " pod="kube-system/global-pull-secret-syncer-l8f8d" May 11 20:50:56.051284 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:56.051148 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-tmp-dir\") pod \"dns-default-6l7gb\" (UID: \"c72e4a76-101f-44bb-abd9-0c5f9b123dfc\") " pod="openshift-dns/dns-default-6l7gb" May 11 20:50:56.051284 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:56.051148 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f1b8e168-86e5-4ad6-b105-311a1c00b2ea-kubelet-config\") pod \"global-pull-secret-syncer-l8f8d\" (UID: \"f1b8e168-86e5-4ad6-b105-311a1c00b2ea\") " pod="kube-system/global-pull-secret-syncer-l8f8d" May 11 20:50:56.051284 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:56.051188 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/979c2460-155e-4ca9-97a7-69b6b59a3dcb-cert\") pod \"ingress-canary-r2pks\" (UID: \"979c2460-155e-4ca9-97a7-69b6b59a3dcb\") " pod="openshift-ingress-canary/ingress-canary-r2pks" May 11 20:50:56.051284 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:56.051215 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-config-volume\") pod \"dns-default-6l7gb\" (UID: \"c72e4a76-101f-44bb-abd9-0c5f9b123dfc\") " pod="openshift-dns/dns-default-6l7gb" May 11 20:50:56.051284 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:56.051251 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwqc8\" (UniqueName: \"kubernetes.io/projected/979c2460-155e-4ca9-97a7-69b6b59a3dcb-kube-api-access-qwqc8\") pod \"ingress-canary-r2pks\" (UID: \"979c2460-155e-4ca9-97a7-69b6b59a3dcb\") " pod="openshift-ingress-canary/ingress-canary-r2pks" May 11 20:50:56.051284 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:56.051278 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-metrics-tls\") pod \"dns-default-6l7gb\" (UID: \"c72e4a76-101f-44bb-abd9-0c5f9b123dfc\") " pod="openshift-dns/dns-default-6l7gb" May 11 20:50:56.051622 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:56.051304 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found May 11 20:50:56.051622 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:56.051366 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found May 11 20:50:56.051622 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:56.051370 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/979c2460-155e-4ca9-97a7-69b6b59a3dcb-cert podName:979c2460-155e-4ca9-97a7-69b6b59a3dcb nodeName:}" failed. No retries permitted until 2026-05-11 20:50:56.551352494 +0000 UTC m=+34.661132918 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/979c2460-155e-4ca9-97a7-69b6b59a3dcb-cert") pod "ingress-canary-r2pks" (UID: "979c2460-155e-4ca9-97a7-69b6b59a3dcb") : secret "canary-serving-cert" not found May 11 20:50:56.051622 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:56.051412 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-metrics-tls podName:c72e4a76-101f-44bb-abd9-0c5f9b123dfc nodeName:}" failed. No retries permitted until 2026-05-11 20:50:56.551395659 +0000 UTC m=+34.661176088 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-metrics-tls") pod "dns-default-6l7gb" (UID: "c72e4a76-101f-44bb-abd9-0c5f9b123dfc") : secret "dns-default-metrics-tls" not found May 11 20:50:56.051622 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:56.051427 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-tmp-dir\") pod \"dns-default-6l7gb\" (UID: \"c72e4a76-101f-44bb-abd9-0c5f9b123dfc\") " pod="openshift-dns/dns-default-6l7gb" May 11 20:50:56.051879 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:56.051779 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-config-volume\") pod \"dns-default-6l7gb\" (UID: \"c72e4a76-101f-44bb-abd9-0c5f9b123dfc\") " pod="openshift-dns/dns-default-6l7gb" May 11 20:50:56.053345 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:56.053327 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1b8e168-86e5-4ad6-b105-311a1c00b2ea-original-pull-secret\") pod \"global-pull-secret-syncer-l8f8d\" (UID: \"f1b8e168-86e5-4ad6-b105-311a1c00b2ea\") " pod="kube-system/global-pull-secret-syncer-l8f8d" May 11 20:50:56.060728 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:56.060696 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b7kj\" (UniqueName: \"kubernetes.io/projected/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-kube-api-access-8b7kj\") pod \"dns-default-6l7gb\" (UID: \"c72e4a76-101f-44bb-abd9-0c5f9b123dfc\") " pod="openshift-dns/dns-default-6l7gb" May 11 20:50:56.060973 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:56.060943 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwqc8\" (UniqueName: \"kubernetes.io/projected/979c2460-155e-4ca9-97a7-69b6b59a3dcb-kube-api-access-qwqc8\") pod \"ingress-canary-r2pks\" (UID: \"979c2460-155e-4ca9-97a7-69b6b59a3dcb\") " pod="openshift-ingress-canary/ingress-canary-r2pks" May 11 20:50:56.079785 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:56.079765 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-l8f8d" May 11 20:50:56.151857 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:56.151821 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbq6v\" (UniqueName: \"kubernetes.io/projected/3a2d13ea-d235-437e-9668-e21aca93682a-kube-api-access-cbq6v\") pod \"network-check-target-m9tgf\" (UID: \"3a2d13ea-d235-437e-9668-e21aca93682a\") " pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:56.152022 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:56.152008 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 11 20:50:56.152092 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:56.152031 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 11 20:50:56.152092 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:56.152044 2567 projected.go:194] Error preparing data for projected volume kube-api-access-cbq6v for pod openshift-network-diagnostics/network-check-target-m9tgf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:56.152172 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:56.152107 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a2d13ea-d235-437e-9668-e21aca93682a-kube-api-access-cbq6v podName:3a2d13ea-d235-437e-9668-e21aca93682a nodeName:}" failed. No retries permitted until 2026-05-11 20:51:28.152092457 +0000 UTC m=+66.261872881 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cbq6v" (UniqueName: "kubernetes.io/projected/3a2d13ea-d235-437e-9668-e21aca93682a-kube-api-access-cbq6v") pod "network-check-target-m9tgf" (UID: "3a2d13ea-d235-437e-9668-e21aca93682a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:56.252799 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:56.252768 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3800edc1-af00-418d-a5b8-d832cbe20fbf-metrics-certs\") pod \"network-metrics-daemon-v9s7z\" (UID: \"3800edc1-af00-418d-a5b8-d832cbe20fbf\") " pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:56.253014 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:56.252943 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:56.253078 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:56.253034 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3800edc1-af00-418d-a5b8-d832cbe20fbf-metrics-certs podName:3800edc1-af00-418d-a5b8-d832cbe20fbf nodeName:}" failed. No retries permitted until 2026-05-11 20:51:28.253018707 +0000 UTC m=+66.362799130 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3800edc1-af00-418d-a5b8-d832cbe20fbf-metrics-certs") pod "network-metrics-daemon-v9s7z" (UID: "3800edc1-af00-418d-a5b8-d832cbe20fbf") : object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:56.453743 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:56.453657 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-tls\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:56.453888 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:56.453820 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found May 11 20:50:56.453888 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:56.453838 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-695f55c9c8-p9zz5: secret "image-registry-tls" not found May 11 20:50:56.453995 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:56.453906 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-tls podName:bf01ba23-7a3f-4b4d-8233-4e0819e6bb94 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:57.453886742 +0000 UTC m=+35.563667164 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-tls") pod "image-registry-695f55c9c8-p9zz5" (UID: "bf01ba23-7a3f-4b4d-8233-4e0819e6bb94") : secret "image-registry-tls" not found May 11 20:50:56.554583 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:56.554554 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/979c2460-155e-4ca9-97a7-69b6b59a3dcb-cert\") pod \"ingress-canary-r2pks\" (UID: \"979c2460-155e-4ca9-97a7-69b6b59a3dcb\") " pod="openshift-ingress-canary/ingress-canary-r2pks" May 11 20:50:56.555025 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:56.554592 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-metrics-tls\") pod \"dns-default-6l7gb\" (UID: \"c72e4a76-101f-44bb-abd9-0c5f9b123dfc\") " pod="openshift-dns/dns-default-6l7gb" May 11 20:50:56.555025 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:56.554725 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found May 11 20:50:56.555025 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:56.554792 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found May 11 20:50:56.555025 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:56.554794 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/979c2460-155e-4ca9-97a7-69b6b59a3dcb-cert podName:979c2460-155e-4ca9-97a7-69b6b59a3dcb nodeName:}" failed. No retries permitted until 2026-05-11 20:50:57.554774367 +0000 UTC m=+35.664554802 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/979c2460-155e-4ca9-97a7-69b6b59a3dcb-cert") pod "ingress-canary-r2pks" (UID: "979c2460-155e-4ca9-97a7-69b6b59a3dcb") : secret "canary-serving-cert" not found May 11 20:50:56.555025 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:56.554851 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-metrics-tls podName:c72e4a76-101f-44bb-abd9-0c5f9b123dfc nodeName:}" failed. No retries permitted until 2026-05-11 20:50:57.554835198 +0000 UTC m=+35.664615621 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-metrics-tls") pod "dns-default-6l7gb" (UID: "c72e4a76-101f-44bb-abd9-0c5f9b123dfc") : secret "dns-default-metrics-tls" not found May 11 20:50:57.462083 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:57.462048 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-tls\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:57.462269 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:57.462212 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found May 11 20:50:57.462269 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:57.462229 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-695f55c9c8-p9zz5: secret "image-registry-tls" not found May 11 20:50:57.462366 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:57.462284 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-tls podName:bf01ba23-7a3f-4b4d-8233-4e0819e6bb94 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:59.462265965 +0000 UTC m=+37.572046388 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-tls") pod "image-registry-695f55c9c8-p9zz5" (UID: "bf01ba23-7a3f-4b4d-8233-4e0819e6bb94") : secret "image-registry-tls" not found May 11 20:50:57.475277 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:57.475248 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:50:57.475277 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:57.475264 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9s7z" May 11 20:50:57.479208 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:57.479186 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" May 11 20:50:57.479318 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:57.479244 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" May 11 20:50:57.479318 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:57.479255 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mdvls\"" May 11 20:50:57.479318 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:57.479247 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kqkzb\"" May 11 20:50:57.479474 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:57.479251 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" May 11 20:50:57.563167 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:57.563136 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/979c2460-155e-4ca9-97a7-69b6b59a3dcb-cert\") pod \"ingress-canary-r2pks\" (UID: \"979c2460-155e-4ca9-97a7-69b6b59a3dcb\") " pod="openshift-ingress-canary/ingress-canary-r2pks" May 11 20:50:57.563578 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:57.563193 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-metrics-tls\") pod \"dns-default-6l7gb\" (UID: \"c72e4a76-101f-44bb-abd9-0c5f9b123dfc\") " pod="openshift-dns/dns-default-6l7gb" May 11 20:50:57.563578 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:57.563298 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found May 11 20:50:57.563578 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:57.563335 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found May 11 20:50:57.563578 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:57.563358 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/979c2460-155e-4ca9-97a7-69b6b59a3dcb-cert podName:979c2460-155e-4ca9-97a7-69b6b59a3dcb nodeName:}" failed. No retries permitted until 2026-05-11 20:50:59.563343675 +0000 UTC m=+37.673124117 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/979c2460-155e-4ca9-97a7-69b6b59a3dcb-cert") pod "ingress-canary-r2pks" (UID: "979c2460-155e-4ca9-97a7-69b6b59a3dcb") : secret "canary-serving-cert" not found May 11 20:50:57.563578 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:57.563382 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-metrics-tls podName:c72e4a76-101f-44bb-abd9-0c5f9b123dfc nodeName:}" failed. No retries permitted until 2026-05-11 20:50:59.563369238 +0000 UTC m=+37.673149661 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-metrics-tls") pod "dns-default-6l7gb" (UID: "c72e4a76-101f-44bb-abd9-0c5f9b123dfc") : secret "dns-default-metrics-tls" not found May 11 20:50:58.320242 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:58.320214 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-l8f8d"] May 11 20:50:58.389439 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:50:58.389405 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1b8e168_86e5_4ad6_b105_311a1c00b2ea.slice/crio-2912d2e0958971185af9924cc4350558632636107ec9ca1d981057de284d4d50 WatchSource:0}: Error finding container 2912d2e0958971185af9924cc4350558632636107ec9ca1d981057de284d4d50: Status 404 returned error can't find the container with id 2912d2e0958971185af9924cc4350558632636107ec9ca1d981057de284d4d50 May 11 20:50:58.681514 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:58.681317 2567 generic.go:358] "Generic (PLEG): container finished" podID="3eb3a067-139c-450e-b053-3f1a84abc363" containerID="e9a5315d14e07daf2e3ca5749f195027d991add4ea41076975c78a44b38cdec4" exitCode=0 May 11 20:50:58.681514 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:58.681399 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l8r5d" event={"ID":"3eb3a067-139c-450e-b053-3f1a84abc363","Type":"ContainerDied","Data":"e9a5315d14e07daf2e3ca5749f195027d991add4ea41076975c78a44b38cdec4"} May 11 20:50:58.682526 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:58.682501 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-l8f8d" event={"ID":"f1b8e168-86e5-4ad6-b105-311a1c00b2ea","Type":"ContainerStarted","Data":"2912d2e0958971185af9924cc4350558632636107ec9ca1d981057de284d4d50"} May 11 20:50:59.478533 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:59.478484 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-tls\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:50:59.478723 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:59.478619 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found May 11 20:50:59.478723 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:59.478638 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-695f55c9c8-p9zz5: secret "image-registry-tls" not found May 11 20:50:59.478723 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:59.478705 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-tls podName:bf01ba23-7a3f-4b4d-8233-4e0819e6bb94 nodeName:}" failed. No retries permitted until 2026-05-11 20:51:03.478685602 +0000 UTC m=+41.588466030 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-tls") pod "image-registry-695f55c9c8-p9zz5" (UID: "bf01ba23-7a3f-4b4d-8233-4e0819e6bb94") : secret "image-registry-tls" not found May 11 20:50:59.579669 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:59.579637 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/979c2460-155e-4ca9-97a7-69b6b59a3dcb-cert\") pod \"ingress-canary-r2pks\" (UID: \"979c2460-155e-4ca9-97a7-69b6b59a3dcb\") " pod="openshift-ingress-canary/ingress-canary-r2pks" May 11 20:50:59.579795 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:59.579691 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-metrics-tls\") pod \"dns-default-6l7gb\" (UID: \"c72e4a76-101f-44bb-abd9-0c5f9b123dfc\") " pod="openshift-dns/dns-default-6l7gb" May 11 20:50:59.579795 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:59.579764 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found May 11 20:50:59.579889 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:59.579816 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/979c2460-155e-4ca9-97a7-69b6b59a3dcb-cert podName:979c2460-155e-4ca9-97a7-69b6b59a3dcb nodeName:}" failed. No retries permitted until 2026-05-11 20:51:03.579799703 +0000 UTC m=+41.689580137 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/979c2460-155e-4ca9-97a7-69b6b59a3dcb-cert") pod "ingress-canary-r2pks" (UID: "979c2460-155e-4ca9-97a7-69b6b59a3dcb") : secret "canary-serving-cert" not found May 11 20:50:59.579889 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:59.579878 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found May 11 20:50:59.580013 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:50:59.579939 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-metrics-tls podName:c72e4a76-101f-44bb-abd9-0c5f9b123dfc nodeName:}" failed. No retries permitted until 2026-05-11 20:51:03.579919901 +0000 UTC m=+41.689700324 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-metrics-tls") pod "dns-default-6l7gb" (UID: "c72e4a76-101f-44bb-abd9-0c5f9b123dfc") : secret "dns-default-metrics-tls" not found May 11 20:50:59.687656 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:59.687625 2567 generic.go:358] "Generic (PLEG): container finished" podID="3eb3a067-139c-450e-b053-3f1a84abc363" containerID="5ff13ed31cdcdaa8d3029a08598658bdb50fee97b387a01032415894c9e7b24e" exitCode=0 May 11 20:50:59.688178 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:50:59.687712 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l8r5d" event={"ID":"3eb3a067-139c-450e-b053-3f1a84abc363","Type":"ContainerDied","Data":"5ff13ed31cdcdaa8d3029a08598658bdb50fee97b387a01032415894c9e7b24e"} May 11 20:51:00.647323 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:00.647291 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-5f598d4645-pnbh9"] May 11 20:51:00.649985 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:00.649948 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-5f598d4645-pnbh9" May 11 20:51:00.652896 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:00.652820 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" May 11 20:51:00.654168 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:00.654145 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-gtwrv\"" May 11 20:51:00.654168 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:00.654160 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" May 11 20:51:00.663839 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:00.662447 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-5f598d4645-pnbh9"] May 11 20:51:00.693438 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:00.693408 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l8r5d" event={"ID":"3eb3a067-139c-450e-b053-3f1a84abc363","Type":"ContainerStarted","Data":"3efe9842aac3e31d6b97cadad4cc2cdc2b6fd334d44a7f4601b105b9b0f39238"} May 11 20:51:00.716944 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:00.716899 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-l8r5d" podStartSLOduration=5.299882354 podStartE2EDuration="38.716883994s" podCreationTimestamp="2026-05-11 20:50:22 +0000 UTC" firstStartedPulling="2026-05-11 20:50:24.998649962 +0000 UTC m=+3.108430399" lastFinishedPulling="2026-05-11 20:50:58.415651616 +0000 UTC m=+36.525432039" observedRunningTime="2026-05-11 20:51:00.715374311 +0000 UTC m=+38.825154757" watchObservedRunningTime="2026-05-11 20:51:00.716883994 +0000 UTC m=+38.826664439" May 11 20:51:00.789360 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:00.789330 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nptct\" (UniqueName: \"kubernetes.io/projected/b6c1bfc8-86ab-47ad-ac81-740207bf8f06-kube-api-access-nptct\") pod 
\"migrator-5f598d4645-pnbh9\" (UID: \"b6c1bfc8-86ab-47ad-ac81-740207bf8f06\") " pod="openshift-kube-storage-version-migrator/migrator-5f598d4645-pnbh9" May 11 20:51:00.890144 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:00.890095 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nptct\" (UniqueName: \"kubernetes.io/projected/b6c1bfc8-86ab-47ad-ac81-740207bf8f06-kube-api-access-nptct\") pod \"migrator-5f598d4645-pnbh9\" (UID: \"b6c1bfc8-86ab-47ad-ac81-740207bf8f06\") " pod="openshift-kube-storage-version-migrator/migrator-5f598d4645-pnbh9" May 11 20:51:00.901387 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:00.901328 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nptct\" (UniqueName: \"kubernetes.io/projected/b6c1bfc8-86ab-47ad-ac81-740207bf8f06-kube-api-access-nptct\") pod \"migrator-5f598d4645-pnbh9\" (UID: \"b6c1bfc8-86ab-47ad-ac81-740207bf8f06\") " pod="openshift-kube-storage-version-migrator/migrator-5f598d4645-pnbh9" May 11 20:51:00.959303 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:00.959261 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-5f598d4645-pnbh9" May 11 20:51:01.156067 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:01.155987 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wdhpf_8981c6f1-07ce-4ebe-9071-6caf7218306a/dns-node-resolver/0.log" May 11 20:51:01.955622 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:01.955596 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cv7k6_dea6f3bb-bb99-4e25-8cf5-1aca4ea1ed96/node-ca/0.log" May 11 20:51:02.150774 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:02.150746 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-5f598d4645-pnbh9"] May 11 20:51:02.155888 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:51:02.155860 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6c1bfc8_86ab_47ad_ac81_740207bf8f06.slice/crio-3bad74020c9293003af9a414ffa016527fa374a8a920260ff7eeb015c8128e9c WatchSource:0}: Error finding container 3bad74020c9293003af9a414ffa016527fa374a8a920260ff7eeb015c8128e9c: Status 404 returned error can't find the container with id 3bad74020c9293003af9a414ffa016527fa374a8a920260ff7eeb015c8128e9c May 11 20:51:02.699424 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:02.699092 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-l8f8d" event={"ID":"f1b8e168-86e5-4ad6-b105-311a1c00b2ea","Type":"ContainerStarted","Data":"b09fef61affcf4daadcfe52731e9857ba39c5ef64f05f86d5161bdac3a5268d9"} May 11 20:51:02.700311 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:02.700285 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-5f598d4645-pnbh9" event={"ID":"b6c1bfc8-86ab-47ad-ac81-740207bf8f06","Type":"ContainerStarted","Data":"3bad74020c9293003af9a414ffa016527fa374a8a920260ff7eeb015c8128e9c"} May 11 20:51:02.714754 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:02.714713 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-l8f8d" podStartSLOduration=4.01648871 podStartE2EDuration="7.714699784s" podCreationTimestamp="2026-05-11 20:50:55 +0000 UTC" 
firstStartedPulling="2026-05-11 20:50:58.39349566 +0000 UTC m=+36.503276084" lastFinishedPulling="2026-05-11 20:51:02.091706732 +0000 UTC m=+40.201487158" observedRunningTime="2026-05-11 20:51:02.714228967 +0000 UTC m=+40.824009414" watchObservedRunningTime="2026-05-11 20:51:02.714699784 +0000 UTC m=+40.824480228" May 11 20:51:02.995305 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:02.995223 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-7fd896b478-6ddlr"] May 11 20:51:02.998078 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:02.998063 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-7fd896b478-6ddlr" May 11 20:51:03.000712 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:03.000681 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" May 11 20:51:03.000809 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:03.000721 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" May 11 20:51:03.001014 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:03.000998 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-4jbdf\"" May 11 20:51:03.001067 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:03.001027 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" May 11 20:51:03.001842 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:03.001829 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" May 11 20:51:03.003774 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:03.003756 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b49f3e4b-545a-4b2f-b507-e26d62e4bac8-signing-key\") pod \"service-ca-7fd896b478-6ddlr\" (UID: \"b49f3e4b-545a-4b2f-b507-e26d62e4bac8\") " pod="openshift-service-ca/service-ca-7fd896b478-6ddlr" May 11 20:51:03.003843 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:03.003798 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tbz5\" (UniqueName: \"kubernetes.io/projected/b49f3e4b-545a-4b2f-b507-e26d62e4bac8-kube-api-access-5tbz5\") pod \"service-ca-7fd896b478-6ddlr\" (UID: \"b49f3e4b-545a-4b2f-b507-e26d62e4bac8\") " pod="openshift-service-ca/service-ca-7fd896b478-6ddlr" May 11 20:51:03.004040 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:03.003987 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b49f3e4b-545a-4b2f-b507-e26d62e4bac8-signing-cabundle\") pod \"service-ca-7fd896b478-6ddlr\" (UID: \"b49f3e4b-545a-4b2f-b507-e26d62e4bac8\") " pod="openshift-service-ca/service-ca-7fd896b478-6ddlr" May 11 20:51:03.005733 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:03.005687 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-7fd896b478-6ddlr"] May 11 20:51:03.104988 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:03.104941 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b49f3e4b-545a-4b2f-b507-e26d62e4bac8-signing-key\") pod 
\"service-ca-7fd896b478-6ddlr\" (UID: \"b49f3e4b-545a-4b2f-b507-e26d62e4bac8\") " pod="openshift-service-ca/service-ca-7fd896b478-6ddlr" May 11 20:51:03.105128 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:03.105008 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5tbz5\" (UniqueName: \"kubernetes.io/projected/b49f3e4b-545a-4b2f-b507-e26d62e4bac8-kube-api-access-5tbz5\") pod \"service-ca-7fd896b478-6ddlr\" (UID: \"b49f3e4b-545a-4b2f-b507-e26d62e4bac8\") " pod="openshift-service-ca/service-ca-7fd896b478-6ddlr" May 11 20:51:03.105128 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:03.105071 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b49f3e4b-545a-4b2f-b507-e26d62e4bac8-signing-cabundle\") pod \"service-ca-7fd896b478-6ddlr\" (UID: \"b49f3e4b-545a-4b2f-b507-e26d62e4bac8\") " pod="openshift-service-ca/service-ca-7fd896b478-6ddlr" May 11 20:51:03.105786 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:03.105760 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b49f3e4b-545a-4b2f-b507-e26d62e4bac8-signing-cabundle\") pod \"service-ca-7fd896b478-6ddlr\" (UID: \"b49f3e4b-545a-4b2f-b507-e26d62e4bac8\") " pod="openshift-service-ca/service-ca-7fd896b478-6ddlr" May 11 20:51:03.107501 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:03.107480 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b49f3e4b-545a-4b2f-b507-e26d62e4bac8-signing-key\") pod \"service-ca-7fd896b478-6ddlr\" (UID: \"b49f3e4b-545a-4b2f-b507-e26d62e4bac8\") " pod="openshift-service-ca/service-ca-7fd896b478-6ddlr" May 11 20:51:03.114028 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:03.114005 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tbz5\" (UniqueName: \"kubernetes.io/projected/b49f3e4b-545a-4b2f-b507-e26d62e4bac8-kube-api-access-5tbz5\") pod \"service-ca-7fd896b478-6ddlr\" (UID: \"b49f3e4b-545a-4b2f-b507-e26d62e4bac8\") " pod="openshift-service-ca/service-ca-7fd896b478-6ddlr" May 11 20:51:03.307778 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:03.307750 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-7fd896b478-6ddlr" May 11 20:51:03.432835 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:03.432792 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-7fd896b478-6ddlr"] May 11 20:51:03.436246 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:51:03.436222 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb49f3e4b_545a_4b2f_b507_e26d62e4bac8.slice/crio-da7fce3fe62477c8d37b43c5c79e597daaf451113123686a9a5ae24fa58f08be WatchSource:0}: Error finding container da7fce3fe62477c8d37b43c5c79e597daaf451113123686a9a5ae24fa58f08be: Status 404 returned error can't find the container with id da7fce3fe62477c8d37b43c5c79e597daaf451113123686a9a5ae24fa58f08be May 11 20:51:03.507112 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:03.507083 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-tls\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:51:03.507253 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:51:03.507233 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found May 11 20:51:03.507314 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:51:03.507259 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-695f55c9c8-p9zz5: secret "image-registry-tls" not found May 11 20:51:03.507314 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:51:03.507313 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-tls podName:bf01ba23-7a3f-4b4d-8233-4e0819e6bb94 nodeName:}" failed. No retries permitted until 2026-05-11 20:51:11.507297059 +0000 UTC m=+49.617077489 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-tls") pod "image-registry-695f55c9c8-p9zz5" (UID: "bf01ba23-7a3f-4b4d-8233-4e0819e6bb94") : secret "image-registry-tls" not found May 11 20:51:03.607761 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:03.607685 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/979c2460-155e-4ca9-97a7-69b6b59a3dcb-cert\") pod \"ingress-canary-r2pks\" (UID: \"979c2460-155e-4ca9-97a7-69b6b59a3dcb\") " pod="openshift-ingress-canary/ingress-canary-r2pks" May 11 20:51:03.607761 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:03.607724 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-metrics-tls\") pod \"dns-default-6l7gb\" (UID: \"c72e4a76-101f-44bb-abd9-0c5f9b123dfc\") " pod="openshift-dns/dns-default-6l7gb" May 11 20:51:03.607981 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:51:03.607825 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found May 11 20:51:03.607981 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:51:03.607830 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found May 11 20:51:03.607981 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:51:03.607880 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-metrics-tls podName:c72e4a76-101f-44bb-abd9-0c5f9b123dfc nodeName:}" failed. No retries permitted until 2026-05-11 20:51:11.607864214 +0000 UTC m=+49.717644639 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-metrics-tls") pod "dns-default-6l7gb" (UID: "c72e4a76-101f-44bb-abd9-0c5f9b123dfc") : secret "dns-default-metrics-tls" not found May 11 20:51:03.607981 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:51:03.607918 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/979c2460-155e-4ca9-97a7-69b6b59a3dcb-cert podName:979c2460-155e-4ca9-97a7-69b6b59a3dcb nodeName:}" failed. No retries permitted until 2026-05-11 20:51:11.607898941 +0000 UTC m=+49.717679375 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/979c2460-155e-4ca9-97a7-69b6b59a3dcb-cert") pod "ingress-canary-r2pks" (UID: "979c2460-155e-4ca9-97a7-69b6b59a3dcb") : secret "canary-serving-cert" not found May 11 20:51:03.703703 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:03.703673 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-5f598d4645-pnbh9" event={"ID":"b6c1bfc8-86ab-47ad-ac81-740207bf8f06","Type":"ContainerStarted","Data":"8fe2d75ee6e1300a1df3d84d515a92b11f7bed5dd7b4231a1b4d64702bacebc6"} May 11 20:51:03.703856 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:03.703712 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-5f598d4645-pnbh9" event={"ID":"b6c1bfc8-86ab-47ad-ac81-740207bf8f06","Type":"ContainerStarted","Data":"2c7ebabebe8d50a4cca9c4dfbf6bf5dd21284199f30b00a3ef3b02305bb7e607"} May 11 20:51:03.704971 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:03.704928 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-7fd896b478-6ddlr" event={"ID":"b49f3e4b-545a-4b2f-b507-e26d62e4bac8","Type":"ContainerStarted","Data":"da7fce3fe62477c8d37b43c5c79e597daaf451113123686a9a5ae24fa58f08be"} May 11 20:51:03.720640 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:03.720598 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-5f598d4645-pnbh9" podStartSLOduration=2.563653453 podStartE2EDuration="3.720587153s" podCreationTimestamp="2026-05-11 20:51:00 +0000 UTC" firstStartedPulling="2026-05-11 20:51:02.158020091 +0000 UTC m=+40.267800517" lastFinishedPulling="2026-05-11 20:51:03.314953783 +0000 UTC m=+41.424734217" observedRunningTime="2026-05-11 20:51:03.719569178 +0000 UTC m=+41.829349636" watchObservedRunningTime="2026-05-11 20:51:03.720587153 +0000 UTC m=+41.830367597" May 11 20:51:06.711671 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:06.711639 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-7fd896b478-6ddlr" event={"ID":"b49f3e4b-545a-4b2f-b507-e26d62e4bac8","Type":"ContainerStarted","Data":"7f44c28a33078883ea419ecf2228880e13565432232016a149c4a2d3b4fc9d32"} May 11 20:51:06.728461 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:06.728420 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-7fd896b478-6ddlr" podStartSLOduration=2.494936412 podStartE2EDuration="4.728406398s" podCreationTimestamp="2026-05-11 20:51:02 +0000 UTC" firstStartedPulling="2026-05-11 20:51:03.438196815 +0000 UTC m=+41.547977237" lastFinishedPulling="2026-05-11 20:51:05.671666792 +0000 UTC m=+43.781447223" observedRunningTime="2026-05-11 20:51:06.72769042 +0000 UTC m=+44.837470879" watchObservedRunningTime="2026-05-11 20:51:06.728406398 +0000 UTC m=+44.838186842" May 11 20:51:11.564466 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:11.564431 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-tls\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:51:11.566728 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:11.566701 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-tls\") pod \"image-registry-695f55c9c8-p9zz5\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") " pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:51:11.652087 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:11.652062 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:51:11.664867 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:11.664842 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/979c2460-155e-4ca9-97a7-69b6b59a3dcb-cert\") pod \"ingress-canary-r2pks\" (UID: \"979c2460-155e-4ca9-97a7-69b6b59a3dcb\") " pod="openshift-ingress-canary/ingress-canary-r2pks" May 11 20:51:11.665005 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:11.664885 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-metrics-tls\") pod \"dns-default-6l7gb\" (UID: \"c72e4a76-101f-44bb-abd9-0c5f9b123dfc\") " pod="openshift-dns/dns-default-6l7gb" May 11 20:51:11.667272 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:11.667243 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c72e4a76-101f-44bb-abd9-0c5f9b123dfc-metrics-tls\") pod \"dns-default-6l7gb\" (UID: \"c72e4a76-101f-44bb-abd9-0c5f9b123dfc\") " pod="openshift-dns/dns-default-6l7gb" May 11 20:51:11.667359 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:11.667304 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/979c2460-155e-4ca9-97a7-69b6b59a3dcb-cert\") pod \"ingress-canary-r2pks\" (UID: \"979c2460-155e-4ca9-97a7-69b6b59a3dcb\") " pod="openshift-ingress-canary/ingress-canary-r2pks" May 11 20:51:11.742281 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:11.741779 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6l7gb" May 11 20:51:11.756314 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:11.756285 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r2pks" May 11 20:51:11.797662 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:11.797292 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-695f55c9c8-p9zz5"] May 11 20:51:11.807030 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:51:11.806997 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf01ba23_7a3f_4b4d_8233_4e0819e6bb94.slice/crio-3de984c7cee21f3cc43ea22ffa889b2e70d4206c2cd47e2c0e34102218b004f4 WatchSource:0}: Error finding container 3de984c7cee21f3cc43ea22ffa889b2e70d4206c2cd47e2c0e34102218b004f4: Status 404 returned error can't find the container with id 3de984c7cee21f3cc43ea22ffa889b2e70d4206c2cd47e2c0e34102218b004f4 May 11 20:51:11.890293 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:11.890240 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6l7gb"] May 11 20:51:11.897020 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:51:11.896992 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc72e4a76_101f_44bb_abd9_0c5f9b123dfc.slice/crio-ed6da40395e6e2e91e6b11f53a441eadcccc8f1049fdb9d7e5930634baea99c6 WatchSource:0}: Error finding container ed6da40395e6e2e91e6b11f53a441eadcccc8f1049fdb9d7e5930634baea99c6: Status 404 returned error can't find the container with id ed6da40395e6e2e91e6b11f53a441eadcccc8f1049fdb9d7e5930634baea99c6 May 11 20:51:11.903816 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:11.903793 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r2pks"] May 11 20:51:11.915418 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:51:11.915395 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod979c2460_155e_4ca9_97a7_69b6b59a3dcb.slice/crio-e8344146f6907e8dab244cca43191cac418495acea5c92c8b5bb9e118da05ee5 WatchSource:0}: Error finding container e8344146f6907e8dab244cca43191cac418495acea5c92c8b5bb9e118da05ee5: Status 404 returned error can't find the container with id e8344146f6907e8dab244cca43191cac418495acea5c92c8b5bb9e118da05ee5 May 11 20:51:12.725072 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:12.724997 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r2pks" event={"ID":"979c2460-155e-4ca9-97a7-69b6b59a3dcb","Type":"ContainerStarted","Data":"e8344146f6907e8dab244cca43191cac418495acea5c92c8b5bb9e118da05ee5"} May 11 20:51:12.726951 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:12.726927 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6l7gb" event={"ID":"c72e4a76-101f-44bb-abd9-0c5f9b123dfc","Type":"ContainerStarted","Data":"ed6da40395e6e2e91e6b11f53a441eadcccc8f1049fdb9d7e5930634baea99c6"} May 11 20:51:12.728467 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:12.728421 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" event={"ID":"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94","Type":"ContainerStarted","Data":"1e157bd64c67fc2e22233c72c8f839ba91f9d7b9651283c5ac5edf0dd8a68621"} May 11 20:51:12.728467 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:12.728448 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" 
event={"ID":"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94","Type":"ContainerStarted","Data":"3de984c7cee21f3cc43ea22ffa889b2e70d4206c2cd47e2c0e34102218b004f4"} May 11 20:51:12.728680 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:12.728641 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:51:12.747863 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:12.747822 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" podStartSLOduration=55.747799991 podStartE2EDuration="55.747799991s" podCreationTimestamp="2026-05-11 20:50:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 20:51:12.747690813 +0000 UTC m=+50.857471273" watchObservedRunningTime="2026-05-11 20:51:12.747799991 +0000 UTC m=+50.857580449" May 11 20:51:14.734825 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:14.734795 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6l7gb" event={"ID":"c72e4a76-101f-44bb-abd9-0c5f9b123dfc","Type":"ContainerStarted","Data":"1cbf7b941d57119465d5ff4aed99ffe0192dc6b151bdae5d6121882708a58c92"} May 11 20:51:14.736084 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:14.736061 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r2pks" event={"ID":"979c2460-155e-4ca9-97a7-69b6b59a3dcb","Type":"ContainerStarted","Data":"b1dac4127bd48711da4edd5ff308a4a9f1a1024e3125aeaf9bdfe169d53e1153"} May 11 20:51:14.753938 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:14.753868 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-r2pks" podStartSLOduration=17.18389483 podStartE2EDuration="19.753851676s" podCreationTimestamp="2026-05-11 20:50:55 +0000 UTC" firstStartedPulling="2026-05-11 20:51:11.918105573 +0000 UTC m=+50.027885996" lastFinishedPulling="2026-05-11 20:51:14.488062417 +0000 UTC m=+52.597842842" observedRunningTime="2026-05-11 20:51:14.752468044 +0000 UTC m=+52.862248489" watchObservedRunningTime="2026-05-11 20:51:14.753851676 +0000 UTC m=+52.863632125" May 11 20:51:15.740510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:15.740472 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6l7gb" event={"ID":"c72e4a76-101f-44bb-abd9-0c5f9b123dfc","Type":"ContainerStarted","Data":"6e036ee07137cb89de7658307c2d5cc53a800b97f955ec6b6b7a3901148b8b67"} May 11 20:51:15.757130 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:15.757086 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6l7gb" podStartSLOduration=18.175242346 podStartE2EDuration="20.757072831s" podCreationTimestamp="2026-05-11 20:50:55 +0000 UTC" firstStartedPulling="2026-05-11 20:51:11.898918307 +0000 UTC m=+50.008698730" lastFinishedPulling="2026-05-11 20:51:14.480748789 +0000 UTC m=+52.590529215" observedRunningTime="2026-05-11 20:51:15.756723519 +0000 UTC m=+53.866503994" watchObservedRunningTime="2026-05-11 20:51:15.757072831 +0000 UTC m=+53.866853277" May 11 20:51:16.743152 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:16.743116 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6l7gb" May 11 20:51:21.673507 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:21.673480 2567 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-svtmh" May 11 20:51:23.965974 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:23.965924 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4f94d7c4-zvc47"] May 11 20:51:23.970487 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:23.970457 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-677c659d98-2pgvh"] May 11 20:51:23.970618 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:23.970587 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4f94d7c4-zvc47" May 11 20:51:23.973265 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:23.973245 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677c659d98-2pgvh" May 11 20:51:23.974278 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:23.974253 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" May 11 20:51:23.974278 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:23.974267 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" May 11 20:51:23.974461 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:23.974284 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" May 11 20:51:23.974461 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:23.974289 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-lql9k\"" May 11 20:51:23.974461 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:23.974398 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" May 11 20:51:23.975505 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:23.975492 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" May 11 20:51:23.977866 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:23.977846 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4f94d7c4-zvc47"] May 11 20:51:23.979003 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:23.978903 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-677c659d98-2pgvh"] May 11 20:51:24.001310 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.001287 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-695f55c9c8-p9zz5"] May 11 20:51:24.045849 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.045826 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-76d66cd8bd-tfgnz"] May 11 20:51:24.048703 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.048688 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz" May 11 20:51:24.052785 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.052768 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nj97\" (UniqueName: \"kubernetes.io/projected/1d510481-6ab8-4cb2-be3b-8ad8c889f986-kube-api-access-7nj97\") pod \"managed-serviceaccount-addon-agent-6b4f94d7c4-zvc47\" (UID: \"1d510481-6ab8-4cb2-be3b-8ad8c889f986\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4f94d7c4-zvc47" May 11 20:51:24.052871 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.052793 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/650821ee-1690-42ba-af3e-069dbf5c0b6b-tmp\") pod \"klusterlet-addon-workmgr-677c659d98-2pgvh\" (UID: \"650821ee-1690-42ba-af3e-069dbf5c0b6b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677c659d98-2pgvh" May 11 20:51:24.052871 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.052833 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1d510481-6ab8-4cb2-be3b-8ad8c889f986-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6b4f94d7c4-zvc47\" (UID: \"1d510481-6ab8-4cb2-be3b-8ad8c889f986\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4f94d7c4-zvc47" May 11 20:51:24.052987 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.052874 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqmqm\" (UniqueName: \"kubernetes.io/projected/650821ee-1690-42ba-af3e-069dbf5c0b6b-kube-api-access-wqmqm\") pod \"klusterlet-addon-workmgr-677c659d98-2pgvh\" (UID: \"650821ee-1690-42ba-af3e-069dbf5c0b6b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677c659d98-2pgvh" May 11 20:51:24.052987 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.052980 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/650821ee-1690-42ba-af3e-069dbf5c0b6b-klusterlet-config\") pod \"klusterlet-addon-workmgr-677c659d98-2pgvh\" (UID: \"650821ee-1690-42ba-af3e-069dbf5c0b6b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677c659d98-2pgvh" May 11 20:51:24.061766 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.061747 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-76d66cd8bd-tfgnz"] May 11 20:51:24.079530 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.079507 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9cp9w"] May 11 20:51:24.082494 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.082460 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9cp9w" May 11 20:51:24.085249 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.085233 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-s999c\"" May 11 20:51:24.085350 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.085341 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" May 11 20:51:24.085677 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.085662 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" May 11 20:51:24.086923 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.086908 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" May 11 20:51:24.088923 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.088908 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" May 11 20:51:24.099250 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.099230 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9cp9w"] May 11 20:51:24.153968 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.153941 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/650821ee-1690-42ba-af3e-069dbf5c0b6b-tmp\") pod \"klusterlet-addon-workmgr-677c659d98-2pgvh\" (UID: \"650821ee-1690-42ba-af3e-069dbf5c0b6b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677c659d98-2pgvh" May 11 20:51:24.154086 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.153998 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1a279daf-6a19-4521-916d-10598e20c36a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9cp9w\" (UID: \"1a279daf-6a19-4521-916d-10598e20c36a\") " pod="openshift-insights/insights-runtime-extractor-9cp9w" May 11 20:51:24.154086 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.154042 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1ed91ff5-f4c6-422d-b993-1c73812e6c81-image-registry-private-configuration\") pod \"image-registry-76d66cd8bd-tfgnz\" (UID: \"1ed91ff5-f4c6-422d-b993-1c73812e6c81\") " pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz" May 11 20:51:24.154086 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.154069 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlq5s\" (UniqueName: \"kubernetes.io/projected/1a279daf-6a19-4521-916d-10598e20c36a-kube-api-access-hlq5s\") pod \"insights-runtime-extractor-9cp9w\" (UID: \"1a279daf-6a19-4521-916d-10598e20c36a\") " pod="openshift-insights/insights-runtime-extractor-9cp9w" May 11 20:51:24.154202 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.154100 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1ed91ff5-f4c6-422d-b993-1c73812e6c81-registry-tls\") pod \"image-registry-76d66cd8bd-tfgnz\" (UID: 
\"1ed91ff5-f4c6-422d-b993-1c73812e6c81\") " pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz" May 11 20:51:24.154202 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.154117 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v8cc\" (UniqueName: \"kubernetes.io/projected/1ed91ff5-f4c6-422d-b993-1c73812e6c81-kube-api-access-9v8cc\") pod \"image-registry-76d66cd8bd-tfgnz\" (UID: \"1ed91ff5-f4c6-422d-b993-1c73812e6c81\") " pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz" May 11 20:51:24.154202 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.154135 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7nj97\" (UniqueName: \"kubernetes.io/projected/1d510481-6ab8-4cb2-be3b-8ad8c889f986-kube-api-access-7nj97\") pod \"managed-serviceaccount-addon-agent-6b4f94d7c4-zvc47\" (UID: \"1d510481-6ab8-4cb2-be3b-8ad8c889f986\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4f94d7c4-zvc47" May 11 20:51:24.154202 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.154166 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/650821ee-1690-42ba-af3e-069dbf5c0b6b-klusterlet-config\") pod \"klusterlet-addon-workmgr-677c659d98-2pgvh\" (UID: \"650821ee-1690-42ba-af3e-069dbf5c0b6b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677c659d98-2pgvh" May 11 20:51:24.154342 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.154200 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1a279daf-6a19-4521-916d-10598e20c36a-data-volume\") pod \"insights-runtime-extractor-9cp9w\" (UID: \"1a279daf-6a19-4521-916d-10598e20c36a\") " pod="openshift-insights/insights-runtime-extractor-9cp9w" May 11 20:51:24.154342 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.154232 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1ed91ff5-f4c6-422d-b993-1c73812e6c81-bound-sa-token\") pod \"image-registry-76d66cd8bd-tfgnz\" (UID: \"1ed91ff5-f4c6-422d-b993-1c73812e6c81\") " pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz" May 11 20:51:24.154342 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.154262 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1a279daf-6a19-4521-916d-10598e20c36a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9cp9w\" (UID: \"1a279daf-6a19-4521-916d-10598e20c36a\") " pod="openshift-insights/insights-runtime-extractor-9cp9w" May 11 20:51:24.154342 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.154317 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/650821ee-1690-42ba-af3e-069dbf5c0b6b-tmp\") pod \"klusterlet-addon-workmgr-677c659d98-2pgvh\" (UID: \"650821ee-1690-42ba-af3e-069dbf5c0b6b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677c659d98-2pgvh" May 11 20:51:24.154479 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.154350 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/1d510481-6ab8-4cb2-be3b-8ad8c889f986-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6b4f94d7c4-zvc47\" (UID: \"1d510481-6ab8-4cb2-be3b-8ad8c889f986\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4f94d7c4-zvc47" May 11 20:51:24.154479 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.154377 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1ed91ff5-f4c6-422d-b993-1c73812e6c81-installation-pull-secrets\") pod \"image-registry-76d66cd8bd-tfgnz\" (UID: \"1ed91ff5-f4c6-422d-b993-1c73812e6c81\") " pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz" May 11 20:51:24.154479 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.154408 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1ed91ff5-f4c6-422d-b993-1c73812e6c81-ca-trust-extracted\") pod \"image-registry-76d66cd8bd-tfgnz\" (UID: \"1ed91ff5-f4c6-422d-b993-1c73812e6c81\") " pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz" May 11 20:51:24.154479 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.154433 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqmqm\" (UniqueName: \"kubernetes.io/projected/650821ee-1690-42ba-af3e-069dbf5c0b6b-kube-api-access-wqmqm\") pod \"klusterlet-addon-workmgr-677c659d98-2pgvh\" (UID: \"650821ee-1690-42ba-af3e-069dbf5c0b6b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677c659d98-2pgvh" May 11 20:51:24.154479 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.154449 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1a279daf-6a19-4521-916d-10598e20c36a-crio-socket\") pod \"insights-runtime-extractor-9cp9w\" (UID: \"1a279daf-6a19-4521-916d-10598e20c36a\") " pod="openshift-insights/insights-runtime-extractor-9cp9w" May 11 20:51:24.154479 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.154464 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1ed91ff5-f4c6-422d-b993-1c73812e6c81-registry-certificates\") pod \"image-registry-76d66cd8bd-tfgnz\" (UID: \"1ed91ff5-f4c6-422d-b993-1c73812e6c81\") " pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz" May 11 20:51:24.154725 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.154507 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ed91ff5-f4c6-422d-b993-1c73812e6c81-trusted-ca\") pod \"image-registry-76d66cd8bd-tfgnz\" (UID: \"1ed91ff5-f4c6-422d-b993-1c73812e6c81\") " pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz" May 11 20:51:24.156669 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.156640 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1d510481-6ab8-4cb2-be3b-8ad8c889f986-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6b4f94d7c4-zvc47\" (UID: \"1d510481-6ab8-4cb2-be3b-8ad8c889f986\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4f94d7c4-zvc47" May 11 20:51:24.156740 ip-10-0-128-58 kubenswrapper[2567]: I0511 
20:51:24.156691 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/650821ee-1690-42ba-af3e-069dbf5c0b6b-klusterlet-config\") pod \"klusterlet-addon-workmgr-677c659d98-2pgvh\" (UID: \"650821ee-1690-42ba-af3e-069dbf5c0b6b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677c659d98-2pgvh" May 11 20:51:24.165782 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.165758 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqmqm\" (UniqueName: \"kubernetes.io/projected/650821ee-1690-42ba-af3e-069dbf5c0b6b-kube-api-access-wqmqm\") pod \"klusterlet-addon-workmgr-677c659d98-2pgvh\" (UID: \"650821ee-1690-42ba-af3e-069dbf5c0b6b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677c659d98-2pgvh" May 11 20:51:24.165863 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.165806 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nj97\" (UniqueName: \"kubernetes.io/projected/1d510481-6ab8-4cb2-be3b-8ad8c889f986-kube-api-access-7nj97\") pod \"managed-serviceaccount-addon-agent-6b4f94d7c4-zvc47\" (UID: \"1d510481-6ab8-4cb2-be3b-8ad8c889f986\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4f94d7c4-zvc47" May 11 20:51:24.255150 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.255093 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1ed91ff5-f4c6-422d-b993-1c73812e6c81-image-registry-private-configuration\") pod \"image-registry-76d66cd8bd-tfgnz\" (UID: \"1ed91ff5-f4c6-422d-b993-1c73812e6c81\") " pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz" May 11 20:51:24.255150 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.255123 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlq5s\" (UniqueName: \"kubernetes.io/projected/1a279daf-6a19-4521-916d-10598e20c36a-kube-api-access-hlq5s\") pod \"insights-runtime-extractor-9cp9w\" (UID: \"1a279daf-6a19-4521-916d-10598e20c36a\") " pod="openshift-insights/insights-runtime-extractor-9cp9w" May 11 20:51:24.255150 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.255141 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1ed91ff5-f4c6-422d-b993-1c73812e6c81-registry-tls\") pod \"image-registry-76d66cd8bd-tfgnz\" (UID: \"1ed91ff5-f4c6-422d-b993-1c73812e6c81\") " pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz" May 11 20:51:24.255310 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.255158 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9v8cc\" (UniqueName: \"kubernetes.io/projected/1ed91ff5-f4c6-422d-b993-1c73812e6c81-kube-api-access-9v8cc\") pod \"image-registry-76d66cd8bd-tfgnz\" (UID: \"1ed91ff5-f4c6-422d-b993-1c73812e6c81\") " pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz" May 11 20:51:24.255310 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.255190 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1a279daf-6a19-4521-916d-10598e20c36a-data-volume\") pod \"insights-runtime-extractor-9cp9w\" (UID: \"1a279daf-6a19-4521-916d-10598e20c36a\") " pod="openshift-insights/insights-runtime-extractor-9cp9w" May 11 20:51:24.255310 
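The reconciler entries above trace the kubelet's volume-mount lifecycle for each new pod: VerifyControllerAttachedVolume confirms the volume is attached, "operationExecutor.MountVolume started" begins the mount, and "MountVolume.SetUp succeeded" confirms it, keyed by volume name and pod UID. A minimal sketch of how those pairs can be cross-checked when triaging a pod stuck in ContainerCreating, assuming the journal has been saved to a local file (the path kubelet.log here is hypothetical):

```python
import re
from collections import OrderedDict

# Pair "MountVolume started" with "MountVolume.SetUp succeeded" per
# (pod UID, volume name); whatever is left never finished mounting.
STARTED = re.compile(r'MountVolume started for volume "(?P<vol>[^"]+)".*\(UID: "(?P<uid>[^"]+)"')
SUCCEEDED = re.compile(r'MountVolume\.SetUp succeeded for volume "(?P<vol>[^"]+)".*\(UID: "(?P<uid>[^"]+)"')

def pending_mounts(lines):
    pending = OrderedDict()
    for raw in lines:
        line = raw.replace('\\"', '"')  # the journal escapes the inner quotes
        if (m := STARTED.search(line)):
            pending[(m["uid"], m["vol"])] = line
        elif (m := SUCCEEDED.search(line)):
            pending.pop((m["uid"], m["vol"]), None)
    return pending

if __name__ == "__main__":
    with open("kubelet.log") as f:  # assumed: a saved copy of this journal
        for uid, vol in pending_mounts(f):
            print(f"no SetUp success seen: pod={uid} volume={vol}")
```

In this capture every "started" entry is matched by a "SetUp succeeded" within a few hundred milliseconds, so the sketch would print nothing here.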
May 11 20:51:24.255310 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.255214 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1ed91ff5-f4c6-422d-b993-1c73812e6c81-bound-sa-token\") pod \"image-registry-76d66cd8bd-tfgnz\" (UID: \"1ed91ff5-f4c6-422d-b993-1c73812e6c81\") " pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz"
May 11 20:51:24.255310 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.255242 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1a279daf-6a19-4521-916d-10598e20c36a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9cp9w\" (UID: \"1a279daf-6a19-4521-916d-10598e20c36a\") " pod="openshift-insights/insights-runtime-extractor-9cp9w"
May 11 20:51:24.255310 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.255286 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1ed91ff5-f4c6-422d-b993-1c73812e6c81-installation-pull-secrets\") pod \"image-registry-76d66cd8bd-tfgnz\" (UID: \"1ed91ff5-f4c6-422d-b993-1c73812e6c81\") " pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz"
May 11 20:51:24.255594 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.255317 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1ed91ff5-f4c6-422d-b993-1c73812e6c81-ca-trust-extracted\") pod \"image-registry-76d66cd8bd-tfgnz\" (UID: \"1ed91ff5-f4c6-422d-b993-1c73812e6c81\") " pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz"
May 11 20:51:24.255594 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.255506 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1a279daf-6a19-4521-916d-10598e20c36a-data-volume\") pod \"insights-runtime-extractor-9cp9w\" (UID: \"1a279daf-6a19-4521-916d-10598e20c36a\") " pod="openshift-insights/insights-runtime-extractor-9cp9w"
May 11 20:51:24.255594 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.255545 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1a279daf-6a19-4521-916d-10598e20c36a-crio-socket\") pod \"insights-runtime-extractor-9cp9w\" (UID: \"1a279daf-6a19-4521-916d-10598e20c36a\") " pod="openshift-insights/insights-runtime-extractor-9cp9w"
May 11 20:51:24.255594 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.255581 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1ed91ff5-f4c6-422d-b993-1c73812e6c81-registry-certificates\") pod \"image-registry-76d66cd8bd-tfgnz\" (UID: \"1ed91ff5-f4c6-422d-b993-1c73812e6c81\") " pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz"
May 11 20:51:24.255786 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.255649 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1a279daf-6a19-4521-916d-10598e20c36a-crio-socket\") pod \"insights-runtime-extractor-9cp9w\" (UID: \"1a279daf-6a19-4521-916d-10598e20c36a\") " pod="openshift-insights/insights-runtime-extractor-9cp9w"
May 11 20:51:24.255786 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.255652 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1ed91ff5-f4c6-422d-b993-1c73812e6c81-ca-trust-extracted\") pod \"image-registry-76d66cd8bd-tfgnz\" (UID: \"1ed91ff5-f4c6-422d-b993-1c73812e6c81\") " pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz"
May 11 20:51:24.255786 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.255681 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ed91ff5-f4c6-422d-b993-1c73812e6c81-trusted-ca\") pod \"image-registry-76d66cd8bd-tfgnz\" (UID: \"1ed91ff5-f4c6-422d-b993-1c73812e6c81\") " pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz"
May 11 20:51:24.255786 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.255733 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1a279daf-6a19-4521-916d-10598e20c36a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9cp9w\" (UID: \"1a279daf-6a19-4521-916d-10598e20c36a\") " pod="openshift-insights/insights-runtime-extractor-9cp9w"
May 11 20:51:24.256011 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.255939 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1a279daf-6a19-4521-916d-10598e20c36a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9cp9w\" (UID: \"1a279daf-6a19-4521-916d-10598e20c36a\") " pod="openshift-insights/insights-runtime-extractor-9cp9w"
May 11 20:51:24.256428 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.256405 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1ed91ff5-f4c6-422d-b993-1c73812e6c81-registry-certificates\") pod \"image-registry-76d66cd8bd-tfgnz\" (UID: \"1ed91ff5-f4c6-422d-b993-1c73812e6c81\") " pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz"
May 11 20:51:24.256833 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.256809 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ed91ff5-f4c6-422d-b993-1c73812e6c81-trusted-ca\") pod \"image-registry-76d66cd8bd-tfgnz\" (UID: \"1ed91ff5-f4c6-422d-b993-1c73812e6c81\") " pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz"
May 11 20:51:24.257662 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.257632 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1ed91ff5-f4c6-422d-b993-1c73812e6c81-image-registry-private-configuration\") pod \"image-registry-76d66cd8bd-tfgnz\" (UID: \"1ed91ff5-f4c6-422d-b993-1c73812e6c81\") " pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz"
May 11 20:51:24.257662 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.257653 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1ed91ff5-f4c6-422d-b993-1c73812e6c81-registry-tls\") pod \"image-registry-76d66cd8bd-tfgnz\" (UID: \"1ed91ff5-f4c6-422d-b993-1c73812e6c81\") " pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz"
May 11 20:51:24.257774 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.257699 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1ed91ff5-f4c6-422d-b993-1c73812e6c81-installation-pull-secrets\") pod \"image-registry-76d66cd8bd-tfgnz\" (UID: \"1ed91ff5-f4c6-422d-b993-1c73812e6c81\") " pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz"
May 11 20:51:24.257812 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.257781 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1a279daf-6a19-4521-916d-10598e20c36a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9cp9w\" (UID: \"1a279daf-6a19-4521-916d-10598e20c36a\") " pod="openshift-insights/insights-runtime-extractor-9cp9w"
May 11 20:51:24.264003 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.263978 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1ed91ff5-f4c6-422d-b993-1c73812e6c81-bound-sa-token\") pod \"image-registry-76d66cd8bd-tfgnz\" (UID: \"1ed91ff5-f4c6-422d-b993-1c73812e6c81\") " pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz"
May 11 20:51:24.264554 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.264538 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlq5s\" (UniqueName: \"kubernetes.io/projected/1a279daf-6a19-4521-916d-10598e20c36a-kube-api-access-hlq5s\") pod \"insights-runtime-extractor-9cp9w\" (UID: \"1a279daf-6a19-4521-916d-10598e20c36a\") " pod="openshift-insights/insights-runtime-extractor-9cp9w"
May 11 20:51:24.265230 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.265211 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v8cc\" (UniqueName: \"kubernetes.io/projected/1ed91ff5-f4c6-422d-b993-1c73812e6c81-kube-api-access-9v8cc\") pod \"image-registry-76d66cd8bd-tfgnz\" (UID: \"1ed91ff5-f4c6-422d-b993-1c73812e6c81\") " pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz"
May 11 20:51:24.306866 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.306850 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4f94d7c4-zvc47"
May 11 20:51:24.311505 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.311487 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677c659d98-2pgvh"
May 11 20:51:24.357728 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.357570 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz"
May 11 20:51:24.390445 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.390417 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9cp9w"
May 11 20:51:24.439727 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.439652 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4f94d7c4-zvc47"]
May 11 20:51:24.456793 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:51:24.456499 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d510481_6ab8_4cb2_be3b_8ad8c889f986.slice/crio-58bc17c932e6c511dd34b5e7c3a0d6f10e960f2b7c903671b69b28e95a76f51e WatchSource:0}: Error finding container 58bc17c932e6c511dd34b5e7c3a0d6f10e960f2b7c903671b69b28e95a76f51e: Status 404 returned error can't find the container with id 58bc17c932e6c511dd34b5e7c3a0d6f10e960f2b7c903671b69b28e95a76f51e
May 11 20:51:24.486802 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.486757 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-677c659d98-2pgvh"]
May 11 20:51:24.489891 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:51:24.489855 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod650821ee_1690_42ba_af3e_069dbf5c0b6b.slice/crio-10a2c9bb07b17230eff3be6a2595c438ecddc3c1520e1a2f727cce6b31e48683 WatchSource:0}: Error finding container 10a2c9bb07b17230eff3be6a2595c438ecddc3c1520e1a2f727cce6b31e48683: Status 404 returned error can't find the container with id 10a2c9bb07b17230eff3be6a2595c438ecddc3c1520e1a2f727cce6b31e48683
May 11 20:51:24.533584 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.532860 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-76d66cd8bd-tfgnz"]
May 11 20:51:24.537873 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:51:24.537847 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ed91ff5_f4c6_422d_b993_1c73812e6c81.slice/crio-43754f654296da3b1e9ae432e65dd70f8e0ed17231f17d3a53e270fd449dfcf2 WatchSource:0}: Error finding container 43754f654296da3b1e9ae432e65dd70f8e0ed17231f17d3a53e270fd449dfcf2: Status 404 returned error can't find the container with id 43754f654296da3b1e9ae432e65dd70f8e0ed17231f17d3a53e270fd449dfcf2
May 11 20:51:24.552673 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.552639 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9cp9w"]
May 11 20:51:24.554655 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:51:24.554634 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a279daf_6a19_4521_916d_10598e20c36a.slice/crio-eb89d635c9229b10eb2c38029baad34ee6f3939c48a8e69e6585495f57732194 WatchSource:0}: Error finding container eb89d635c9229b10eb2c38029baad34ee6f3939c48a8e69e6585495f57732194: Status 404 returned error can't find the container with id eb89d635c9229b10eb2c38029baad34ee6f3939c48a8e69e6585495f57732194
May 11 20:51:24.765739 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.765703 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz" event={"ID":"1ed91ff5-f4c6-422d-b993-1c73812e6c81","Type":"ContainerStarted","Data":"8cba13a2efefd5f3e3f29b807e930c8c191661a689b08963b2502d2945f0f19a"}
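Each "SyncLoop UPDATE" above is followed by a cAdvisor manager.go warning that the just-created crio-... cgroup has no matching container yet (a 404 from the runtime): the watch event fires while the sandbox is still being created, and a PLEG ContainerStarted event for the same ID lands milliseconds later, which suggests these warnings are transient races rather than failures. A hedged sketch for confirming that reading against a saved copy of this journal:

```python
import re

# Flag any container ID from a "Failed to process watch event ... 404"
# warning that is never later reported via a PLEG ContainerStarted event.
WATCH_404 = re.compile(r'Error finding container ([0-9a-f]{64})')
STARTED = re.compile(r'"Type":"ContainerStarted","Data":"([0-9a-f]{64})"')

def unresolved_watch_errors(lines):
    seen_404, started = [], set()
    for line in lines:
        if (m := WATCH_404.search(line)):
            seen_404.append(m.group(1))
        started.update(STARTED.findall(line))
    return [cid for cid in seen_404 if cid not in started]
```

All four IDs flagged at 20:51:24 (58bc17..., 10a2c9..., 43754f..., eb89d6...) show a ContainerStarted event within the same second, so this check returns an empty list for this capture.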
May 11 20:51:24.765739 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.765742 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz" event={"ID":"1ed91ff5-f4c6-422d-b993-1c73812e6c81","Type":"ContainerStarted","Data":"43754f654296da3b1e9ae432e65dd70f8e0ed17231f17d3a53e270fd449dfcf2"}
May 11 20:51:24.765955 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.765834 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz"
May 11 20:51:24.767180 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.767157 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9cp9w" event={"ID":"1a279daf-6a19-4521-916d-10598e20c36a","Type":"ContainerStarted","Data":"90d9d4be8bca77681d06a5374bb3fc6c1b0ae5741891ba889c64c8dabd8edb56"}
May 11 20:51:24.767180 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.767183 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9cp9w" event={"ID":"1a279daf-6a19-4521-916d-10598e20c36a","Type":"ContainerStarted","Data":"eb89d635c9229b10eb2c38029baad34ee6f3939c48a8e69e6585495f57732194"}
May 11 20:51:24.768240 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.768211 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4f94d7c4-zvc47" event={"ID":"1d510481-6ab8-4cb2-be3b-8ad8c889f986","Type":"ContainerStarted","Data":"58bc17c932e6c511dd34b5e7c3a0d6f10e960f2b7c903671b69b28e95a76f51e"}
May 11 20:51:24.769208 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.769187 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677c659d98-2pgvh" event={"ID":"650821ee-1690-42ba-af3e-069dbf5c0b6b","Type":"ContainerStarted","Data":"10a2c9bb07b17230eff3be6a2595c438ecddc3c1520e1a2f727cce6b31e48683"}
May 11 20:51:24.787944 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:24.787894 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz" podStartSLOduration=0.787879314 podStartE2EDuration="787.879314ms" podCreationTimestamp="2026-05-11 20:51:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 20:51:24.787557247 +0000 UTC m=+62.897337720" watchObservedRunningTime="2026-05-11 20:51:24.787879314 +0000 UTC m=+62.897659760"
May 11 20:51:25.775650 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:25.775617 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9cp9w" event={"ID":"1a279daf-6a19-4521-916d-10598e20c36a","Type":"ContainerStarted","Data":"e843d88debc2644faf24e08f09051df89f2993d465983874dbaea7bb06ca1999"}
May 11 20:51:26.747273 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:26.747237 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6l7gb"
May 11 20:51:28.192310 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:28.192270 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbq6v\" (UniqueName: \"kubernetes.io/projected/3a2d13ea-d235-437e-9668-e21aca93682a-kube-api-access-cbq6v\") pod \"network-check-target-m9tgf\" (UID: \"3a2d13ea-d235-437e-9668-e21aca93682a\") " pod="openshift-network-diagnostics/network-check-target-m9tgf"
May 11 20:51:28.194984 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:28.194949 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
May 11 20:51:28.205718 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:28.205698 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
May 11 20:51:28.216522 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:28.216500 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbq6v\" (UniqueName: \"kubernetes.io/projected/3a2d13ea-d235-437e-9668-e21aca93682a-kube-api-access-cbq6v\") pod \"network-check-target-m9tgf\" (UID: \"3a2d13ea-d235-437e-9668-e21aca93682a\") " pod="openshift-network-diagnostics/network-check-target-m9tgf"
May 11 20:51:28.293695 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:28.293660 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3800edc1-af00-418d-a5b8-d832cbe20fbf-metrics-certs\") pod \"network-metrics-daemon-v9s7z\" (UID: \"3800edc1-af00-418d-a5b8-d832cbe20fbf\") " pod="openshift-multus/network-metrics-daemon-v9s7z"
May 11 20:51:28.296628 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:28.296609 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
May 11 20:51:28.306949 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:28.306929 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3800edc1-af00-418d-a5b8-d832cbe20fbf-metrics-certs\") pod \"network-metrics-daemon-v9s7z\" (UID: \"3800edc1-af00-418d-a5b8-d832cbe20fbf\") " pod="openshift-multus/network-metrics-daemon-v9s7z"
May 11 20:51:28.388636 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:28.388603 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kqkzb\""
May 11 20:51:28.395062 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:28.395041 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mdvls\""
May 11 20:51:28.396918 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:28.396895 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m9tgf"
May 11 20:51:28.403597 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:28.403579 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v9s7z"
May 11 20:51:29.664273 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:29.664218 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-v9s7z"]
May 11 20:51:29.668698 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:51:29.668671 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3800edc1_af00_418d_a5b8_d832cbe20fbf.slice/crio-90f7d79f36c34ed2a415ebdc20b4985b264b86d7fb40a37608f7048d3ed95f0f WatchSource:0}: Error finding container 90f7d79f36c34ed2a415ebdc20b4985b264b86d7fb40a37608f7048d3ed95f0f: Status 404 returned error can't find the container with id 90f7d79f36c34ed2a415ebdc20b4985b264b86d7fb40a37608f7048d3ed95f0f
May 11 20:51:29.687992 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:29.687425 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-m9tgf"]
May 11 20:51:29.694193 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:51:29.694166 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a2d13ea_d235_437e_9668_e21aca93682a.slice/crio-0d20cef91357d3ce0f96068360dbbac6e4f247b719496fd86f14a72e911d38f2 WatchSource:0}: Error finding container 0d20cef91357d3ce0f96068360dbbac6e4f247b719496fd86f14a72e911d38f2: Status 404 returned error can't find the container with id 0d20cef91357d3ce0f96068360dbbac6e4f247b719496fd86f14a72e911d38f2
May 11 20:51:29.787635 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:29.787606 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9cp9w" event={"ID":"1a279daf-6a19-4521-916d-10598e20c36a","Type":"ContainerStarted","Data":"9ffdd4f874a9b49daab42a3382734a0632170f844b5475673f244f19a7c07104"}
May 11 20:51:29.788875 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:29.788850 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4f94d7c4-zvc47" event={"ID":"1d510481-6ab8-4cb2-be3b-8ad8c889f986","Type":"ContainerStarted","Data":"e82d94d56491bc969baea60ff6708409c59f5631cc166f22b5328a3b0f68e8f3"}
May 11 20:51:29.789921 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:29.789903 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v9s7z" event={"ID":"3800edc1-af00-418d-a5b8-d832cbe20fbf","Type":"ContainerStarted","Data":"90f7d79f36c34ed2a415ebdc20b4985b264b86d7fb40a37608f7048d3ed95f0f"}
May 11 20:51:29.791068 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:29.791046 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677c659d98-2pgvh" event={"ID":"650821ee-1690-42ba-af3e-069dbf5c0b6b","Type":"ContainerStarted","Data":"e7924560d5b77a7d5eeb02eaae54cf355eb146cbb8ee94d68b0a941f51d52764"}
May 11 20:51:29.791272 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:29.791255 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677c659d98-2pgvh"
May 11 20:51:29.792119 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:29.792101 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-m9tgf" event={"ID":"3a2d13ea-d235-437e-9668-e21aca93682a","Type":"ContainerStarted","Data":"0d20cef91357d3ce0f96068360dbbac6e4f247b719496fd86f14a72e911d38f2"}
May 11 20:51:29.792894 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:29.792875 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677c659d98-2pgvh"
May 11 20:51:29.808946 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:29.808910 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9cp9w" podStartSLOduration=0.885717117 podStartE2EDuration="5.808898792s" podCreationTimestamp="2026-05-11 20:51:24 +0000 UTC" firstStartedPulling="2026-05-11 20:51:24.602634955 +0000 UTC m=+62.712415386" lastFinishedPulling="2026-05-11 20:51:29.525816635 +0000 UTC m=+67.635597061" observedRunningTime="2026-05-11 20:51:29.807300226 +0000 UTC m=+67.917080693" watchObservedRunningTime="2026-05-11 20:51:29.808898792 +0000 UTC m=+67.918679294"
May 11 20:51:29.823759 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:29.823635 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677c659d98-2pgvh" podStartSLOduration=1.7707354130000001 podStartE2EDuration="6.823623956s" podCreationTimestamp="2026-05-11 20:51:23 +0000 UTC" firstStartedPulling="2026-05-11 20:51:24.491799756 +0000 UTC m=+62.601580182" lastFinishedPulling="2026-05-11 20:51:29.544688301 +0000 UTC m=+67.654468725" observedRunningTime="2026-05-11 20:51:29.823352641 +0000 UTC m=+67.933133125" watchObservedRunningTime="2026-05-11 20:51:29.823623956 +0000 UTC m=+67.933404383"
May 11 20:51:29.842754 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:29.842717 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4f94d7c4-zvc47" podStartSLOduration=1.775569856 podStartE2EDuration="6.842707367s" podCreationTimestamp="2026-05-11 20:51:23 +0000 UTC" firstStartedPulling="2026-05-11 20:51:24.45863826 +0000 UTC m=+62.568418699" lastFinishedPulling="2026-05-11 20:51:29.525775772 +0000 UTC m=+67.635556210" observedRunningTime="2026-05-11 20:51:29.84160219 +0000 UTC m=+67.951382646" watchObservedRunningTime="2026-05-11 20:51:29.842707367 +0000 UTC m=+67.952487811"
m=+68.903412755" observedRunningTime="2026-05-11 20:51:31.814917672 +0000 UTC m=+69.924698154" watchObservedRunningTime="2026-05-11 20:51:31.81609615 +0000 UTC m=+69.925876596" May 11 20:51:32.803804 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:32.803773 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-m9tgf" event={"ID":"3a2d13ea-d235-437e-9668-e21aca93682a","Type":"ContainerStarted","Data":"1aea5b6ee9ab3c91b177cfaa20ea1292d956b46ad644011737e3b54e438f33fa"} May 11 20:51:32.818679 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:32.818634 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-m9tgf" podStartSLOduration=68.027708284 podStartE2EDuration="1m10.818620634s" podCreationTimestamp="2026-05-11 20:50:22 +0000 UTC" firstStartedPulling="2026-05-11 20:51:29.696784948 +0000 UTC m=+67.806565373" lastFinishedPulling="2026-05-11 20:51:32.487697295 +0000 UTC m=+70.597477723" observedRunningTime="2026-05-11 20:51:32.818614145 +0000 UTC m=+70.928394593" watchObservedRunningTime="2026-05-11 20:51:32.818620634 +0000 UTC m=+70.928401079" May 11 20:51:33.806472 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:33.806441 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:51:34.007332 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:34.007305 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:51:40.441317 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.441286 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9ckmx"] May 11 20:51:40.446437 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.446411 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.449168 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.449146 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-h27ck\"" May 11 20:51:40.450304 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.450283 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" May 11 20:51:40.450410 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.450317 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" May 11 20:51:40.450781 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.450564 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" May 11 20:51:40.450781 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.450618 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" May 11 20:51:40.450781 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.450691 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" May 11 20:51:40.450781 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.450742 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" May 11 20:51:40.576832 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.576807 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6f84d77f-8342-4c8f-9a13-c7c909d327d3-sys\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.577022 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.576839 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6f84d77f-8342-4c8f-9a13-c7c909d327d3-node-exporter-wtmp\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.577022 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.576858 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6f84d77f-8342-4c8f-9a13-c7c909d327d3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.577022 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.576924 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6f84d77f-8342-4c8f-9a13-c7c909d327d3-node-exporter-tls\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.577022 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.577000 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/6f84d77f-8342-4c8f-9a13-c7c909d327d3-node-exporter-accelerators-collector-config\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.577204 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.577036 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f84d77f-8342-4c8f-9a13-c7c909d327d3-metrics-client-ca\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.577204 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.577109 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msgcf\" (UniqueName: \"kubernetes.io/projected/6f84d77f-8342-4c8f-9a13-c7c909d327d3-kube-api-access-msgcf\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.577204 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.577148 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6f84d77f-8342-4c8f-9a13-c7c909d327d3-root\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.577204 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.577173 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6f84d77f-8342-4c8f-9a13-c7c909d327d3-node-exporter-textfile\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.677723 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.677698 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6f84d77f-8342-4c8f-9a13-c7c909d327d3-node-exporter-tls\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.677859 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.677736 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6f84d77f-8342-4c8f-9a13-c7c909d327d3-node-exporter-accelerators-collector-config\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.677859 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.677763 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f84d77f-8342-4c8f-9a13-c7c909d327d3-metrics-client-ca\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.677859 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.677796 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-msgcf\" (UniqueName: \"kubernetes.io/projected/6f84d77f-8342-4c8f-9a13-c7c909d327d3-kube-api-access-msgcf\") pod \"node-exporter-9ckmx\" (UID: 
\"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.677859 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.677823 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6f84d77f-8342-4c8f-9a13-c7c909d327d3-root\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.677859 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.677847 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6f84d77f-8342-4c8f-9a13-c7c909d327d3-node-exporter-textfile\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.678151 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.677937 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6f84d77f-8342-4c8f-9a13-c7c909d327d3-root\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.678151 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.678013 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6f84d77f-8342-4c8f-9a13-c7c909d327d3-sys\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.678151 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.678054 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6f84d77f-8342-4c8f-9a13-c7c909d327d3-node-exporter-wtmp\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.678151 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.678084 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6f84d77f-8342-4c8f-9a13-c7c909d327d3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.678356 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.678183 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6f84d77f-8342-4c8f-9a13-c7c909d327d3-sys\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.678356 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.678238 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6f84d77f-8342-4c8f-9a13-c7c909d327d3-node-exporter-textfile\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.678465 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.678355 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6f84d77f-8342-4c8f-9a13-c7c909d327d3-node-exporter-wtmp\") pod 
\"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.678465 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.678405 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6f84d77f-8342-4c8f-9a13-c7c909d327d3-node-exporter-accelerators-collector-config\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.678613 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.678595 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f84d77f-8342-4c8f-9a13-c7c909d327d3-metrics-client-ca\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.680126 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.680107 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6f84d77f-8342-4c8f-9a13-c7c909d327d3-node-exporter-tls\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.680239 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.680222 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6f84d77f-8342-4c8f-9a13-c7c909d327d3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.685860 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.685839 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-msgcf\" (UniqueName: \"kubernetes.io/projected/6f84d77f-8342-4c8f-9a13-c7c909d327d3-kube-api-access-msgcf\") pod \"node-exporter-9ckmx\" (UID: \"6f84d77f-8342-4c8f-9a13-c7c909d327d3\") " pod="openshift-monitoring/node-exporter-9ckmx" May 11 20:51:40.757478 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.757427 2567 util.go:30] "No sandbox for pod can be found. 
May 11 20:51:40.757478 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.757427 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9ckmx"
May 11 20:51:40.765212 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:51:40.765185 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f84d77f_8342_4c8f_9a13_c7c909d327d3.slice/crio-9cd1c9efbf564856a4c7206bc39aac74d3d0b0e22a907c80815afba083f4ffb6 WatchSource:0}: Error finding container 9cd1c9efbf564856a4c7206bc39aac74d3d0b0e22a907c80815afba083f4ffb6: Status 404 returned error can't find the container with id 9cd1c9efbf564856a4c7206bc39aac74d3d0b0e22a907c80815afba083f4ffb6
May 11 20:51:40.827473 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:40.827444 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9ckmx" event={"ID":"6f84d77f-8342-4c8f-9a13-c7c909d327d3","Type":"ContainerStarted","Data":"9cd1c9efbf564856a4c7206bc39aac74d3d0b0e22a907c80815afba083f4ffb6"}
May 11 20:51:41.832914 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:41.832880 2567 generic.go:358] "Generic (PLEG): container finished" podID="6f84d77f-8342-4c8f-9a13-c7c909d327d3" containerID="c4862513d615ad51521868ff269d81a9bfe80e605168d386050283146bb07505" exitCode=0
May 11 20:51:41.833225 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:41.832941 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9ckmx" event={"ID":"6f84d77f-8342-4c8f-9a13-c7c909d327d3","Type":"ContainerDied","Data":"c4862513d615ad51521868ff269d81a9bfe80e605168d386050283146bb07505"}
May 11 20:51:42.837162 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:42.837127 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9ckmx" event={"ID":"6f84d77f-8342-4c8f-9a13-c7c909d327d3","Type":"ContainerStarted","Data":"97165f335808065641c405b3e06c40c6bbc76c526fecdd90b33a8a50c0ac9c0b"}
May 11 20:51:42.837162 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:42.837162 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9ckmx" event={"ID":"6f84d77f-8342-4c8f-9a13-c7c909d327d3","Type":"ContainerStarted","Data":"9a1ca348d555f30e8608d0b024cb0c4f028cf1c38f4dc04e729d21340aeea0a9"}
May 11 20:51:42.861725 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:42.861677 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9ckmx" podStartSLOduration=1.998409706 podStartE2EDuration="2.861661709s" podCreationTimestamp="2026-05-11 20:51:40 +0000 UTC" firstStartedPulling="2026-05-11 20:51:40.766939646 +0000 UTC m=+78.876720084" lastFinishedPulling="2026-05-11 20:51:41.630191659 +0000 UTC m=+79.739972087" observedRunningTime="2026-05-11 20:51:42.860425525 +0000 UTC m=+80.970205970" watchObservedRunningTime="2026-05-11 20:51:42.861661709 +0000 UTC m=+80.971442155"
May 11 20:51:45.779878 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:45.779843 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-76d66cd8bd-tfgnz"
May 11 20:51:48.016281 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:48.016249 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-c8c4474bc-4m7jz"]
May 11 20:51:48.047045 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:48.047011 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-c8c4474bc-4m7jz"]
May 11 20:51:48.047194 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:48.047129 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-c8c4474bc-4m7jz"
May 11 20:51:48.049759 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:48.049734 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
May 11 20:51:48.049899 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:48.049809 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
May 11 20:51:48.049950 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:48.049913 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-5fk8h\""
May 11 20:51:48.127501 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:48.127473 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddc2w\" (UniqueName: \"kubernetes.io/projected/97d80799-b6a0-4fad-bb15-1b33a216aa97-kube-api-access-ddc2w\") pod \"downloads-c8c4474bc-4m7jz\" (UID: \"97d80799-b6a0-4fad-bb15-1b33a216aa97\") " pod="openshift-console/downloads-c8c4474bc-4m7jz"
May 11 20:51:48.228729 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:48.228700 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddc2w\" (UniqueName: \"kubernetes.io/projected/97d80799-b6a0-4fad-bb15-1b33a216aa97-kube-api-access-ddc2w\") pod \"downloads-c8c4474bc-4m7jz\" (UID: \"97d80799-b6a0-4fad-bb15-1b33a216aa97\") " pod="openshift-console/downloads-c8c4474bc-4m7jz"
May 11 20:51:48.237516 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:48.237493 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddc2w\" (UniqueName: \"kubernetes.io/projected/97d80799-b6a0-4fad-bb15-1b33a216aa97-kube-api-access-ddc2w\") pod \"downloads-c8c4474bc-4m7jz\" (UID: \"97d80799-b6a0-4fad-bb15-1b33a216aa97\") " pod="openshift-console/downloads-c8c4474bc-4m7jz"
May 11 20:51:48.355666 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:48.355640 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-c8c4474bc-4m7jz"
May 11 20:51:48.483723 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:51:48.483696 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97d80799_b6a0_4fad_bb15_1b33a216aa97.slice/crio-1e3a6bb86f194d37e44cdc93865a28725e353b74f07c50cf694680cc902fe52a WatchSource:0}: Error finding container 1e3a6bb86f194d37e44cdc93865a28725e353b74f07c50cf694680cc902fe52a: Status 404 returned error can't find the container with id 1e3a6bb86f194d37e44cdc93865a28725e353b74f07c50cf694680cc902fe52a
May 11 20:51:48.484273 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:48.484248 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-c8c4474bc-4m7jz"]
May 11 20:51:48.853896 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:48.853862 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-c8c4474bc-4m7jz" event={"ID":"97d80799-b6a0-4fad-bb15-1b33a216aa97","Type":"ContainerStarted","Data":"1e3a6bb86f194d37e44cdc93865a28725e353b74f07c50cf694680cc902fe52a"}
May 11 20:51:49.019871 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.019825 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" podUID="bf01ba23-7a3f-4b4d-8233-4e0819e6bb94" containerName="registry" containerID="cri-o://1e157bd64c67fc2e22233c72c8f839ba91f9d7b9651283c5ac5edf0dd8a68621" gracePeriod=30
May 11 20:51:49.268888 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.268859 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5"
May 11 20:51:49.337052 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.337025 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-installation-pull-secrets\") pod \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") "
May 11 20:51:49.337199 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.337059 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-image-registry-private-configuration\") pod \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") "
May 11 20:51:49.337199 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.337093 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-trusted-ca\") pod \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") "
May 11 20:51:49.337199 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.337123 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpfnq\" (UniqueName: \"kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-kube-api-access-hpfnq\") pod \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") "
May 11 20:51:49.337352 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.337315 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-certificates\") pod \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") "
May 11 20:51:49.337407 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.337365 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-tls\") pod \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") "
May 11 20:51:49.337482 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.337462 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-ca-trust-extracted\") pod \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") "
May 11 20:51:49.337535 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.337498 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-bound-sa-token\") pod \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\" (UID: \"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94\") "
May 11 20:51:49.337586 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.337528 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf01ba23-7a3f-4b4d-8233-4e0819e6bb94" (UID: "bf01ba23-7a3f-4b4d-8233-4e0819e6bb94"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
May 11 20:51:49.337745 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.337721 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "bf01ba23-7a3f-4b4d-8233-4e0819e6bb94" (UID: "bf01ba23-7a3f-4b4d-8233-4e0819e6bb94"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
May 11 20:51:49.337809 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.337730 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-trusted-ca\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\""
May 11 20:51:49.340500 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.340473 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-kube-api-access-hpfnq" (OuterVolumeSpecName: "kube-api-access-hpfnq") pod "bf01ba23-7a3f-4b4d-8233-4e0819e6bb94" (UID: "bf01ba23-7a3f-4b4d-8233-4e0819e6bb94"). InnerVolumeSpecName "kube-api-access-hpfnq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
May 11 20:51:49.340619 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.340504 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "bf01ba23-7a3f-4b4d-8233-4e0819e6bb94" (UID: "bf01ba23-7a3f-4b4d-8233-4e0819e6bb94"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
May 11 20:51:49.340619 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.340573 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "bf01ba23-7a3f-4b4d-8233-4e0819e6bb94" (UID: "bf01ba23-7a3f-4b4d-8233-4e0819e6bb94"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
May 11 20:51:49.340619 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.340607 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "bf01ba23-7a3f-4b4d-8233-4e0819e6bb94" (UID: "bf01ba23-7a3f-4b4d-8233-4e0819e6bb94"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
May 11 20:51:49.341245 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.341221 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf01ba23-7a3f-4b4d-8233-4e0819e6bb94" (UID: "bf01ba23-7a3f-4b4d-8233-4e0819e6bb94"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
May 11 20:51:49.349240 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.349215 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "bf01ba23-7a3f-4b4d-8233-4e0819e6bb94" (UID: "bf01ba23-7a3f-4b4d-8233-4e0819e6bb94"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
May 11 20:51:49.438615 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.438551 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hpfnq\" (UniqueName: \"kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-kube-api-access-hpfnq\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\""
May 11 20:51:49.438615 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.438576 2567 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-certificates\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\""
May 11 20:51:49.438615 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.438586 2567 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-registry-tls\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\""
May 11 20:51:49.438615 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.438596 2567 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-ca-trust-extracted\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\""
May 11 20:51:49.438615 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.438607 2567 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-bound-sa-token\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\""
May 11 20:51:49.438860 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.438621 2567 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-installation-pull-secrets\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\""
May 11 20:51:49.438860 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.438635 2567 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94-image-registry-private-configuration\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\""
May 11 20:51:49.858756 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.858702 2567 generic.go:358] "Generic (PLEG): container finished" podID="bf01ba23-7a3f-4b4d-8233-4e0819e6bb94" containerID="1e157bd64c67fc2e22233c72c8f839ba91f9d7b9651283c5ac5edf0dd8a68621" exitCode=0
May 11 20:51:49.858946 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.858749 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" event={"ID":"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94","Type":"ContainerDied","Data":"1e157bd64c67fc2e22233c72c8f839ba91f9d7b9651283c5ac5edf0dd8a68621"}
May 11 20:51:49.858946 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.858785 2567 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" May 11 20:51:49.858946 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.858804 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-695f55c9c8-p9zz5" event={"ID":"bf01ba23-7a3f-4b4d-8233-4e0819e6bb94","Type":"ContainerDied","Data":"3de984c7cee21f3cc43ea22ffa889b2e70d4206c2cd47e2c0e34102218b004f4"} May 11 20:51:49.858946 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.858823 2567 scope.go:117] "RemoveContainer" containerID="1e157bd64c67fc2e22233c72c8f839ba91f9d7b9651283c5ac5edf0dd8a68621" May 11 20:51:49.869455 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.869434 2567 scope.go:117] "RemoveContainer" containerID="1e157bd64c67fc2e22233c72c8f839ba91f9d7b9651283c5ac5edf0dd8a68621" May 11 20:51:49.869946 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:51:49.869920 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e157bd64c67fc2e22233c72c8f839ba91f9d7b9651283c5ac5edf0dd8a68621\": container with ID starting with 1e157bd64c67fc2e22233c72c8f839ba91f9d7b9651283c5ac5edf0dd8a68621 not found: ID does not exist" containerID="1e157bd64c67fc2e22233c72c8f839ba91f9d7b9651283c5ac5edf0dd8a68621" May 11 20:51:49.870062 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.869972 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e157bd64c67fc2e22233c72c8f839ba91f9d7b9651283c5ac5edf0dd8a68621"} err="failed to get container status \"1e157bd64c67fc2e22233c72c8f839ba91f9d7b9651283c5ac5edf0dd8a68621\": rpc error: code = NotFound desc = could not find container \"1e157bd64c67fc2e22233c72c8f839ba91f9d7b9651283c5ac5edf0dd8a68621\": container with ID starting with 1e157bd64c67fc2e22233c72c8f839ba91f9d7b9651283c5ac5edf0dd8a68621 not found: ID does not exist" May 11 20:51:49.880539 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.880518 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-695f55c9c8-p9zz5"] May 11 20:51:49.883916 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:49.883892 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-695f55c9c8-p9zz5"] May 11 20:51:50.479096 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:50.479067 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf01ba23-7a3f-4b4d-8233-4e0819e6bb94" path="/var/lib/kubelet/pods/bf01ba23-7a3f-4b4d-8233-4e0819e6bb94/volumes" May 11 20:51:58.741646 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:58.741610 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-84cc9c967-m4qfl"] May 11 20:51:58.742182 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:58.741927 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf01ba23-7a3f-4b4d-8233-4e0819e6bb94" containerName="registry" May 11 20:51:58.742182 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:58.741945 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf01ba23-7a3f-4b4d-8233-4e0819e6bb94" containerName="registry" May 11 20:51:58.742182 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:58.742027 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf01ba23-7a3f-4b4d-8233-4e0819e6bb94" containerName="registry" May 11 20:51:58.751479 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:58.751451 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:51:58.753025 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:58.753003 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84cc9c967-m4qfl"] May 11 20:51:58.754444 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:58.754424 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" May 11 20:51:58.754552 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:58.754467 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" May 11 20:51:58.755591 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:58.755567 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-v587w\"" May 11 20:51:58.755591 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:58.755588 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" May 11 20:51:58.755755 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:58.755645 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" May 11 20:51:58.755755 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:58.755588 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" May 11 20:51:58.915093 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:58.915060 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9cf387a-c4a8-4998-b033-a3346f3997eb-console-serving-cert\") pod \"console-84cc9c967-m4qfl\" (UID: \"c9cf387a-c4a8-4998-b033-a3346f3997eb\") " pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:51:58.915093 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:58.915093 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c9cf387a-c4a8-4998-b033-a3346f3997eb-oauth-serving-cert\") pod \"console-84cc9c967-m4qfl\" (UID: \"c9cf387a-c4a8-4998-b033-a3346f3997eb\") " pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:51:58.915297 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:58.915135 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c9cf387a-c4a8-4998-b033-a3346f3997eb-console-config\") pod \"console-84cc9c967-m4qfl\" (UID: \"c9cf387a-c4a8-4998-b033-a3346f3997eb\") " pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:51:58.915297 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:58.915165 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c9cf387a-c4a8-4998-b033-a3346f3997eb-service-ca\") pod \"console-84cc9c967-m4qfl\" (UID: \"c9cf387a-c4a8-4998-b033-a3346f3997eb\") " pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:51:58.915297 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:58.915257 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c9cf387a-c4a8-4998-b033-a3346f3997eb-console-oauth-config\") pod \"console-84cc9c967-m4qfl\" (UID: 
\"c9cf387a-c4a8-4998-b033-a3346f3997eb\") " pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:51:58.915428 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:58.915296 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dhww\" (UniqueName: \"kubernetes.io/projected/c9cf387a-c4a8-4998-b033-a3346f3997eb-kube-api-access-5dhww\") pod \"console-84cc9c967-m4qfl\" (UID: \"c9cf387a-c4a8-4998-b033-a3346f3997eb\") " pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:51:59.016007 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:59.015914 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9cf387a-c4a8-4998-b033-a3346f3997eb-console-serving-cert\") pod \"console-84cc9c967-m4qfl\" (UID: \"c9cf387a-c4a8-4998-b033-a3346f3997eb\") " pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:51:59.016007 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:59.015954 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c9cf387a-c4a8-4998-b033-a3346f3997eb-oauth-serving-cert\") pod \"console-84cc9c967-m4qfl\" (UID: \"c9cf387a-c4a8-4998-b033-a3346f3997eb\") " pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:51:59.016223 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:59.016023 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c9cf387a-c4a8-4998-b033-a3346f3997eb-console-config\") pod \"console-84cc9c967-m4qfl\" (UID: \"c9cf387a-c4a8-4998-b033-a3346f3997eb\") " pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:51:59.016223 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:59.016048 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c9cf387a-c4a8-4998-b033-a3346f3997eb-service-ca\") pod \"console-84cc9c967-m4qfl\" (UID: \"c9cf387a-c4a8-4998-b033-a3346f3997eb\") " pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:51:59.016223 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:59.016195 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c9cf387a-c4a8-4998-b033-a3346f3997eb-console-oauth-config\") pod \"console-84cc9c967-m4qfl\" (UID: \"c9cf387a-c4a8-4998-b033-a3346f3997eb\") " pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:51:59.016369 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:59.016233 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dhww\" (UniqueName: \"kubernetes.io/projected/c9cf387a-c4a8-4998-b033-a3346f3997eb-kube-api-access-5dhww\") pod \"console-84cc9c967-m4qfl\" (UID: \"c9cf387a-c4a8-4998-b033-a3346f3997eb\") " pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:51:59.016912 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:59.016743 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c9cf387a-c4a8-4998-b033-a3346f3997eb-oauth-serving-cert\") pod \"console-84cc9c967-m4qfl\" (UID: \"c9cf387a-c4a8-4998-b033-a3346f3997eb\") " pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:51:59.016912 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:59.016850 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/c9cf387a-c4a8-4998-b033-a3346f3997eb-service-ca\") pod \"console-84cc9c967-m4qfl\" (UID: \"c9cf387a-c4a8-4998-b033-a3346f3997eb\") " pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:51:59.016912 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:59.016854 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c9cf387a-c4a8-4998-b033-a3346f3997eb-console-config\") pod \"console-84cc9c967-m4qfl\" (UID: \"c9cf387a-c4a8-4998-b033-a3346f3997eb\") " pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:51:59.018701 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:59.018679 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9cf387a-c4a8-4998-b033-a3346f3997eb-console-serving-cert\") pod \"console-84cc9c967-m4qfl\" (UID: \"c9cf387a-c4a8-4998-b033-a3346f3997eb\") " pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:51:59.018815 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:59.018801 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c9cf387a-c4a8-4998-b033-a3346f3997eb-console-oauth-config\") pod \"console-84cc9c967-m4qfl\" (UID: \"c9cf387a-c4a8-4998-b033-a3346f3997eb\") " pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:51:59.024849 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:59.024825 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dhww\" (UniqueName: \"kubernetes.io/projected/c9cf387a-c4a8-4998-b033-a3346f3997eb-kube-api-access-5dhww\") pod \"console-84cc9c967-m4qfl\" (UID: \"c9cf387a-c4a8-4998-b033-a3346f3997eb\") " pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:51:59.063670 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:51:59.063644 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:52:03.798351 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:03.798325 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84cc9c967-m4qfl"] May 11 20:52:03.800694 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:52:03.800665 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9cf387a_c4a8_4998_b033_a3346f3997eb.slice/crio-60c93a71e9c1bb5953dd8c051136e1dc3f1947b5206d19900a805d59ccb9078a WatchSource:0}: Error finding container 60c93a71e9c1bb5953dd8c051136e1dc3f1947b5206d19900a805d59ccb9078a: Status 404 returned error can't find the container with id 60c93a71e9c1bb5953dd8c051136e1dc3f1947b5206d19900a805d59ccb9078a May 11 20:52:03.898921 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:03.898885 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84cc9c967-m4qfl" event={"ID":"c9cf387a-c4a8-4998-b033-a3346f3997eb","Type":"ContainerStarted","Data":"60c93a71e9c1bb5953dd8c051136e1dc3f1947b5206d19900a805d59ccb9078a"} May 11 20:52:04.811924 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:04.811534 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-m9tgf" May 11 20:52:04.904657 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:04.904601 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-c8c4474bc-4m7jz" event={"ID":"97d80799-b6a0-4fad-bb15-1b33a216aa97","Type":"ContainerStarted","Data":"bb3263e18ab9f32219f489c57ee9b1b6bd0eba421d6dc861677316d1be1e2171"} May 11 20:52:04.905529 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:04.905501 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-c8c4474bc-4m7jz" May 11 20:52:04.918008 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:04.917981 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-c8c4474bc-4m7jz" May 11 20:52:04.923251 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:04.923203 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-c8c4474bc-4m7jz" podStartSLOduration=1.343351965 podStartE2EDuration="16.923186426s" podCreationTimestamp="2026-05-11 20:51:48 +0000 UTC" firstStartedPulling="2026-05-11 20:51:48.485568112 +0000 UTC m=+86.595348535" lastFinishedPulling="2026-05-11 20:52:04.065402558 +0000 UTC m=+102.175182996" observedRunningTime="2026-05-11 20:52:04.921177376 +0000 UTC m=+103.030957822" watchObservedRunningTime="2026-05-11 20:52:04.923186426 +0000 UTC m=+103.032966878" May 11 20:52:07.919823 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:07.919784 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84cc9c967-m4qfl" event={"ID":"c9cf387a-c4a8-4998-b033-a3346f3997eb","Type":"ContainerStarted","Data":"8a4747e52f0bdef057e68fb762203852b80cca262192e987feb64c85e1b3513f"} May 11 20:52:07.937274 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:07.937230 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-84cc9c967-m4qfl" podStartSLOduration=6.427492223 podStartE2EDuration="9.937216018s" podCreationTimestamp="2026-05-11 20:51:58 +0000 UTC" firstStartedPulling="2026-05-11 20:52:03.802859073 +0000 UTC m=+101.912639499" lastFinishedPulling="2026-05-11 20:52:07.312582853 
+0000 UTC m=+105.422363294" observedRunningTime="2026-05-11 20:52:07.935292526 +0000 UTC m=+106.045072985" watchObservedRunningTime="2026-05-11 20:52:07.937216018 +0000 UTC m=+106.046996463" May 11 20:52:08.320093 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.320055 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-88b88d4cf-nlcxw"] May 11 20:52:08.338119 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.338090 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-88b88d4cf-nlcxw"] May 11 20:52:08.338268 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.338157 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:08.353037 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.353016 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" May 11 20:52:08.497380 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.497346 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0234f951-13b4-45ba-8ef4-670e80fd5a9c-console-oauth-config\") pod \"console-88b88d4cf-nlcxw\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:08.497380 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.497389 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0234f951-13b4-45ba-8ef4-670e80fd5a9c-console-serving-cert\") pod \"console-88b88d4cf-nlcxw\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:08.497621 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.497429 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0234f951-13b4-45ba-8ef4-670e80fd5a9c-console-config\") pod \"console-88b88d4cf-nlcxw\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:08.497621 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.497469 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0234f951-13b4-45ba-8ef4-670e80fd5a9c-service-ca\") pod \"console-88b88d4cf-nlcxw\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:08.497621 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.497498 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0234f951-13b4-45ba-8ef4-670e80fd5a9c-oauth-serving-cert\") pod \"console-88b88d4cf-nlcxw\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:08.497621 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.497524 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0234f951-13b4-45ba-8ef4-670e80fd5a9c-trusted-ca-bundle\") pod \"console-88b88d4cf-nlcxw\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " pod="openshift-console/console-88b88d4cf-nlcxw" May 11 
20:52:08.497621 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.497554 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrs2b\" (UniqueName: \"kubernetes.io/projected/0234f951-13b4-45ba-8ef4-670e80fd5a9c-kube-api-access-jrs2b\") pod \"console-88b88d4cf-nlcxw\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:08.598391 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.598313 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0234f951-13b4-45ba-8ef4-670e80fd5a9c-console-oauth-config\") pod \"console-88b88d4cf-nlcxw\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:08.598552 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.598387 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0234f951-13b4-45ba-8ef4-670e80fd5a9c-console-serving-cert\") pod \"console-88b88d4cf-nlcxw\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:08.598552 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.598475 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0234f951-13b4-45ba-8ef4-670e80fd5a9c-console-config\") pod \"console-88b88d4cf-nlcxw\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:08.598552 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.598524 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0234f951-13b4-45ba-8ef4-670e80fd5a9c-service-ca\") pod \"console-88b88d4cf-nlcxw\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:08.598719 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.598560 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0234f951-13b4-45ba-8ef4-670e80fd5a9c-oauth-serving-cert\") pod \"console-88b88d4cf-nlcxw\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:08.598719 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.598598 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0234f951-13b4-45ba-8ef4-670e80fd5a9c-trusted-ca-bundle\") pod \"console-88b88d4cf-nlcxw\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:08.598719 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.598634 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrs2b\" (UniqueName: \"kubernetes.io/projected/0234f951-13b4-45ba-8ef4-670e80fd5a9c-kube-api-access-jrs2b\") pod \"console-88b88d4cf-nlcxw\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:08.599724 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.599297 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0234f951-13b4-45ba-8ef4-670e80fd5a9c-service-ca\") pod \"console-88b88d4cf-nlcxw\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:08.599724 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.599317 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0234f951-13b4-45ba-8ef4-670e80fd5a9c-console-config\") pod \"console-88b88d4cf-nlcxw\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:08.600087 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.600045 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0234f951-13b4-45ba-8ef4-670e80fd5a9c-trusted-ca-bundle\") pod \"console-88b88d4cf-nlcxw\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:08.600173 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.600147 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0234f951-13b4-45ba-8ef4-670e80fd5a9c-oauth-serving-cert\") pod \"console-88b88d4cf-nlcxw\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:08.601424 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.601398 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0234f951-13b4-45ba-8ef4-670e80fd5a9c-console-oauth-config\") pod \"console-88b88d4cf-nlcxw\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:08.601521 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.601411 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0234f951-13b4-45ba-8ef4-670e80fd5a9c-console-serving-cert\") pod \"console-88b88d4cf-nlcxw\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:08.608088 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.608065 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrs2b\" (UniqueName: \"kubernetes.io/projected/0234f951-13b4-45ba-8ef4-670e80fd5a9c-kube-api-access-jrs2b\") pod \"console-88b88d4cf-nlcxw\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:08.666018 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.665990 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:08.813657 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.813628 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-88b88d4cf-nlcxw"] May 11 20:52:08.816150 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:52:08.816122 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0234f951_13b4_45ba_8ef4_670e80fd5a9c.slice/crio-02bfe020b39b9346cf4fd92e3c23b0f5cee55225d4a53cef3d758ffd1acb66a6 WatchSource:0}: Error finding container 02bfe020b39b9346cf4fd92e3c23b0f5cee55225d4a53cef3d758ffd1acb66a6: Status 404 returned error can't find the container with id 02bfe020b39b9346cf4fd92e3c23b0f5cee55225d4a53cef3d758ffd1acb66a6 May 11 20:52:08.923568 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:08.923541 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-88b88d4cf-nlcxw" event={"ID":"0234f951-13b4-45ba-8ef4-670e80fd5a9c","Type":"ContainerStarted","Data":"02bfe020b39b9346cf4fd92e3c23b0f5cee55225d4a53cef3d758ffd1acb66a6"} May 11 20:52:09.064355 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:09.064323 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:52:09.064508 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:09.064375 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:52:09.069323 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:09.069295 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:52:09.928164 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:09.928124 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-88b88d4cf-nlcxw" event={"ID":"0234f951-13b4-45ba-8ef4-670e80fd5a9c","Type":"ContainerStarted","Data":"aa5bbc90b97447fdee7bb6440fd2253ed028d8463b7c2072a299fc9cffdefa54"} May 11 20:52:09.932719 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:09.932695 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:52:09.945889 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:09.945846 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-88b88d4cf-nlcxw" podStartSLOduration=1.945833814 podStartE2EDuration="1.945833814s" podCreationTimestamp="2026-05-11 20:52:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 20:52:09.945360066 +0000 UTC m=+108.055140512" watchObservedRunningTime="2026-05-11 20:52:09.945833814 +0000 UTC m=+108.055614258" May 11 20:52:18.667114 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:18.667081 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:18.667114 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:18.667116 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:18.671652 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:18.671632 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:18.955072 ip-10-0-128-58 
kubenswrapper[2567]: I0511 20:52:18.954999 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:52:19.033779 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:19.033754 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84cc9c967-m4qfl"] May 11 20:52:44.053155 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:44.053099 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-84cc9c967-m4qfl" podUID="c9cf387a-c4a8-4998-b033-a3346f3997eb" containerName="console" containerID="cri-o://8a4747e52f0bdef057e68fb762203852b80cca262192e987feb64c85e1b3513f" gracePeriod=15 May 11 20:52:44.317657 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:44.317637 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84cc9c967-m4qfl_c9cf387a-c4a8-4998-b033-a3346f3997eb/console/0.log" May 11 20:52:44.317757 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:44.317693 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:52:44.346345 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:44.346322 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9cf387a-c4a8-4998-b033-a3346f3997eb-console-serving-cert\") pod \"c9cf387a-c4a8-4998-b033-a3346f3997eb\" (UID: \"c9cf387a-c4a8-4998-b033-a3346f3997eb\") " May 11 20:52:44.346478 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:44.346365 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c9cf387a-c4a8-4998-b033-a3346f3997eb-oauth-serving-cert\") pod \"c9cf387a-c4a8-4998-b033-a3346f3997eb\" (UID: \"c9cf387a-c4a8-4998-b033-a3346f3997eb\") " May 11 20:52:44.346478 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:44.346413 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c9cf387a-c4a8-4998-b033-a3346f3997eb-console-config\") pod \"c9cf387a-c4a8-4998-b033-a3346f3997eb\" (UID: \"c9cf387a-c4a8-4998-b033-a3346f3997eb\") " May 11 20:52:44.346478 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:44.346440 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c9cf387a-c4a8-4998-b033-a3346f3997eb-console-oauth-config\") pod \"c9cf387a-c4a8-4998-b033-a3346f3997eb\" (UID: \"c9cf387a-c4a8-4998-b033-a3346f3997eb\") " May 11 20:52:44.346478 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:44.346474 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dhww\" (UniqueName: \"kubernetes.io/projected/c9cf387a-c4a8-4998-b033-a3346f3997eb-kube-api-access-5dhww\") pod \"c9cf387a-c4a8-4998-b033-a3346f3997eb\" (UID: \"c9cf387a-c4a8-4998-b033-a3346f3997eb\") " May 11 20:52:44.346696 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:44.346519 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c9cf387a-c4a8-4998-b033-a3346f3997eb-service-ca\") pod \"c9cf387a-c4a8-4998-b033-a3346f3997eb\" (UID: \"c9cf387a-c4a8-4998-b033-a3346f3997eb\") " May 11 20:52:44.346939 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:44.346810 2567 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9cf387a-c4a8-4998-b033-a3346f3997eb-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c9cf387a-c4a8-4998-b033-a3346f3997eb" (UID: "c9cf387a-c4a8-4998-b033-a3346f3997eb"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 11 20:52:44.346939 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:44.346918 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9cf387a-c4a8-4998-b033-a3346f3997eb-console-config" (OuterVolumeSpecName: "console-config") pod "c9cf387a-c4a8-4998-b033-a3346f3997eb" (UID: "c9cf387a-c4a8-4998-b033-a3346f3997eb"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 11 20:52:44.347205 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:44.347165 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9cf387a-c4a8-4998-b033-a3346f3997eb-service-ca" (OuterVolumeSpecName: "service-ca") pod "c9cf387a-c4a8-4998-b033-a3346f3997eb" (UID: "c9cf387a-c4a8-4998-b033-a3346f3997eb"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 11 20:52:44.348801 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:44.348767 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cf387a-c4a8-4998-b033-a3346f3997eb-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c9cf387a-c4a8-4998-b033-a3346f3997eb" (UID: "c9cf387a-c4a8-4998-b033-a3346f3997eb"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:52:44.349011 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:44.348987 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cf387a-c4a8-4998-b033-a3346f3997eb-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c9cf387a-c4a8-4998-b033-a3346f3997eb" (UID: "c9cf387a-c4a8-4998-b033-a3346f3997eb"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:52:44.349064 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:44.349001 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9cf387a-c4a8-4998-b033-a3346f3997eb-kube-api-access-5dhww" (OuterVolumeSpecName: "kube-api-access-5dhww") pod "c9cf387a-c4a8-4998-b033-a3346f3997eb" (UID: "c9cf387a-c4a8-4998-b033-a3346f3997eb"). InnerVolumeSpecName "kube-api-access-5dhww". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:52:44.447328 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:44.447297 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c9cf387a-c4a8-4998-b033-a3346f3997eb-service-ca\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:52:44.447328 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:44.447323 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c9cf387a-c4a8-4998-b033-a3346f3997eb-console-serving-cert\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:52:44.447510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:44.447337 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c9cf387a-c4a8-4998-b033-a3346f3997eb-oauth-serving-cert\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:52:44.447510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:44.447350 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c9cf387a-c4a8-4998-b033-a3346f3997eb-console-config\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:52:44.447510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:44.447363 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c9cf387a-c4a8-4998-b033-a3346f3997eb-console-oauth-config\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:52:44.447510 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:44.447374 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5dhww\" (UniqueName: \"kubernetes.io/projected/c9cf387a-c4a8-4998-b033-a3346f3997eb-kube-api-access-5dhww\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:52:45.017748 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:45.017722 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84cc9c967-m4qfl_c9cf387a-c4a8-4998-b033-a3346f3997eb/console/0.log" May 11 20:52:45.017894 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:45.017759 2567 generic.go:358] "Generic (PLEG): container finished" podID="c9cf387a-c4a8-4998-b033-a3346f3997eb" containerID="8a4747e52f0bdef057e68fb762203852b80cca262192e987feb64c85e1b3513f" exitCode=2 May 11 20:52:45.017894 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:45.017788 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84cc9c967-m4qfl" event={"ID":"c9cf387a-c4a8-4998-b033-a3346f3997eb","Type":"ContainerDied","Data":"8a4747e52f0bdef057e68fb762203852b80cca262192e987feb64c85e1b3513f"} May 11 20:52:45.017894 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:45.017824 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84cc9c967-m4qfl" May 11 20:52:45.017894 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:45.017835 2567 scope.go:117] "RemoveContainer" containerID="8a4747e52f0bdef057e68fb762203852b80cca262192e987feb64c85e1b3513f" May 11 20:52:45.018100 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:45.017826 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84cc9c967-m4qfl" event={"ID":"c9cf387a-c4a8-4998-b033-a3346f3997eb","Type":"ContainerDied","Data":"60c93a71e9c1bb5953dd8c051136e1dc3f1947b5206d19900a805d59ccb9078a"} May 11 20:52:45.025353 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:45.025339 2567 scope.go:117] "RemoveContainer" containerID="8a4747e52f0bdef057e68fb762203852b80cca262192e987feb64c85e1b3513f" May 11 20:52:45.025606 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:52:45.025588 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a4747e52f0bdef057e68fb762203852b80cca262192e987feb64c85e1b3513f\": container with ID starting with 8a4747e52f0bdef057e68fb762203852b80cca262192e987feb64c85e1b3513f not found: ID does not exist" containerID="8a4747e52f0bdef057e68fb762203852b80cca262192e987feb64c85e1b3513f" May 11 20:52:45.025660 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:45.025613 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a4747e52f0bdef057e68fb762203852b80cca262192e987feb64c85e1b3513f"} err="failed to get container status \"8a4747e52f0bdef057e68fb762203852b80cca262192e987feb64c85e1b3513f\": rpc error: code = NotFound desc = could not find container \"8a4747e52f0bdef057e68fb762203852b80cca262192e987feb64c85e1b3513f\": container with ID starting with 8a4747e52f0bdef057e68fb762203852b80cca262192e987feb64c85e1b3513f not found: ID does not exist" May 11 20:52:45.033664 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:45.033644 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84cc9c967-m4qfl"] May 11 20:52:45.037185 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:45.037169 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-84cc9c967-m4qfl"] May 11 20:52:46.478297 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:52:46.478270 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9cf387a-c4a8-4998-b033-a3346f3997eb" path="/var/lib/kubelet/pods/c9cf387a-c4a8-4998-b033-a3346f3997eb/volumes" May 11 20:53:09.123222 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.123192 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-cc6bffd59-j27hg"] May 11 20:53:09.123683 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.123454 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9cf387a-c4a8-4998-b033-a3346f3997eb" containerName="console" May 11 20:53:09.123683 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.123466 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cf387a-c4a8-4998-b033-a3346f3997eb" containerName="console" May 11 20:53:09.123683 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.123510 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9cf387a-c4a8-4998-b033-a3346f3997eb" containerName="console" May 11 20:53:09.126231 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.126215 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:09.137763 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.137739 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cc6bffd59-j27hg"] May 11 20:53:09.221131 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.221106 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-console-serving-cert\") pod \"console-cc6bffd59-j27hg\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:09.221264 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.221142 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-oauth-serving-cert\") pod \"console-cc6bffd59-j27hg\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:09.221264 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.221158 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-console-config\") pod \"console-cc6bffd59-j27hg\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:09.221264 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.221207 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-console-oauth-config\") pod \"console-cc6bffd59-j27hg\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:09.221389 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.221265 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bb6t\" (UniqueName: \"kubernetes.io/projected/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-kube-api-access-2bb6t\") pod \"console-cc6bffd59-j27hg\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:09.221389 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.221287 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-service-ca\") pod \"console-cc6bffd59-j27hg\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:09.221389 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.221314 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-trusted-ca-bundle\") pod \"console-cc6bffd59-j27hg\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:09.322518 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.322495 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-trusted-ca-bundle\") pod 
\"console-cc6bffd59-j27hg\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:09.322614 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.322524 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-console-serving-cert\") pod \"console-cc6bffd59-j27hg\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:09.322614 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.322545 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-oauth-serving-cert\") pod \"console-cc6bffd59-j27hg\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:09.322614 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.322563 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-console-config\") pod \"console-cc6bffd59-j27hg\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:09.322614 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.322579 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-console-oauth-config\") pod \"console-cc6bffd59-j27hg\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:09.322614 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.322597 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bb6t\" (UniqueName: \"kubernetes.io/projected/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-kube-api-access-2bb6t\") pod \"console-cc6bffd59-j27hg\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:09.322911 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.322618 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-service-ca\") pod \"console-cc6bffd59-j27hg\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:09.323271 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.323251 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-console-config\") pod \"console-cc6bffd59-j27hg\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:09.323349 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.323252 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-oauth-serving-cert\") pod \"console-cc6bffd59-j27hg\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:09.323388 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.323340 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-service-ca\") pod \"console-cc6bffd59-j27hg\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:09.323571 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.323553 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-trusted-ca-bundle\") pod \"console-cc6bffd59-j27hg\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:09.325471 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.325449 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-console-serving-cert\") pod \"console-cc6bffd59-j27hg\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:09.325544 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.325476 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-console-oauth-config\") pod \"console-cc6bffd59-j27hg\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:09.331193 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.331170 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bb6t\" (UniqueName: \"kubernetes.io/projected/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-kube-api-access-2bb6t\") pod \"console-cc6bffd59-j27hg\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:09.435168 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.435118 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:09.557479 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:09.557449 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cc6bffd59-j27hg"] May 11 20:53:09.561334 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:53:09.561308 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42ed45ea_4a8e_4f48_8128_6b7beeb7a016.slice/crio-90db55d1ecb4a11b232a7a18d837bdaef6bd43ae8d86c6159885610356793f89 WatchSource:0}: Error finding container 90db55d1ecb4a11b232a7a18d837bdaef6bd43ae8d86c6159885610356793f89: Status 404 returned error can't find the container with id 90db55d1ecb4a11b232a7a18d837bdaef6bd43ae8d86c6159885610356793f89 May 11 20:53:10.080123 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:10.080084 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cc6bffd59-j27hg" event={"ID":"42ed45ea-4a8e-4f48-8128-6b7beeb7a016","Type":"ContainerStarted","Data":"ceced453fc743793938e5039586c37ace40dd569b8fee69b48101d1552447535"} May 11 20:53:10.080123 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:10.080121 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cc6bffd59-j27hg" event={"ID":"42ed45ea-4a8e-4f48-8128-6b7beeb7a016","Type":"ContainerStarted","Data":"90db55d1ecb4a11b232a7a18d837bdaef6bd43ae8d86c6159885610356793f89"} May 11 20:53:10.097476 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:10.097432 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-cc6bffd59-j27hg" podStartSLOduration=1.097416174 podStartE2EDuration="1.097416174s" podCreationTimestamp="2026-05-11 20:53:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 20:53:10.096318958 +0000 UTC m=+168.206099447" watchObservedRunningTime="2026-05-11 20:53:10.097416174 +0000 UTC m=+168.207196619" May 11 20:53:19.435282 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:19.435243 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:19.435282 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:19.435288 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:19.439660 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:19.439642 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:20.109324 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:20.109292 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:53:20.155824 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:20.155790 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-88b88d4cf-nlcxw"] May 11 20:53:45.175484 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:45.175448 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-88b88d4cf-nlcxw" podUID="0234f951-13b4-45ba-8ef4-670e80fd5a9c" containerName="console" containerID="cri-o://aa5bbc90b97447fdee7bb6440fd2253ed028d8463b7c2072a299fc9cffdefa54" gracePeriod=15 May 11 20:53:45.408907 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:45.408887 2567 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-console_console-88b88d4cf-nlcxw_0234f951-13b4-45ba-8ef4-670e80fd5a9c/console/0.log" May 11 20:53:45.409035 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:45.408944 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:53:45.569176 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:45.569153 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0234f951-13b4-45ba-8ef4-670e80fd5a9c-console-config\") pod \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " May 11 20:53:45.569319 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:45.569189 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0234f951-13b4-45ba-8ef4-670e80fd5a9c-oauth-serving-cert\") pod \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " May 11 20:53:45.569319 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:45.569218 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrs2b\" (UniqueName: \"kubernetes.io/projected/0234f951-13b4-45ba-8ef4-670e80fd5a9c-kube-api-access-jrs2b\") pod \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " May 11 20:53:45.569319 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:45.569239 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0234f951-13b4-45ba-8ef4-670e80fd5a9c-console-serving-cert\") pod \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " May 11 20:53:45.569319 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:45.569268 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0234f951-13b4-45ba-8ef4-670e80fd5a9c-service-ca\") pod \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " May 11 20:53:45.569542 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:45.569322 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0234f951-13b4-45ba-8ef4-670e80fd5a9c-trusted-ca-bundle\") pod \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " May 11 20:53:45.569542 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:45.569350 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0234f951-13b4-45ba-8ef4-670e80fd5a9c-console-oauth-config\") pod \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\" (UID: \"0234f951-13b4-45ba-8ef4-670e80fd5a9c\") " May 11 20:53:45.569639 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:45.569601 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0234f951-13b4-45ba-8ef4-670e80fd5a9c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0234f951-13b4-45ba-8ef4-670e80fd5a9c" (UID: "0234f951-13b4-45ba-8ef4-670e80fd5a9c"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 11 20:53:45.569639 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:45.569518 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0234f951-13b4-45ba-8ef4-670e80fd5a9c-console-config" (OuterVolumeSpecName: "console-config") pod "0234f951-13b4-45ba-8ef4-670e80fd5a9c" (UID: "0234f951-13b4-45ba-8ef4-670e80fd5a9c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 11 20:53:45.569757 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:45.569731 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0234f951-13b4-45ba-8ef4-670e80fd5a9c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0234f951-13b4-45ba-8ef4-670e80fd5a9c" (UID: "0234f951-13b4-45ba-8ef4-670e80fd5a9c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 11 20:53:45.569840 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:45.569813 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0234f951-13b4-45ba-8ef4-670e80fd5a9c-service-ca" (OuterVolumeSpecName: "service-ca") pod "0234f951-13b4-45ba-8ef4-670e80fd5a9c" (UID: "0234f951-13b4-45ba-8ef4-670e80fd5a9c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 11 20:53:45.571343 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:45.571317 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0234f951-13b4-45ba-8ef4-670e80fd5a9c-kube-api-access-jrs2b" (OuterVolumeSpecName: "kube-api-access-jrs2b") pod "0234f951-13b4-45ba-8ef4-670e80fd5a9c" (UID: "0234f951-13b4-45ba-8ef4-670e80fd5a9c"). InnerVolumeSpecName "kube-api-access-jrs2b". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:53:45.571447 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:45.571375 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0234f951-13b4-45ba-8ef4-670e80fd5a9c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0234f951-13b4-45ba-8ef4-670e80fd5a9c" (UID: "0234f951-13b4-45ba-8ef4-670e80fd5a9c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:53:45.571447 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:45.571399 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0234f951-13b4-45ba-8ef4-670e80fd5a9c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0234f951-13b4-45ba-8ef4-670e80fd5a9c" (UID: "0234f951-13b4-45ba-8ef4-670e80fd5a9c"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:53:45.669778 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:45.669757 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0234f951-13b4-45ba-8ef4-670e80fd5a9c-trusted-ca-bundle\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:53:45.669778 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:45.669777 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0234f951-13b4-45ba-8ef4-670e80fd5a9c-console-oauth-config\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:53:45.669908 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:45.669789 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0234f951-13b4-45ba-8ef4-670e80fd5a9c-console-config\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:53:45.669908 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:45.669798 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0234f951-13b4-45ba-8ef4-670e80fd5a9c-oauth-serving-cert\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:53:45.669908 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:45.669807 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jrs2b\" (UniqueName: \"kubernetes.io/projected/0234f951-13b4-45ba-8ef4-670e80fd5a9c-kube-api-access-jrs2b\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:53:45.669908 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:45.669817 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0234f951-13b4-45ba-8ef4-670e80fd5a9c-console-serving-cert\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:53:45.669908 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:45.669826 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0234f951-13b4-45ba-8ef4-670e80fd5a9c-service-ca\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:53:46.172018 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:46.171992 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-88b88d4cf-nlcxw_0234f951-13b4-45ba-8ef4-670e80fd5a9c/console/0.log" May 11 20:53:46.172172 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:46.172033 2567 generic.go:358] "Generic (PLEG): container finished" podID="0234f951-13b4-45ba-8ef4-670e80fd5a9c" containerID="aa5bbc90b97447fdee7bb6440fd2253ed028d8463b7c2072a299fc9cffdefa54" exitCode=2 May 11 20:53:46.172172 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:46.172069 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-88b88d4cf-nlcxw" event={"ID":"0234f951-13b4-45ba-8ef4-670e80fd5a9c","Type":"ContainerDied","Data":"aa5bbc90b97447fdee7bb6440fd2253ed028d8463b7c2072a299fc9cffdefa54"} May 11 20:53:46.172172 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:46.172097 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-88b88d4cf-nlcxw" May 11 20:53:46.172172 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:46.172113 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-88b88d4cf-nlcxw" event={"ID":"0234f951-13b4-45ba-8ef4-670e80fd5a9c","Type":"ContainerDied","Data":"02bfe020b39b9346cf4fd92e3c23b0f5cee55225d4a53cef3d758ffd1acb66a6"} May 11 20:53:46.172172 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:46.172132 2567 scope.go:117] "RemoveContainer" containerID="aa5bbc90b97447fdee7bb6440fd2253ed028d8463b7c2072a299fc9cffdefa54" May 11 20:53:46.180296 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:46.180008 2567 scope.go:117] "RemoveContainer" containerID="aa5bbc90b97447fdee7bb6440fd2253ed028d8463b7c2072a299fc9cffdefa54" May 11 20:53:46.180516 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:53:46.180372 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa5bbc90b97447fdee7bb6440fd2253ed028d8463b7c2072a299fc9cffdefa54\": container with ID starting with aa5bbc90b97447fdee7bb6440fd2253ed028d8463b7c2072a299fc9cffdefa54 not found: ID does not exist" containerID="aa5bbc90b97447fdee7bb6440fd2253ed028d8463b7c2072a299fc9cffdefa54" May 11 20:53:46.180516 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:46.180401 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa5bbc90b97447fdee7bb6440fd2253ed028d8463b7c2072a299fc9cffdefa54"} err="failed to get container status \"aa5bbc90b97447fdee7bb6440fd2253ed028d8463b7c2072a299fc9cffdefa54\": rpc error: code = NotFound desc = could not find container \"aa5bbc90b97447fdee7bb6440fd2253ed028d8463b7c2072a299fc9cffdefa54\": container with ID starting with aa5bbc90b97447fdee7bb6440fd2253ed028d8463b7c2072a299fc9cffdefa54 not found: ID does not exist" May 11 20:53:46.193002 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:46.192977 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-88b88d4cf-nlcxw"] May 11 20:53:46.198694 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:46.198675 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-88b88d4cf-nlcxw"] May 11 20:53:46.478284 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:53:46.478221 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0234f951-13b4-45ba-8ef4-670e80fd5a9c" path="/var/lib/kubelet/pods/0234f951-13b4-45ba-8ef4-670e80fd5a9c/volumes" May 11 20:54:31.863450 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:31.863375 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b596b74fc-kt9hj"] May 11 20:54:31.863859 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:31.863608 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0234f951-13b4-45ba-8ef4-670e80fd5a9c" containerName="console" May 11 20:54:31.863859 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:31.863619 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="0234f951-13b4-45ba-8ef4-670e80fd5a9c" containerName="console" May 11 20:54:31.863859 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:31.863668 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="0234f951-13b4-45ba-8ef4-670e80fd5a9c" containerName="console" May 11 20:54:31.866569 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:31.866552 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:31.875040 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:31.875017 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b596b74fc-kt9hj"] May 11 20:54:31.961409 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:31.961379 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/856bf9b1-3cba-494a-a778-81c13fdab888-console-oauth-config\") pod \"console-7b596b74fc-kt9hj\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:31.961409 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:31.961408 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/856bf9b1-3cba-494a-a778-81c13fdab888-trusted-ca-bundle\") pod \"console-7b596b74fc-kt9hj\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:31.961561 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:31.961426 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/856bf9b1-3cba-494a-a778-81c13fdab888-console-serving-cert\") pod \"console-7b596b74fc-kt9hj\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:31.961561 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:31.961449 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/856bf9b1-3cba-494a-a778-81c13fdab888-oauth-serving-cert\") pod \"console-7b596b74fc-kt9hj\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:31.961561 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:31.961539 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/856bf9b1-3cba-494a-a778-81c13fdab888-console-config\") pod \"console-7b596b74fc-kt9hj\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:31.961655 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:31.961570 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5nf5\" (UniqueName: \"kubernetes.io/projected/856bf9b1-3cba-494a-a778-81c13fdab888-kube-api-access-p5nf5\") pod \"console-7b596b74fc-kt9hj\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:31.961655 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:31.961601 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/856bf9b1-3cba-494a-a778-81c13fdab888-service-ca\") pod \"console-7b596b74fc-kt9hj\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:32.062695 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:32.062658 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5nf5\" (UniqueName: 
\"kubernetes.io/projected/856bf9b1-3cba-494a-a778-81c13fdab888-kube-api-access-p5nf5\") pod \"console-7b596b74fc-kt9hj\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:32.062695 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:32.062700 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/856bf9b1-3cba-494a-a778-81c13fdab888-service-ca\") pod \"console-7b596b74fc-kt9hj\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:32.062835 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:32.062816 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/856bf9b1-3cba-494a-a778-81c13fdab888-console-oauth-config\") pod \"console-7b596b74fc-kt9hj\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:32.062869 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:32.062840 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/856bf9b1-3cba-494a-a778-81c13fdab888-trusted-ca-bundle\") pod \"console-7b596b74fc-kt9hj\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:32.062869 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:32.062860 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/856bf9b1-3cba-494a-a778-81c13fdab888-console-serving-cert\") pod \"console-7b596b74fc-kt9hj\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:32.062947 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:32.062880 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/856bf9b1-3cba-494a-a778-81c13fdab888-oauth-serving-cert\") pod \"console-7b596b74fc-kt9hj\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:32.062947 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:32.062940 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/856bf9b1-3cba-494a-a778-81c13fdab888-console-config\") pod \"console-7b596b74fc-kt9hj\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:32.063375 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:32.063351 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/856bf9b1-3cba-494a-a778-81c13fdab888-service-ca\") pod \"console-7b596b74fc-kt9hj\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:32.063514 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:32.063495 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/856bf9b1-3cba-494a-a778-81c13fdab888-oauth-serving-cert\") pod \"console-7b596b74fc-kt9hj\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:32.063738 ip-10-0-128-58 kubenswrapper[2567]: I0511 
20:54:32.063716 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/856bf9b1-3cba-494a-a778-81c13fdab888-trusted-ca-bundle\") pod \"console-7b596b74fc-kt9hj\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:32.063806 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:32.063716 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/856bf9b1-3cba-494a-a778-81c13fdab888-console-config\") pod \"console-7b596b74fc-kt9hj\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:32.065268 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:32.065239 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/856bf9b1-3cba-494a-a778-81c13fdab888-console-serving-cert\") pod \"console-7b596b74fc-kt9hj\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:32.065352 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:32.065279 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/856bf9b1-3cba-494a-a778-81c13fdab888-console-oauth-config\") pod \"console-7b596b74fc-kt9hj\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:32.070363 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:32.070340 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5nf5\" (UniqueName: \"kubernetes.io/projected/856bf9b1-3cba-494a-a778-81c13fdab888-kube-api-access-p5nf5\") pod \"console-7b596b74fc-kt9hj\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:32.176373 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:32.176312 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:32.296249 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:32.296222 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b596b74fc-kt9hj"] May 11 20:54:32.299117 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:54:32.299087 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod856bf9b1_3cba_494a_a778_81c13fdab888.slice/crio-1309458bdb834a2db4edd4d5bc087e37313c6396e658bf84efeb7366e1bc41e4 WatchSource:0}: Error finding container 1309458bdb834a2db4edd4d5bc087e37313c6396e658bf84efeb7366e1bc41e4: Status 404 returned error can't find the container with id 1309458bdb834a2db4edd4d5bc087e37313c6396e658bf84efeb7366e1bc41e4 May 11 20:54:33.293701 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:33.293663 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b596b74fc-kt9hj" event={"ID":"856bf9b1-3cba-494a-a778-81c13fdab888","Type":"ContainerStarted","Data":"6a56e11b8b9e4ed4e1cf892b40032ac6eabc61416e6374c09ca82223d79d0a9e"} May 11 20:54:33.293701 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:33.293698 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b596b74fc-kt9hj" event={"ID":"856bf9b1-3cba-494a-a778-81c13fdab888","Type":"ContainerStarted","Data":"1309458bdb834a2db4edd4d5bc087e37313c6396e658bf84efeb7366e1bc41e4"} May 11 20:54:33.311712 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:33.311670 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b596b74fc-kt9hj" podStartSLOduration=2.311656742 podStartE2EDuration="2.311656742s" podCreationTimestamp="2026-05-11 20:54:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 20:54:33.310951657 +0000 UTC m=+251.420732104" watchObservedRunningTime="2026-05-11 20:54:33.311656742 +0000 UTC m=+251.421437187" May 11 20:54:42.176901 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:42.176865 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:42.177402 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:42.176945 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:42.181708 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:42.181687 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:42.323843 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:42.323819 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:54:42.386908 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:54:42.386882 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-cc6bffd59-j27hg"] May 11 20:55:07.405039 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:07.404980 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-cc6bffd59-j27hg" podUID="42ed45ea-4a8e-4f48-8128-6b7beeb7a016" containerName="console" containerID="cri-o://ceced453fc743793938e5039586c37ace40dd569b8fee69b48101d1552447535" gracePeriod=15 May 11 20:55:07.644788 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:07.644767 2567 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cc6bffd59-j27hg_42ed45ea-4a8e-4f48-8128-6b7beeb7a016/console/0.log" May 11 20:55:07.644891 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:07.644823 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:55:07.705817 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:07.705758 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-console-config\") pod \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " May 11 20:55:07.705817 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:07.705793 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bb6t\" (UniqueName: \"kubernetes.io/projected/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-kube-api-access-2bb6t\") pod \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " May 11 20:55:07.705817 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:07.705817 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-console-oauth-config\") pod \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " May 11 20:55:07.706007 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:07.705880 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-oauth-serving-cert\") pod \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " May 11 20:55:07.706007 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:07.705895 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-trusted-ca-bundle\") pod \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " May 11 20:55:07.706007 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:07.705910 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-service-ca\") pod \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " May 11 20:55:07.706007 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:07.705934 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-console-serving-cert\") pod \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\" (UID: \"42ed45ea-4a8e-4f48-8128-6b7beeb7a016\") " May 11 20:55:07.706279 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:07.706253 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-console-config" (OuterVolumeSpecName: "console-config") pod "42ed45ea-4a8e-4f48-8128-6b7beeb7a016" (UID: "42ed45ea-4a8e-4f48-8128-6b7beeb7a016"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 11 20:55:07.706352 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:07.706327 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "42ed45ea-4a8e-4f48-8128-6b7beeb7a016" (UID: "42ed45ea-4a8e-4f48-8128-6b7beeb7a016"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 11 20:55:07.706454 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:07.706433 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-service-ca" (OuterVolumeSpecName: "service-ca") pod "42ed45ea-4a8e-4f48-8128-6b7beeb7a016" (UID: "42ed45ea-4a8e-4f48-8128-6b7beeb7a016"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 11 20:55:07.706513 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:07.706489 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "42ed45ea-4a8e-4f48-8128-6b7beeb7a016" (UID: "42ed45ea-4a8e-4f48-8128-6b7beeb7a016"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 11 20:55:07.707880 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:07.707853 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-kube-api-access-2bb6t" (OuterVolumeSpecName: "kube-api-access-2bb6t") pod "42ed45ea-4a8e-4f48-8128-6b7beeb7a016" (UID: "42ed45ea-4a8e-4f48-8128-6b7beeb7a016"). InnerVolumeSpecName "kube-api-access-2bb6t". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:55:07.708031 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:07.708010 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "42ed45ea-4a8e-4f48-8128-6b7beeb7a016" (UID: "42ed45ea-4a8e-4f48-8128-6b7beeb7a016"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:55:07.708081 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:07.708032 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "42ed45ea-4a8e-4f48-8128-6b7beeb7a016" (UID: "42ed45ea-4a8e-4f48-8128-6b7beeb7a016"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:55:07.807110 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:07.807085 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-oauth-serving-cert\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:55:07.807110 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:07.807107 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-trusted-ca-bundle\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:55:07.807110 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:07.807116 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-service-ca\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:55:07.807280 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:07.807125 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-console-serving-cert\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:55:07.807280 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:07.807136 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-console-config\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:55:07.807280 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:07.807145 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2bb6t\" (UniqueName: \"kubernetes.io/projected/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-kube-api-access-2bb6t\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:55:07.807280 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:07.807155 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42ed45ea-4a8e-4f48-8128-6b7beeb7a016-console-oauth-config\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:55:08.384480 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:08.384454 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cc6bffd59-j27hg_42ed45ea-4a8e-4f48-8128-6b7beeb7a016/console/0.log" May 11 20:55:08.384629 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:08.384491 2567 generic.go:358] "Generic (PLEG): container finished" podID="42ed45ea-4a8e-4f48-8128-6b7beeb7a016" containerID="ceced453fc743793938e5039586c37ace40dd569b8fee69b48101d1552447535" exitCode=2 May 11 20:55:08.384629 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:08.384520 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cc6bffd59-j27hg" event={"ID":"42ed45ea-4a8e-4f48-8128-6b7beeb7a016","Type":"ContainerDied","Data":"ceced453fc743793938e5039586c37ace40dd569b8fee69b48101d1552447535"} May 11 20:55:08.384629 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:08.384558 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cc6bffd59-j27hg" event={"ID":"42ed45ea-4a8e-4f48-8128-6b7beeb7a016","Type":"ContainerDied","Data":"90db55d1ecb4a11b232a7a18d837bdaef6bd43ae8d86c6159885610356793f89"} May 11 20:55:08.384629 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:08.384562 2567 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cc6bffd59-j27hg" May 11 20:55:08.384629 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:08.384573 2567 scope.go:117] "RemoveContainer" containerID="ceced453fc743793938e5039586c37ace40dd569b8fee69b48101d1552447535" May 11 20:55:08.392441 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:08.392422 2567 scope.go:117] "RemoveContainer" containerID="ceced453fc743793938e5039586c37ace40dd569b8fee69b48101d1552447535" May 11 20:55:08.392709 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:55:08.392690 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceced453fc743793938e5039586c37ace40dd569b8fee69b48101d1552447535\": container with ID starting with ceced453fc743793938e5039586c37ace40dd569b8fee69b48101d1552447535 not found: ID does not exist" containerID="ceced453fc743793938e5039586c37ace40dd569b8fee69b48101d1552447535" May 11 20:55:08.392790 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:08.392716 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceced453fc743793938e5039586c37ace40dd569b8fee69b48101d1552447535"} err="failed to get container status \"ceced453fc743793938e5039586c37ace40dd569b8fee69b48101d1552447535\": rpc error: code = NotFound desc = could not find container \"ceced453fc743793938e5039586c37ace40dd569b8fee69b48101d1552447535\": container with ID starting with ceced453fc743793938e5039586c37ace40dd569b8fee69b48101d1552447535 not found: ID does not exist" May 11 20:55:08.405036 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:08.405010 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-cc6bffd59-j27hg"] May 11 20:55:08.407520 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:08.407504 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-cc6bffd59-j27hg"] May 11 20:55:08.482302 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:08.479200 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42ed45ea-4a8e-4f48-8128-6b7beeb7a016" path="/var/lib/kubelet/pods/42ed45ea-4a8e-4f48-8128-6b7beeb7a016/volumes" May 11 20:55:19.627600 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:19.627570 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49"] May 11 20:55:19.628081 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:19.627798 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42ed45ea-4a8e-4f48-8128-6b7beeb7a016" containerName="console" May 11 20:55:19.628081 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:19.627809 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ed45ea-4a8e-4f48-8128-6b7beeb7a016" containerName="console" May 11 20:55:19.628081 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:19.627859 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="42ed45ea-4a8e-4f48-8128-6b7beeb7a016" containerName="console" May 11 20:55:19.632111 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:19.632095 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49" May 11 20:55:19.634827 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:19.634798 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" May 11 20:55:19.634934 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:19.634809 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-hst2x\"" May 11 20:55:19.634934 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:19.634810 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" May 11 20:55:19.638767 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:19.638750 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49"] May 11 20:55:19.683576 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:19.683555 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6f3dfe8-367c-49e6-b664-97ed6e03c6bf-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49\" (UID: \"a6f3dfe8-367c-49e6-b664-97ed6e03c6bf\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49" May 11 20:55:19.683700 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:19.683589 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6f3dfe8-367c-49e6-b664-97ed6e03c6bf-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49\" (UID: \"a6f3dfe8-367c-49e6-b664-97ed6e03c6bf\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49" May 11 20:55:19.683700 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:19.683608 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xstw8\" (UniqueName: \"kubernetes.io/projected/a6f3dfe8-367c-49e6-b664-97ed6e03c6bf-kube-api-access-xstw8\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49\" (UID: \"a6f3dfe8-367c-49e6-b664-97ed6e03c6bf\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49" May 11 20:55:19.784439 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:19.784399 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6f3dfe8-367c-49e6-b664-97ed6e03c6bf-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49\" (UID: \"a6f3dfe8-367c-49e6-b664-97ed6e03c6bf\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49" May 11 20:55:19.784439 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:19.784445 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6f3dfe8-367c-49e6-b664-97ed6e03c6bf-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49\" (UID: \"a6f3dfe8-367c-49e6-b664-97ed6e03c6bf\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49" May 11 20:55:19.784589 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:19.784463 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xstw8\" (UniqueName: \"kubernetes.io/projected/a6f3dfe8-367c-49e6-b664-97ed6e03c6bf-kube-api-access-xstw8\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49\" (UID: \"a6f3dfe8-367c-49e6-b664-97ed6e03c6bf\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49" May 11 20:55:19.784765 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:19.784749 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6f3dfe8-367c-49e6-b664-97ed6e03c6bf-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49\" (UID: \"a6f3dfe8-367c-49e6-b664-97ed6e03c6bf\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49" May 11 20:55:19.784804 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:19.784779 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6f3dfe8-367c-49e6-b664-97ed6e03c6bf-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49\" (UID: \"a6f3dfe8-367c-49e6-b664-97ed6e03c6bf\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49" May 11 20:55:19.792518 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:19.792497 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xstw8\" (UniqueName: \"kubernetes.io/projected/a6f3dfe8-367c-49e6-b664-97ed6e03c6bf-kube-api-access-xstw8\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49\" (UID: \"a6f3dfe8-367c-49e6-b664-97ed6e03c6bf\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49" May 11 20:55:19.942023 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:19.941949 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49" May 11 20:55:20.060073 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:20.060037 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49"] May 11 20:55:20.062639 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:55:20.062615 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6f3dfe8_367c_49e6_b664_97ed6e03c6bf.slice/crio-cde1dd43b95b7c82be7485b476c6ea0f44c49acc7e5639612eed5e134734a257 WatchSource:0}: Error finding container cde1dd43b95b7c82be7485b476c6ea0f44c49acc7e5639612eed5e134734a257: Status 404 returned error can't find the container with id cde1dd43b95b7c82be7485b476c6ea0f44c49acc7e5639612eed5e134734a257 May 11 20:55:20.419019 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:20.418989 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49" event={"ID":"a6f3dfe8-367c-49e6-b664-97ed6e03c6bf","Type":"ContainerStarted","Data":"cde1dd43b95b7c82be7485b476c6ea0f44c49acc7e5639612eed5e134734a257"} May 11 20:55:22.393981 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:22.393939 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/ovn-acl-logging/0.log" May 11 20:55:22.398065 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:22.397991 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/ovn-acl-logging/0.log" May 11 20:55:22.403291 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:22.403238 2567 kubelet.go:1628] "Image garbage collection succeeded" May 11 20:55:25.436661 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:25.436589 2567 generic.go:358] "Generic (PLEG): container finished" podID="a6f3dfe8-367c-49e6-b664-97ed6e03c6bf" containerID="6b996bcdd9a2e5901f8a932a7be08cfb3a7eb897649b28903f5f01487aa3f7ea" exitCode=0 May 11 20:55:25.436661 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:25.436631 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49" event={"ID":"a6f3dfe8-367c-49e6-b664-97ed6e03c6bf","Type":"ContainerDied","Data":"6b996bcdd9a2e5901f8a932a7be08cfb3a7eb897649b28903f5f01487aa3f7ea"} May 11 20:55:25.437609 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:25.437595 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider May 11 20:55:28.446660 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:28.446627 2567 generic.go:358] "Generic (PLEG): container finished" podID="a6f3dfe8-367c-49e6-b664-97ed6e03c6bf" containerID="703d22bc26506224dfad2dc2534fe4fc0cd9944747cda6af8dd55919b0eab815" exitCode=0 May 11 20:55:28.447031 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:28.446684 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49" event={"ID":"a6f3dfe8-367c-49e6-b664-97ed6e03c6bf","Type":"ContainerDied","Data":"703d22bc26506224dfad2dc2534fe4fc0cd9944747cda6af8dd55919b0eab815"} May 11 20:55:34.465579 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:34.465546 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49" event={"ID":"a6f3dfe8-367c-49e6-b664-97ed6e03c6bf","Type":"ContainerStarted","Data":"53a2ae0fea38ec3cc13d078a971129973fe96566def85c87cc15514584221b78"} May 11 20:55:34.482737 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:34.482695 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49" podStartSLOduration=1.193968894 podStartE2EDuration="15.48268114s" podCreationTimestamp="2026-05-11 20:55:19 +0000 UTC" firstStartedPulling="2026-05-11 20:55:20.064438253 +0000 UTC m=+298.174218691" lastFinishedPulling="2026-05-11 20:55:34.353150511 +0000 UTC m=+312.462930937" observedRunningTime="2026-05-11 20:55:34.480799616 +0000 UTC m=+312.590580052" watchObservedRunningTime="2026-05-11 20:55:34.48268114 +0000 UTC m=+312.592461585" May 11 20:55:35.470123 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:35.470090 2567 generic.go:358] "Generic (PLEG): container finished" podID="a6f3dfe8-367c-49e6-b664-97ed6e03c6bf" containerID="53a2ae0fea38ec3cc13d078a971129973fe96566def85c87cc15514584221b78" exitCode=0 May 11 20:55:35.470488 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:35.470172 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49" event={"ID":"a6f3dfe8-367c-49e6-b664-97ed6e03c6bf","Type":"ContainerDied","Data":"53a2ae0fea38ec3cc13d078a971129973fe96566def85c87cc15514584221b78"} May 11 20:55:36.598167 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:36.598146 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49" May 11 20:55:36.714364 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:36.714339 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6f3dfe8-367c-49e6-b664-97ed6e03c6bf-bundle\") pod \"a6f3dfe8-367c-49e6-b664-97ed6e03c6bf\" (UID: \"a6f3dfe8-367c-49e6-b664-97ed6e03c6bf\") " May 11 20:55:36.714504 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:36.714397 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xstw8\" (UniqueName: \"kubernetes.io/projected/a6f3dfe8-367c-49e6-b664-97ed6e03c6bf-kube-api-access-xstw8\") pod \"a6f3dfe8-367c-49e6-b664-97ed6e03c6bf\" (UID: \"a6f3dfe8-367c-49e6-b664-97ed6e03c6bf\") " May 11 20:55:36.714504 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:36.714465 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6f3dfe8-367c-49e6-b664-97ed6e03c6bf-util\") pod \"a6f3dfe8-367c-49e6-b664-97ed6e03c6bf\" (UID: \"a6f3dfe8-367c-49e6-b664-97ed6e03c6bf\") " May 11 20:55:36.715071 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:36.715039 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6f3dfe8-367c-49e6-b664-97ed6e03c6bf-bundle" (OuterVolumeSpecName: "bundle") pod "a6f3dfe8-367c-49e6-b664-97ed6e03c6bf" (UID: "a6f3dfe8-367c-49e6-b664-97ed6e03c6bf"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" May 11 20:55:36.716528 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:36.716508 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6f3dfe8-367c-49e6-b664-97ed6e03c6bf-kube-api-access-xstw8" (OuterVolumeSpecName: "kube-api-access-xstw8") pod "a6f3dfe8-367c-49e6-b664-97ed6e03c6bf" (UID: "a6f3dfe8-367c-49e6-b664-97ed6e03c6bf"). InnerVolumeSpecName "kube-api-access-xstw8". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:55:36.719172 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:36.719149 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6f3dfe8-367c-49e6-b664-97ed6e03c6bf-util" (OuterVolumeSpecName: "util") pod "a6f3dfe8-367c-49e6-b664-97ed6e03c6bf" (UID: "a6f3dfe8-367c-49e6-b664-97ed6e03c6bf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" May 11 20:55:36.815484 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:36.815464 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xstw8\" (UniqueName: \"kubernetes.io/projected/a6f3dfe8-367c-49e6-b664-97ed6e03c6bf-kube-api-access-xstw8\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:55:36.815579 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:36.815486 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6f3dfe8-367c-49e6-b664-97ed6e03c6bf-util\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:55:36.815579 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:36.815496 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6f3dfe8-367c-49e6-b664-97ed6e03c6bf-bundle\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:55:37.476378 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:37.476350 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49" event={"ID":"a6f3dfe8-367c-49e6-b664-97ed6e03c6bf","Type":"ContainerDied","Data":"cde1dd43b95b7c82be7485b476c6ea0f44c49acc7e5639612eed5e134734a257"} May 11 20:55:37.476378 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:37.476378 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cde1dd43b95b7c82be7485b476c6ea0f44c49acc7e5639612eed5e134734a257" May 11 20:55:37.476588 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:37.476398 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dbqx49" May 11 20:55:42.511738 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:42.511702 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wmhzz"] May 11 20:55:42.512278 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:42.511971 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6f3dfe8-367c-49e6-b664-97ed6e03c6bf" containerName="extract" May 11 20:55:42.512278 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:42.511984 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f3dfe8-367c-49e6-b664-97ed6e03c6bf" containerName="extract" May 11 20:55:42.512278 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:42.511999 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6f3dfe8-367c-49e6-b664-97ed6e03c6bf" containerName="pull" May 11 20:55:42.512278 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:42.512009 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f3dfe8-367c-49e6-b664-97ed6e03c6bf" containerName="pull" May 11 20:55:42.512278 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:42.512025 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6f3dfe8-367c-49e6-b664-97ed6e03c6bf" containerName="util" May 11 20:55:42.512278 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:42.512031 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f3dfe8-367c-49e6-b664-97ed6e03c6bf" containerName="util" May 11 20:55:42.512278 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:42.512073 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6f3dfe8-367c-49e6-b664-97ed6e03c6bf" containerName="extract" May 11 20:55:42.561722 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:42.561691 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wmhzz"] May 11 20:55:42.561858 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:42.561795 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wmhzz" May 11 20:55:42.565934 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:42.565913 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" May 11 20:55:42.566059 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:42.566039 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-662cs\"" May 11 20:55:42.566132 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:42.566115 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" May 11 20:55:42.654847 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:42.654822 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kds9g\" (UniqueName: \"kubernetes.io/projected/0e8050ad-62ca-434e-af7b-ee7b0a77e264-kube-api-access-kds9g\") pod \"cert-manager-operator-controller-manager-54b9655956-wmhzz\" (UID: \"0e8050ad-62ca-434e-af7b-ee7b0a77e264\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wmhzz" May 11 20:55:42.654972 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:42.654863 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e8050ad-62ca-434e-af7b-ee7b0a77e264-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-wmhzz\" (UID: \"0e8050ad-62ca-434e-af7b-ee7b0a77e264\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wmhzz" May 11 20:55:42.756121 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:42.756095 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kds9g\" (UniqueName: \"kubernetes.io/projected/0e8050ad-62ca-434e-af7b-ee7b0a77e264-kube-api-access-kds9g\") pod \"cert-manager-operator-controller-manager-54b9655956-wmhzz\" (UID: \"0e8050ad-62ca-434e-af7b-ee7b0a77e264\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wmhzz" May 11 20:55:42.756215 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:42.756131 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e8050ad-62ca-434e-af7b-ee7b0a77e264-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-wmhzz\" (UID: \"0e8050ad-62ca-434e-af7b-ee7b0a77e264\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wmhzz" May 11 20:55:42.756421 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:42.756403 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e8050ad-62ca-434e-af7b-ee7b0a77e264-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-wmhzz\" (UID: \"0e8050ad-62ca-434e-af7b-ee7b0a77e264\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wmhzz" May 11 20:55:42.764251 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:42.764206 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kds9g\" (UniqueName: \"kubernetes.io/projected/0e8050ad-62ca-434e-af7b-ee7b0a77e264-kube-api-access-kds9g\") pod \"cert-manager-operator-controller-manager-54b9655956-wmhzz\" (UID: 
\"0e8050ad-62ca-434e-af7b-ee7b0a77e264\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wmhzz" May 11 20:55:42.871062 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:42.871043 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wmhzz" May 11 20:55:42.995674 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:42.995640 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wmhzz"] May 11 20:55:43.000147 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:55:43.000120 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e8050ad_62ca_434e_af7b_ee7b0a77e264.slice/crio-bb7c30c900d8827e43f3cd031e52577e5801223546d4d89622fc7755893524b4 WatchSource:0}: Error finding container bb7c30c900d8827e43f3cd031e52577e5801223546d4d89622fc7755893524b4: Status 404 returned error can't find the container with id bb7c30c900d8827e43f3cd031e52577e5801223546d4d89622fc7755893524b4 May 11 20:55:43.491481 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:43.491449 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wmhzz" event={"ID":"0e8050ad-62ca-434e-af7b-ee7b0a77e264","Type":"ContainerStarted","Data":"bb7c30c900d8827e43f3cd031e52577e5801223546d4d89622fc7755893524b4"} May 11 20:55:45.499419 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:45.499387 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wmhzz" event={"ID":"0e8050ad-62ca-434e-af7b-ee7b0a77e264","Type":"ContainerStarted","Data":"4c04a419046367eb4c064738e3c88f17500cab6441ab788cebd377b6074d8ccb"} May 11 20:55:45.526672 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:45.526614 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wmhzz" podStartSLOduration=1.8652592970000001 podStartE2EDuration="3.526600508s" podCreationTimestamp="2026-05-11 20:55:42 +0000 UTC" firstStartedPulling="2026-05-11 20:55:43.002543707 +0000 UTC m=+321.112324130" lastFinishedPulling="2026-05-11 20:55:44.663884915 +0000 UTC m=+322.773665341" observedRunningTime="2026-05-11 20:55:45.525153067 +0000 UTC m=+323.634933512" watchObservedRunningTime="2026-05-11 20:55:45.526600508 +0000 UTC m=+323.636380953" May 11 20:55:46.648929 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:46.648878 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2"] May 11 20:55:46.651148 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:46.651133 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2" May 11 20:55:46.655166 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:46.655145 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" May 11 20:55:46.655274 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:46.655212 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-hst2x\"" May 11 20:55:46.656221 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:46.656191 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" May 11 20:55:46.672411 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:46.672386 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2"] May 11 20:55:46.782625 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:46.782596 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2\" (UID: \"c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2" May 11 20:55:46.782749 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:46.782640 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f28t\" (UniqueName: \"kubernetes.io/projected/c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0-kube-api-access-6f28t\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2\" (UID: \"c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2" May 11 20:55:46.782749 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:46.782732 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2\" (UID: \"c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2" May 11 20:55:46.882994 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:46.882949 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6f28t\" (UniqueName: \"kubernetes.io/projected/c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0-kube-api-access-6f28t\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2\" (UID: \"c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2" May 11 20:55:46.883097 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:46.883016 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2\" (UID: \"c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2" May 11 20:55:46.883097 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:46.883049 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2\" (UID: \"c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2" May 11 20:55:46.883327 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:46.883311 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2\" (UID: \"c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2" May 11 20:55:46.883397 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:46.883358 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2\" (UID: \"c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2" May 11 20:55:46.890564 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:46.890546 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f28t\" (UniqueName: \"kubernetes.io/projected/c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0-kube-api-access-6f28t\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2\" (UID: \"c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2" May 11 20:55:46.959616 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:46.959559 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2" May 11 20:55:47.077118 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:47.077065 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2"] May 11 20:55:47.079547 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:55:47.079521 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc47d3b9e_7bf2_4ebc_a2f4_f3d84806d2f0.slice/crio-cf502199a5604c809d7c0d482925f0c5a4dbc7fee53fec902641605c4c953fa6 WatchSource:0}: Error finding container cf502199a5604c809d7c0d482925f0c5a4dbc7fee53fec902641605c4c953fa6: Status 404 returned error can't find the container with id cf502199a5604c809d7c0d482925f0c5a4dbc7fee53fec902641605c4c953fa6 May 11 20:55:47.505878 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:47.505846 2567 generic.go:358] "Generic (PLEG): container finished" podID="c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0" containerID="22ce188977e25c78fcb8bd349f0cbff77ce76aeb4d7d5594bc169c8bbeee17f2" exitCode=0 May 11 20:55:47.506048 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:47.505894 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2" event={"ID":"c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0","Type":"ContainerDied","Data":"22ce188977e25c78fcb8bd349f0cbff77ce76aeb4d7d5594bc169c8bbeee17f2"} May 11 20:55:47.506048 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:47.505919 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2" event={"ID":"c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0","Type":"ContainerStarted","Data":"cf502199a5604c809d7c0d482925f0c5a4dbc7fee53fec902641605c4c953fa6"} May 11 20:55:50.517765 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:50.517728 2567 generic.go:358] "Generic (PLEG): container finished" podID="c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0" containerID="e7c8413a28c2369fda7553cea21f4204197ea3a2741baba2bdf271c1588be574" exitCode=0 May 11 20:55:50.518210 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:50.517825 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2" event={"ID":"c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0","Type":"ContainerDied","Data":"e7c8413a28c2369fda7553cea21f4204197ea3a2741baba2bdf271c1588be574"} May 11 20:55:51.522431 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:51.522390 2567 generic.go:358] "Generic (PLEG): container finished" podID="c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0" containerID="fa3cb4edd8cfcf0d24f3d84fce0647e7c72d52af2b48d232597d4327639e5e20" exitCode=0 May 11 20:55:51.522431 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:51.522434 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2" event={"ID":"c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0","Type":"ContainerDied","Data":"fa3cb4edd8cfcf0d24f3d84fce0647e7c72d52af2b48d232597d4327639e5e20"} May 11 20:55:52.663907 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:52.663885 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2" May 11 20:55:52.846259 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:52.846229 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0-util\") pod \"c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0\" (UID: \"c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0\") " May 11 20:55:52.846422 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:52.846287 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0-bundle\") pod \"c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0\" (UID: \"c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0\") " May 11 20:55:52.846422 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:52.846332 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f28t\" (UniqueName: \"kubernetes.io/projected/c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0-kube-api-access-6f28t\") pod \"c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0\" (UID: \"c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0\") " May 11 20:55:52.846690 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:52.846663 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0-bundle" (OuterVolumeSpecName: "bundle") pod "c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0" (UID: "c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" May 11 20:55:52.848470 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:52.848442 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0-kube-api-access-6f28t" (OuterVolumeSpecName: "kube-api-access-6f28t") pod "c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0" (UID: "c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0"). InnerVolumeSpecName "kube-api-access-6f28t". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:55:52.850657 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:52.850632 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0-util" (OuterVolumeSpecName: "util") pod "c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0" (UID: "c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" May 11 20:55:52.947058 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:52.947026 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0-util\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:55:52.947058 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:52.947054 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0-bundle\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:55:52.947196 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:52.947077 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6f28t\" (UniqueName: \"kubernetes.io/projected/c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0-kube-api-access-6f28t\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:55:53.531074 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:53.531034 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2" event={"ID":"c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0","Type":"ContainerDied","Data":"cf502199a5604c809d7c0d482925f0c5a4dbc7fee53fec902641605c4c953fa6"} May 11 20:55:53.531074 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:53.531073 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf502199a5604c809d7c0d482925f0c5a4dbc7fee53fec902641605c4c953fa6" May 11 20:55:53.531269 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:53.531115 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgm8g2" May 11 20:55:58.479118 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:58.479091 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-2zxxv"] May 11 20:55:58.479466 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:58.479317 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0" containerName="pull" May 11 20:55:58.479466 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:58.479328 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0" containerName="pull" May 11 20:55:58.479466 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:58.479335 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0" containerName="util" May 11 20:55:58.479466 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:58.479340 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0" containerName="util" May 11 20:55:58.479466 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:58.479354 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0" containerName="extract" May 11 20:55:58.479466 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:58.479359 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0" containerName="extract" May 11 20:55:58.479466 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:58.479397 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="c47d3b9e-7bf2-4ebc-a2f4-f3d84806d2f0" containerName="extract" May 11 20:55:58.484104 ip-10-0-128-58 
kubenswrapper[2567]: I0511 20:55:58.484081 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-2zxxv" May 11 20:55:58.486574 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:58.486550 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" May 11 20:55:58.488067 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:58.487771 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" May 11 20:55:58.488067 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:58.487849 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-9xlhg\"" May 11 20:55:58.489979 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:58.489940 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-2zxxv"] May 11 20:55:58.580513 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:58.580486 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b969c4af-aceb-4484-98e5-f79dc19f7208-bound-sa-token\") pod \"cert-manager-79c8d999ff-2zxxv\" (UID: \"b969c4af-aceb-4484-98e5-f79dc19f7208\") " pod="cert-manager/cert-manager-79c8d999ff-2zxxv" May 11 20:55:58.580655 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:58.580522 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpg8c\" (UniqueName: \"kubernetes.io/projected/b969c4af-aceb-4484-98e5-f79dc19f7208-kube-api-access-gpg8c\") pod \"cert-manager-79c8d999ff-2zxxv\" (UID: \"b969c4af-aceb-4484-98e5-f79dc19f7208\") " pod="cert-manager/cert-manager-79c8d999ff-2zxxv" May 11 20:55:58.681570 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:58.681541 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b969c4af-aceb-4484-98e5-f79dc19f7208-bound-sa-token\") pod \"cert-manager-79c8d999ff-2zxxv\" (UID: \"b969c4af-aceb-4484-98e5-f79dc19f7208\") " pod="cert-manager/cert-manager-79c8d999ff-2zxxv" May 11 20:55:58.681706 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:58.681582 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gpg8c\" (UniqueName: \"kubernetes.io/projected/b969c4af-aceb-4484-98e5-f79dc19f7208-kube-api-access-gpg8c\") pod \"cert-manager-79c8d999ff-2zxxv\" (UID: \"b969c4af-aceb-4484-98e5-f79dc19f7208\") " pod="cert-manager/cert-manager-79c8d999ff-2zxxv" May 11 20:55:58.689526 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:58.689497 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b969c4af-aceb-4484-98e5-f79dc19f7208-bound-sa-token\") pod \"cert-manager-79c8d999ff-2zxxv\" (UID: \"b969c4af-aceb-4484-98e5-f79dc19f7208\") " pod="cert-manager/cert-manager-79c8d999ff-2zxxv" May 11 20:55:58.689678 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:58.689648 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpg8c\" (UniqueName: \"kubernetes.io/projected/b969c4af-aceb-4484-98e5-f79dc19f7208-kube-api-access-gpg8c\") pod \"cert-manager-79c8d999ff-2zxxv\" (UID: \"b969c4af-aceb-4484-98e5-f79dc19f7208\") " pod="cert-manager/cert-manager-79c8d999ff-2zxxv" May 11 20:55:58.794590 ip-10-0-128-58 
kubenswrapper[2567]: I0511 20:55:58.794565 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-2zxxv" May 11 20:55:58.915094 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:58.915069 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-2zxxv"] May 11 20:55:58.917150 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:55:58.917123 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb969c4af_aceb_4484_98e5_f79dc19f7208.slice/crio-a8cacbd3cbf6e984678cc0e5be43a8fe8d835fe63e6c01114815139de8f954a0 WatchSource:0}: Error finding container a8cacbd3cbf6e984678cc0e5be43a8fe8d835fe63e6c01114815139de8f954a0: Status 404 returned error can't find the container with id a8cacbd3cbf6e984678cc0e5be43a8fe8d835fe63e6c01114815139de8f954a0 May 11 20:55:59.228373 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:59.228295 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-2mqhs"] May 11 20:55:59.233464 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:59.233448 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2mqhs" May 11 20:55:59.235897 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:59.235876 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" May 11 20:55:59.236021 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:59.235877 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-qxp8j\"" May 11 20:55:59.236021 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:59.235878 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" May 11 20:55:59.238708 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:59.238686 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-2mqhs"] May 11 20:55:59.387193 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:59.387160 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps6pv\" (UniqueName: \"kubernetes.io/projected/b2ea3e22-2203-48e1-adb7-4a8f99a4a05a-kube-api-access-ps6pv\") pod \"openshift-lws-operator-bfc7f696d-2mqhs\" (UID: \"b2ea3e22-2203-48e1-adb7-4a8f99a4a05a\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2mqhs" May 11 20:55:59.387336 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:59.387224 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2ea3e22-2203-48e1-adb7-4a8f99a4a05a-tmp\") pod \"openshift-lws-operator-bfc7f696d-2mqhs\" (UID: \"b2ea3e22-2203-48e1-adb7-4a8f99a4a05a\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2mqhs" May 11 20:55:59.488150 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:59.488067 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ps6pv\" (UniqueName: \"kubernetes.io/projected/b2ea3e22-2203-48e1-adb7-4a8f99a4a05a-kube-api-access-ps6pv\") pod \"openshift-lws-operator-bfc7f696d-2mqhs\" (UID: \"b2ea3e22-2203-48e1-adb7-4a8f99a4a05a\") " 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2mqhs" May 11 20:55:59.488150 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:59.488124 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2ea3e22-2203-48e1-adb7-4a8f99a4a05a-tmp\") pod \"openshift-lws-operator-bfc7f696d-2mqhs\" (UID: \"b2ea3e22-2203-48e1-adb7-4a8f99a4a05a\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2mqhs" May 11 20:55:59.488514 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:59.488426 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2ea3e22-2203-48e1-adb7-4a8f99a4a05a-tmp\") pod \"openshift-lws-operator-bfc7f696d-2mqhs\" (UID: \"b2ea3e22-2203-48e1-adb7-4a8f99a4a05a\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2mqhs" May 11 20:55:59.495580 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:59.495551 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps6pv\" (UniqueName: \"kubernetes.io/projected/b2ea3e22-2203-48e1-adb7-4a8f99a4a05a-kube-api-access-ps6pv\") pod \"openshift-lws-operator-bfc7f696d-2mqhs\" (UID: \"b2ea3e22-2203-48e1-adb7-4a8f99a4a05a\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2mqhs" May 11 20:55:59.543478 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:59.543453 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2mqhs" May 11 20:55:59.551747 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:59.551724 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-2zxxv" event={"ID":"b969c4af-aceb-4484-98e5-f79dc19f7208","Type":"ContainerStarted","Data":"a8cacbd3cbf6e984678cc0e5be43a8fe8d835fe63e6c01114815139de8f954a0"} May 11 20:55:59.660829 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:55:59.660752 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-2mqhs"] May 11 20:55:59.663223 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:55:59.663190 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2ea3e22_2203_48e1_adb7_4a8f99a4a05a.slice/crio-214327e9f71b222aed735395eda6faf55676ee2eda236de2a0875e3d15a46553 WatchSource:0}: Error finding container 214327e9f71b222aed735395eda6faf55676ee2eda236de2a0875e3d15a46553: Status 404 returned error can't find the container with id 214327e9f71b222aed735395eda6faf55676ee2eda236de2a0875e3d15a46553 May 11 20:56:00.556399 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:00.556360 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2mqhs" event={"ID":"b2ea3e22-2203-48e1-adb7-4a8f99a4a05a","Type":"ContainerStarted","Data":"214327e9f71b222aed735395eda6faf55676ee2eda236de2a0875e3d15a46553"} May 11 20:56:02.563549 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:02.563516 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2mqhs" event={"ID":"b2ea3e22-2203-48e1-adb7-4a8f99a4a05a","Type":"ContainerStarted","Data":"7810864ff0e97c7b7fdeafb3f1f2f519e60013055ad3ec3ff6b1bd10e62eb6e4"} May 11 20:56:02.564818 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:02.564795 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-79c8d999ff-2zxxv" event={"ID":"b969c4af-aceb-4484-98e5-f79dc19f7208","Type":"ContainerStarted","Data":"26851395b3247bb70e31c6f9b62f96951f6810a5e5900ef35c92dec0a85cb4c7"} May 11 20:56:02.583421 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:02.583374 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2mqhs" podStartSLOduration=1.104892059 podStartE2EDuration="3.583358106s" podCreationTimestamp="2026-05-11 20:55:59 +0000 UTC" firstStartedPulling="2026-05-11 20:55:59.664629553 +0000 UTC m=+337.774409976" lastFinishedPulling="2026-05-11 20:56:02.143095601 +0000 UTC m=+340.252876023" observedRunningTime="2026-05-11 20:56:02.582165143 +0000 UTC m=+340.691945581" watchObservedRunningTime="2026-05-11 20:56:02.583358106 +0000 UTC m=+340.693138553" May 11 20:56:02.597492 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:02.597443 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-2zxxv" podStartSLOduration=1.3747780729999999 podStartE2EDuration="4.597426746s" podCreationTimestamp="2026-05-11 20:55:58 +0000 UTC" firstStartedPulling="2026-05-11 20:55:58.919214696 +0000 UTC m=+337.028995122" lastFinishedPulling="2026-05-11 20:56:02.14186337 +0000 UTC m=+340.251643795" observedRunningTime="2026-05-11 20:56:02.596230666 +0000 UTC m=+340.706011113" watchObservedRunningTime="2026-05-11 20:56:02.597426746 +0000 UTC m=+340.707207192" May 11 20:56:07.016508 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:07.016475 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j"] May 11 20:56:07.019950 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:07.019935 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j" May 11 20:56:07.023119 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:07.023098 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-hst2x\"" May 11 20:56:07.023241 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:07.023224 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" May 11 20:56:07.024154 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:07.024139 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" May 11 20:56:07.043226 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:07.043203 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znq6z\" (UniqueName: \"kubernetes.io/projected/eb436a7a-eac1-44df-a415-9cba37bde756-kube-api-access-znq6z\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j\" (UID: \"eb436a7a-eac1-44df-a415-9cba37bde756\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j" May 11 20:56:07.043326 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:07.043282 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb436a7a-eac1-44df-a415-9cba37bde756-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j\" (UID: \"eb436a7a-eac1-44df-a415-9cba37bde756\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j" May 11 20:56:07.043379 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:07.043347 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb436a7a-eac1-44df-a415-9cba37bde756-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j\" (UID: \"eb436a7a-eac1-44df-a415-9cba37bde756\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j" May 11 20:56:07.044789 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:07.044757 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j"] May 11 20:56:07.144405 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:07.144376 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-znq6z\" (UniqueName: \"kubernetes.io/projected/eb436a7a-eac1-44df-a415-9cba37bde756-kube-api-access-znq6z\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j\" (UID: \"eb436a7a-eac1-44df-a415-9cba37bde756\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j" May 11 20:56:07.144537 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:07.144417 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb436a7a-eac1-44df-a415-9cba37bde756-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j\" (UID: \"eb436a7a-eac1-44df-a415-9cba37bde756\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j" May 11 20:56:07.144537 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:07.144470 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb436a7a-eac1-44df-a415-9cba37bde756-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j\" (UID: \"eb436a7a-eac1-44df-a415-9cba37bde756\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j" May 11 20:56:07.144860 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:07.144841 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb436a7a-eac1-44df-a415-9cba37bde756-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j\" (UID: \"eb436a7a-eac1-44df-a415-9cba37bde756\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j" May 11 20:56:07.144902 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:07.144866 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb436a7a-eac1-44df-a415-9cba37bde756-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j\" (UID: \"eb436a7a-eac1-44df-a415-9cba37bde756\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j" May 11 20:56:07.152582 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:07.152561 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-znq6z\" (UniqueName: \"kubernetes.io/projected/eb436a7a-eac1-44df-a415-9cba37bde756-kube-api-access-znq6z\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j\" (UID: \"eb436a7a-eac1-44df-a415-9cba37bde756\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j" May 11 20:56:07.328646 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:07.328620 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j" May 11 20:56:07.453498 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:07.453439 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j"] May 11 20:56:07.456524 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:56:07.456494 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb436a7a_eac1_44df_a415_9cba37bde756.slice/crio-7ee214d32d16573bd34062546f21e54402e9263484939b7db825572417871f0a WatchSource:0}: Error finding container 7ee214d32d16573bd34062546f21e54402e9263484939b7db825572417871f0a: Status 404 returned error can't find the container with id 7ee214d32d16573bd34062546f21e54402e9263484939b7db825572417871f0a May 11 20:56:07.581282 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:07.581218 2567 generic.go:358] "Generic (PLEG): container finished" podID="eb436a7a-eac1-44df-a415-9cba37bde756" containerID="e00ab5ed4d6dec86979c7635719a6b42617277b93120dd14f7a5fbe5114a0e2c" exitCode=0 May 11 20:56:07.581384 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:07.581303 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j" event={"ID":"eb436a7a-eac1-44df-a415-9cba37bde756","Type":"ContainerDied","Data":"e00ab5ed4d6dec86979c7635719a6b42617277b93120dd14f7a5fbe5114a0e2c"} May 11 20:56:07.581384 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:07.581339 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j" event={"ID":"eb436a7a-eac1-44df-a415-9cba37bde756","Type":"ContainerStarted","Data":"7ee214d32d16573bd34062546f21e54402e9263484939b7db825572417871f0a"} May 11 20:56:08.586014 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:08.585979 2567 generic.go:358] "Generic (PLEG): container finished" podID="eb436a7a-eac1-44df-a415-9cba37bde756" containerID="32a57d19a5f6be2f44e0ef36ce297cd3d9e56ceac75095368c9e1f6a489cb556" exitCode=0 May 11 20:56:08.586360 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:08.586032 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j" event={"ID":"eb436a7a-eac1-44df-a415-9cba37bde756","Type":"ContainerDied","Data":"32a57d19a5f6be2f44e0ef36ce297cd3d9e56ceac75095368c9e1f6a489cb556"} May 11 20:56:09.590909 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:09.590874 2567 generic.go:358] "Generic (PLEG): container finished" podID="eb436a7a-eac1-44df-a415-9cba37bde756" containerID="e49650e28a61f027910549648d7e9d2738423d39b42eaebb9ec46c889596e044" exitCode=0 May 11 20:56:09.591361 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:09.590925 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j" event={"ID":"eb436a7a-eac1-44df-a415-9cba37bde756","Type":"ContainerDied","Data":"e49650e28a61f027910549648d7e9d2738423d39b42eaebb9ec46c889596e044"} May 11 20:56:10.718815 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:10.718790 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j" May 11 20:56:10.769435 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:10.769412 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb436a7a-eac1-44df-a415-9cba37bde756-bundle\") pod \"eb436a7a-eac1-44df-a415-9cba37bde756\" (UID: \"eb436a7a-eac1-44df-a415-9cba37bde756\") " May 11 20:56:10.769560 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:10.769454 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb436a7a-eac1-44df-a415-9cba37bde756-util\") pod \"eb436a7a-eac1-44df-a415-9cba37bde756\" (UID: \"eb436a7a-eac1-44df-a415-9cba37bde756\") " May 11 20:56:10.769560 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:10.769480 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znq6z\" (UniqueName: \"kubernetes.io/projected/eb436a7a-eac1-44df-a415-9cba37bde756-kube-api-access-znq6z\") pod \"eb436a7a-eac1-44df-a415-9cba37bde756\" (UID: \"eb436a7a-eac1-44df-a415-9cba37bde756\") " May 11 20:56:10.770061 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:10.770028 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb436a7a-eac1-44df-a415-9cba37bde756-bundle" (OuterVolumeSpecName: "bundle") pod "eb436a7a-eac1-44df-a415-9cba37bde756" (UID: "eb436a7a-eac1-44df-a415-9cba37bde756"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" May 11 20:56:10.771452 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:10.771429 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb436a7a-eac1-44df-a415-9cba37bde756-kube-api-access-znq6z" (OuterVolumeSpecName: "kube-api-access-znq6z") pod "eb436a7a-eac1-44df-a415-9cba37bde756" (UID: "eb436a7a-eac1-44df-a415-9cba37bde756"). InnerVolumeSpecName "kube-api-access-znq6z". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:56:10.774804 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:10.774783 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb436a7a-eac1-44df-a415-9cba37bde756-util" (OuterVolumeSpecName: "util") pod "eb436a7a-eac1-44df-a415-9cba37bde756" (UID: "eb436a7a-eac1-44df-a415-9cba37bde756"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" May 11 20:56:10.870052 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:10.869987 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb436a7a-eac1-44df-a415-9cba37bde756-bundle\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:56:10.870052 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:10.870012 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb436a7a-eac1-44df-a415-9cba37bde756-util\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:56:10.870052 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:10.870025 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-znq6z\" (UniqueName: \"kubernetes.io/projected/eb436a7a-eac1-44df-a415-9cba37bde756-kube-api-access-znq6z\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:56:11.601193 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:11.601160 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j" event={"ID":"eb436a7a-eac1-44df-a415-9cba37bde756","Type":"ContainerDied","Data":"7ee214d32d16573bd34062546f21e54402e9263484939b7db825572417871f0a"} May 11 20:56:11.601193 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:11.601191 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ee214d32d16573bd34062546f21e54402e9263484939b7db825572417871f0a" May 11 20:56:11.601425 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:11.601208 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5h5v4j" May 11 20:56:16.413773 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:16.413742 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22"] May 11 20:56:16.414189 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:16.413990 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb436a7a-eac1-44df-a415-9cba37bde756" containerName="util" May 11 20:56:16.414189 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:16.414002 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb436a7a-eac1-44df-a415-9cba37bde756" containerName="util" May 11 20:56:16.414189 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:16.414010 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb436a7a-eac1-44df-a415-9cba37bde756" containerName="pull" May 11 20:56:16.414189 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:16.414016 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb436a7a-eac1-44df-a415-9cba37bde756" containerName="pull" May 11 20:56:16.414189 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:16.414031 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb436a7a-eac1-44df-a415-9cba37bde756" containerName="extract" May 11 20:56:16.414189 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:16.414036 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb436a7a-eac1-44df-a415-9cba37bde756" containerName="extract" May 11 20:56:16.414189 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:16.414084 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="eb436a7a-eac1-44df-a415-9cba37bde756" containerName="extract" 
May 11 20:56:16.418694 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:16.418676 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22" May 11 20:56:16.421132 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:16.421110 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" May 11 20:56:16.421238 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:16.421129 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" May 11 20:56:16.421238 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:16.421221 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-hst2x\"" May 11 20:56:16.425009 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:16.424947 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22"] May 11 20:56:16.505128 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:16.505098 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1bef462-a213-4a98-bb31-6d6082644da0-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22\" (UID: \"e1bef462-a213-4a98-bb31-6d6082644da0\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22" May 11 20:56:16.505128 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:16.505125 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1bef462-a213-4a98-bb31-6d6082644da0-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22\" (UID: \"e1bef462-a213-4a98-bb31-6d6082644da0\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22" May 11 20:56:16.505281 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:16.505218 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbstq\" (UniqueName: \"kubernetes.io/projected/e1bef462-a213-4a98-bb31-6d6082644da0-kube-api-access-kbstq\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22\" (UID: \"e1bef462-a213-4a98-bb31-6d6082644da0\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22" May 11 20:56:16.606466 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:16.606441 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbstq\" (UniqueName: \"kubernetes.io/projected/e1bef462-a213-4a98-bb31-6d6082644da0-kube-api-access-kbstq\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22\" (UID: \"e1bef462-a213-4a98-bb31-6d6082644da0\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22" May 11 20:56:16.606563 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:16.606469 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1bef462-a213-4a98-bb31-6d6082644da0-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22\" (UID: \"e1bef462-a213-4a98-bb31-6d6082644da0\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22" May 11 20:56:16.606563 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:16.606488 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1bef462-a213-4a98-bb31-6d6082644da0-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22\" (UID: \"e1bef462-a213-4a98-bb31-6d6082644da0\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22" May 11 20:56:16.606774 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:16.606758 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1bef462-a213-4a98-bb31-6d6082644da0-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22\" (UID: \"e1bef462-a213-4a98-bb31-6d6082644da0\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22" May 11 20:56:16.606831 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:16.606789 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1bef462-a213-4a98-bb31-6d6082644da0-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22\" (UID: \"e1bef462-a213-4a98-bb31-6d6082644da0\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22" May 11 20:56:16.615339 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:16.615320 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbstq\" (UniqueName: \"kubernetes.io/projected/e1bef462-a213-4a98-bb31-6d6082644da0-kube-api-access-kbstq\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22\" (UID: \"e1bef462-a213-4a98-bb31-6d6082644da0\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22" May 11 20:56:16.728106 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:16.728046 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22" May 11 20:56:16.853683 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:16.853654 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22"] May 11 20:56:16.857109 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:56:16.857079 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1bef462_a213_4a98_bb31_6d6082644da0.slice/crio-d20c72b8ff89c4f3358c42dd692870fe1f35314e574da2ea65d2dbfff639a0df WatchSource:0}: Error finding container d20c72b8ff89c4f3358c42dd692870fe1f35314e574da2ea65d2dbfff639a0df: Status 404 returned error can't find the container with id d20c72b8ff89c4f3358c42dd692870fe1f35314e574da2ea65d2dbfff639a0df May 11 20:56:17.620519 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:17.620491 2567 generic.go:358] "Generic (PLEG): container finished" podID="e1bef462-a213-4a98-bb31-6d6082644da0" containerID="b12c02972940817ed37cc3516ec9d79b7db17d277b12c02d53a8beec46a05f2f" exitCode=0 May 11 20:56:17.620820 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:17.620569 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22" event={"ID":"e1bef462-a213-4a98-bb31-6d6082644da0","Type":"ContainerDied","Data":"b12c02972940817ed37cc3516ec9d79b7db17d277b12c02d53a8beec46a05f2f"} May 11 20:56:17.620820 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:17.620603 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22" event={"ID":"e1bef462-a213-4a98-bb31-6d6082644da0","Type":"ContainerStarted","Data":"d20c72b8ff89c4f3358c42dd692870fe1f35314e574da2ea65d2dbfff639a0df"} May 11 20:56:18.435310 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:18.435240 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-755c95f69f-7j9jk"] May 11 20:56:18.438294 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:18.438277 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-7j9jk" May 11 20:56:18.442127 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:18.442102 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" May 11 20:56:18.442228 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:18.442164 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" May 11 20:56:18.442228 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:18.442175 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" May 11 20:56:18.442393 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:18.442378 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-9zd6v\"" May 11 20:56:18.442447 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:18.442430 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" May 11 20:56:18.458326 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:18.458308 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-755c95f69f-7j9jk"] May 11 20:56:18.519302 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:18.519276 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/39680580-0555-4d40-978a-ae556647366e-apiservice-cert\") pod \"opendatahub-operator-controller-manager-755c95f69f-7j9jk\" (UID: \"39680580-0555-4d40-978a-ae556647366e\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-7j9jk" May 11 20:56:18.519424 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:18.519332 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/39680580-0555-4d40-978a-ae556647366e-webhook-cert\") pod \"opendatahub-operator-controller-manager-755c95f69f-7j9jk\" (UID: \"39680580-0555-4d40-978a-ae556647366e\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-7j9jk" May 11 20:56:18.519424 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:18.519357 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tl6c\" (UniqueName: \"kubernetes.io/projected/39680580-0555-4d40-978a-ae556647366e-kube-api-access-4tl6c\") pod \"opendatahub-operator-controller-manager-755c95f69f-7j9jk\" (UID: \"39680580-0555-4d40-978a-ae556647366e\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-7j9jk" May 11 20:56:18.620112 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:18.620080 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/39680580-0555-4d40-978a-ae556647366e-apiservice-cert\") pod \"opendatahub-operator-controller-manager-755c95f69f-7j9jk\" (UID: \"39680580-0555-4d40-978a-ae556647366e\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-7j9jk" May 11 20:56:18.620251 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:18.620134 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/39680580-0555-4d40-978a-ae556647366e-webhook-cert\") pod \"opendatahub-operator-controller-manager-755c95f69f-7j9jk\" (UID: \"39680580-0555-4d40-978a-ae556647366e\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-7j9jk" May 11 20:56:18.620251 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:18.620161 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4tl6c\" (UniqueName: \"kubernetes.io/projected/39680580-0555-4d40-978a-ae556647366e-kube-api-access-4tl6c\") pod \"opendatahub-operator-controller-manager-755c95f69f-7j9jk\" (UID: \"39680580-0555-4d40-978a-ae556647366e\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-7j9jk" May 11 20:56:18.624476 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:18.624454 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/39680580-0555-4d40-978a-ae556647366e-webhook-cert\") pod \"opendatahub-operator-controller-manager-755c95f69f-7j9jk\" (UID: \"39680580-0555-4d40-978a-ae556647366e\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-7j9jk" May 11 20:56:18.624837 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:18.624659 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/39680580-0555-4d40-978a-ae556647366e-apiservice-cert\") pod \"opendatahub-operator-controller-manager-755c95f69f-7j9jk\" (UID: \"39680580-0555-4d40-978a-ae556647366e\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-7j9jk" May 11 20:56:18.627809 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:18.627786 2567 generic.go:358] "Generic (PLEG): container finished" podID="e1bef462-a213-4a98-bb31-6d6082644da0" containerID="f7e04c9b776d9157b6c8c0ed53a778fb52ead4156692edd84d40d541300a8bf5" exitCode=0 May 11 20:56:18.627917 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:18.627846 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22" event={"ID":"e1bef462-a213-4a98-bb31-6d6082644da0","Type":"ContainerDied","Data":"f7e04c9b776d9157b6c8c0ed53a778fb52ead4156692edd84d40d541300a8bf5"} May 11 20:56:18.638924 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:18.638902 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tl6c\" (UniqueName: \"kubernetes.io/projected/39680580-0555-4d40-978a-ae556647366e-kube-api-access-4tl6c\") pod \"opendatahub-operator-controller-manager-755c95f69f-7j9jk\" (UID: \"39680580-0555-4d40-978a-ae556647366e\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-7j9jk" May 11 20:56:18.747922 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:18.747730 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-7j9jk" May 11 20:56:18.880163 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:18.880137 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-755c95f69f-7j9jk"] May 11 20:56:18.882707 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:56:18.882675 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39680580_0555_4d40_978a_ae556647366e.slice/crio-5c8e4b331b3c9758915b6975b089c9df3dd7f411a5a928ffac2b2289458280e4 WatchSource:0}: Error finding container 5c8e4b331b3c9758915b6975b089c9df3dd7f411a5a928ffac2b2289458280e4: Status 404 returned error can't find the container with id 5c8e4b331b3c9758915b6975b089c9df3dd7f411a5a928ffac2b2289458280e4 May 11 20:56:19.633449 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:19.633404 2567 generic.go:358] "Generic (PLEG): container finished" podID="e1bef462-a213-4a98-bb31-6d6082644da0" containerID="10c2625ac7ba20ef0aa2e20e60d6dc8683bd4153d25b2340c8c63694e72d9613" exitCode=0 May 11 20:56:19.633856 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:19.633503 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22" event={"ID":"e1bef462-a213-4a98-bb31-6d6082644da0","Type":"ContainerDied","Data":"10c2625ac7ba20ef0aa2e20e60d6dc8683bd4153d25b2340c8c63694e72d9613"} May 11 20:56:19.635107 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:19.635083 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-7j9jk" event={"ID":"39680580-0555-4d40-978a-ae556647366e","Type":"ContainerStarted","Data":"5c8e4b331b3c9758915b6975b089c9df3dd7f411a5a928ffac2b2289458280e4"} May 11 20:56:21.321067 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:21.321044 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22" May 11 20:56:21.341893 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:21.341867 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbstq\" (UniqueName: \"kubernetes.io/projected/e1bef462-a213-4a98-bb31-6d6082644da0-kube-api-access-kbstq\") pod \"e1bef462-a213-4a98-bb31-6d6082644da0\" (UID: \"e1bef462-a213-4a98-bb31-6d6082644da0\") " May 11 20:56:21.342016 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:21.341917 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1bef462-a213-4a98-bb31-6d6082644da0-util\") pod \"e1bef462-a213-4a98-bb31-6d6082644da0\" (UID: \"e1bef462-a213-4a98-bb31-6d6082644da0\") " May 11 20:56:21.342016 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:21.341952 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1bef462-a213-4a98-bb31-6d6082644da0-bundle\") pod \"e1bef462-a213-4a98-bb31-6d6082644da0\" (UID: \"e1bef462-a213-4a98-bb31-6d6082644da0\") " May 11 20:56:21.343169 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:21.343112 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1bef462-a213-4a98-bb31-6d6082644da0-bundle" (OuterVolumeSpecName: "bundle") pod "e1bef462-a213-4a98-bb31-6d6082644da0" (UID: "e1bef462-a213-4a98-bb31-6d6082644da0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" May 11 20:56:21.344443 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:21.344413 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1bef462-a213-4a98-bb31-6d6082644da0-kube-api-access-kbstq" (OuterVolumeSpecName: "kube-api-access-kbstq") pod "e1bef462-a213-4a98-bb31-6d6082644da0" (UID: "e1bef462-a213-4a98-bb31-6d6082644da0"). InnerVolumeSpecName "kube-api-access-kbstq". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:56:21.351844 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:21.351822 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1bef462-a213-4a98-bb31-6d6082644da0-util" (OuterVolumeSpecName: "util") pod "e1bef462-a213-4a98-bb31-6d6082644da0" (UID: "e1bef462-a213-4a98-bb31-6d6082644da0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" May 11 20:56:21.443168 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:21.443142 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kbstq\" (UniqueName: \"kubernetes.io/projected/e1bef462-a213-4a98-bb31-6d6082644da0-kube-api-access-kbstq\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:56:21.443168 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:21.443164 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e1bef462-a213-4a98-bb31-6d6082644da0-util\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:56:21.443282 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:21.443175 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e1bef462-a213-4a98-bb31-6d6082644da0-bundle\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:56:21.643338 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:21.643263 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22" event={"ID":"e1bef462-a213-4a98-bb31-6d6082644da0","Type":"ContainerDied","Data":"d20c72b8ff89c4f3358c42dd692870fe1f35314e574da2ea65d2dbfff639a0df"} May 11 20:56:21.643338 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:21.643298 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d20c72b8ff89c4f3358c42dd692870fe1f35314e574da2ea65d2dbfff639a0df" May 11 20:56:21.643338 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:21.643276 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cvz22" May 11 20:56:21.644813 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:21.644783 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-7j9jk" event={"ID":"39680580-0555-4d40-978a-ae556647366e","Type":"ContainerStarted","Data":"4f70d1dfd39ab3cc33223dcbeb83bf014e8bf32c83430565077513a0e1ac8551"} May 11 20:56:21.644949 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:21.644892 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-7j9jk" May 11 20:56:21.665930 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:21.665878 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-7j9jk" podStartSLOduration=1.189063565 podStartE2EDuration="3.665866712s" podCreationTimestamp="2026-05-11 20:56:18 +0000 UTC" firstStartedPulling="2026-05-11 20:56:18.884330555 +0000 UTC m=+356.994110978" lastFinishedPulling="2026-05-11 20:56:21.361133699 +0000 UTC m=+359.470914125" observedRunningTime="2026-05-11 20:56:21.664001262 +0000 UTC m=+359.773781706" watchObservedRunningTime="2026-05-11 20:56:21.665866712 +0000 UTC m=+359.775647156" May 11 20:56:32.651827 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:32.651799 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-7j9jk" May 11 20:56:35.052551 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.052515 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj"] 
May 11 20:56:35.052892 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.052764 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1bef462-a213-4a98-bb31-6d6082644da0" containerName="util"
May 11 20:56:35.052892 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.052775 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1bef462-a213-4a98-bb31-6d6082644da0" containerName="util"
May 11 20:56:35.052892 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.052782 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1bef462-a213-4a98-bb31-6d6082644da0" containerName="pull"
May 11 20:56:35.052892 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.052788 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1bef462-a213-4a98-bb31-6d6082644da0" containerName="pull"
May 11 20:56:35.052892 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.052794 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1bef462-a213-4a98-bb31-6d6082644da0" containerName="extract"
May 11 20:56:35.052892 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.052801 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1bef462-a213-4a98-bb31-6d6082644da0" containerName="extract"
May 11 20:56:35.052892 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.052853 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="e1bef462-a213-4a98-bb31-6d6082644da0" containerName="extract"
May 11 20:56:35.055507 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.055490 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj"
May 11 20:56:35.058584 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.058563 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
May 11 20:56:35.058682 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.058596 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
May 11 20:56:35.059611 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.059596 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-hst2x\""
May 11 20:56:35.070156 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.070136 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj"]
May 11 20:56:35.134365 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.134336 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6qhc\" (UniqueName: \"kubernetes.io/projected/7933cd4b-698b-437e-8d8b-0beb056a5068-kube-api-access-r6qhc\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj\" (UID: \"7933cd4b-698b-437e-8d8b-0beb056a5068\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj"
May 11 20:56:35.134486 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.134377 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7933cd4b-698b-437e-8d8b-0beb056a5068-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj\" (UID: \"7933cd4b-698b-437e-8d8b-0beb056a5068\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj"
May 11 20:56:35.134486 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.134394 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7933cd4b-698b-437e-8d8b-0beb056a5068-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj\" (UID: \"7933cd4b-698b-437e-8d8b-0beb056a5068\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj"
May 11 20:56:35.235654 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.235628 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r6qhc\" (UniqueName: \"kubernetes.io/projected/7933cd4b-698b-437e-8d8b-0beb056a5068-kube-api-access-r6qhc\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj\" (UID: \"7933cd4b-698b-437e-8d8b-0beb056a5068\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj"
May 11 20:56:35.235746 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.235676 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7933cd4b-698b-437e-8d8b-0beb056a5068-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj\" (UID: \"7933cd4b-698b-437e-8d8b-0beb056a5068\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj"
May 11 20:56:35.235746 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.235718 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7933cd4b-698b-437e-8d8b-0beb056a5068-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj\" (UID: \"7933cd4b-698b-437e-8d8b-0beb056a5068\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj"
May 11 20:56:35.236071 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.236053 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7933cd4b-698b-437e-8d8b-0beb056a5068-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj\" (UID: \"7933cd4b-698b-437e-8d8b-0beb056a5068\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj"
May 11 20:56:35.236115 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.236092 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7933cd4b-698b-437e-8d8b-0beb056a5068-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj\" (UID: \"7933cd4b-698b-437e-8d8b-0beb056a5068\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj"
May 11 20:56:35.245507 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.245484 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6qhc\" (UniqueName: \"kubernetes.io/projected/7933cd4b-698b-437e-8d8b-0beb056a5068-kube-api-access-r6qhc\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj\" (UID: \"7933cd4b-698b-437e-8d8b-0beb056a5068\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj"
May 11 20:56:35.364565 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.364502 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj"
May 11 20:56:35.456021 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.455992 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-c8c9857f9-bjljt"]
May 11 20:56:35.460173 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.460150 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-c8c9857f9-bjljt"
May 11 20:56:35.463513 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.463454 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
May 11 20:56:35.463513 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.463493 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
May 11 20:56:35.463692 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.463574 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-mwf9t\""
May 11 20:56:35.463692 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.463614 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
May 11 20:56:35.464519 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.464503 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
May 11 20:56:35.469112 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.469091 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-c8c9857f9-bjljt"]
May 11 20:56:35.496654 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.496641 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj"]
May 11 20:56:35.499238 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:56:35.499212 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7933cd4b_698b_437e_8d8b_0beb056a5068.slice/crio-412a1bd02ec3ec0ff3e7cca0cd7e07b0bbc6da495d95b29773521f0664a34d8a WatchSource:0}: Error finding container 412a1bd02ec3ec0ff3e7cca0cd7e07b0bbc6da495d95b29773521f0664a34d8a: Status 404 returned error can't find the container with id 412a1bd02ec3ec0ff3e7cca0cd7e07b0bbc6da495d95b29773521f0664a34d8a
May 11 20:56:35.538430 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.538402 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2b64cae5-bcbe-4b04-952c-28a536d1e35b-tmp\") pod \"kube-auth-proxy-c8c9857f9-bjljt\" (UID: \"2b64cae5-bcbe-4b04-952c-28a536d1e35b\") " pod="openshift-ingress/kube-auth-proxy-c8c9857f9-bjljt"
May 11 20:56:35.538524 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.538505 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78s8x\" (UniqueName: \"kubernetes.io/projected/2b64cae5-bcbe-4b04-952c-28a536d1e35b-kube-api-access-78s8x\") pod \"kube-auth-proxy-c8c9857f9-bjljt\" (UID: \"2b64cae5-bcbe-4b04-952c-28a536d1e35b\") " pod="openshift-ingress/kube-auth-proxy-c8c9857f9-bjljt"
May 11 20:56:35.538575 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.538534 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2b64cae5-bcbe-4b04-952c-28a536d1e35b-tls-certs\") pod \"kube-auth-proxy-c8c9857f9-bjljt\" (UID: \"2b64cae5-bcbe-4b04-952c-28a536d1e35b\") " pod="openshift-ingress/kube-auth-proxy-c8c9857f9-bjljt"
May 11 20:56:35.639672 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.639608 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2b64cae5-bcbe-4b04-952c-28a536d1e35b-tmp\") pod \"kube-auth-proxy-c8c9857f9-bjljt\" (UID: \"2b64cae5-bcbe-4b04-952c-28a536d1e35b\") " pod="openshift-ingress/kube-auth-proxy-c8c9857f9-bjljt"
May 11 20:56:35.639787 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.639677 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78s8x\" (UniqueName: \"kubernetes.io/projected/2b64cae5-bcbe-4b04-952c-28a536d1e35b-kube-api-access-78s8x\") pod \"kube-auth-proxy-c8c9857f9-bjljt\" (UID: \"2b64cae5-bcbe-4b04-952c-28a536d1e35b\") " pod="openshift-ingress/kube-auth-proxy-c8c9857f9-bjljt"
May 11 20:56:35.639787 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.639696 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2b64cae5-bcbe-4b04-952c-28a536d1e35b-tls-certs\") pod \"kube-auth-proxy-c8c9857f9-bjljt\" (UID: \"2b64cae5-bcbe-4b04-952c-28a536d1e35b\") " pod="openshift-ingress/kube-auth-proxy-c8c9857f9-bjljt"
May 11 20:56:35.641680 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.641660 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2b64cae5-bcbe-4b04-952c-28a536d1e35b-tmp\") pod \"kube-auth-proxy-c8c9857f9-bjljt\" (UID: \"2b64cae5-bcbe-4b04-952c-28a536d1e35b\") " pod="openshift-ingress/kube-auth-proxy-c8c9857f9-bjljt"
May 11 20:56:35.641830 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.641815 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2b64cae5-bcbe-4b04-952c-28a536d1e35b-tls-certs\") pod \"kube-auth-proxy-c8c9857f9-bjljt\" (UID: \"2b64cae5-bcbe-4b04-952c-28a536d1e35b\") " pod="openshift-ingress/kube-auth-proxy-c8c9857f9-bjljt"
May 11 20:56:35.647683 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.647665 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78s8x\" (UniqueName: \"kubernetes.io/projected/2b64cae5-bcbe-4b04-952c-28a536d1e35b-kube-api-access-78s8x\") pod \"kube-auth-proxy-c8c9857f9-bjljt\" (UID: \"2b64cae5-bcbe-4b04-952c-28a536d1e35b\") " pod="openshift-ingress/kube-auth-proxy-c8c9857f9-bjljt"
May 11 20:56:35.691021 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.690995 2567 generic.go:358] "Generic (PLEG): container finished" podID="7933cd4b-698b-437e-8d8b-0beb056a5068" containerID="df3ff1a9a1626b0911bcb0df43f581084cfd917b6a6ac2b229a0df8b865a4f5f" exitCode=0
May 11 20:56:35.691122 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.691078 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj" event={"ID":"7933cd4b-698b-437e-8d8b-0beb056a5068","Type":"ContainerDied","Data":"df3ff1a9a1626b0911bcb0df43f581084cfd917b6a6ac2b229a0df8b865a4f5f"}
May 11 20:56:35.691122 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.691110 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj" event={"ID":"7933cd4b-698b-437e-8d8b-0beb056a5068","Type":"ContainerStarted","Data":"412a1bd02ec3ec0ff3e7cca0cd7e07b0bbc6da495d95b29773521f0664a34d8a"}
May 11 20:56:35.772684 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.772658 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-c8c9857f9-bjljt"
May 11 20:56:35.893140 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:35.893120 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-c8c9857f9-bjljt"]
May 11 20:56:35.894802 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:56:35.894776 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b64cae5_bcbe_4b04_952c_28a536d1e35b.slice/crio-61b4ef8ffbc393822196742d6376645093aeab77e2ce6bbcb5d64099abb1b4a1 WatchSource:0}: Error finding container 61b4ef8ffbc393822196742d6376645093aeab77e2ce6bbcb5d64099abb1b4a1: Status 404 returned error can't find the container with id 61b4ef8ffbc393822196742d6376645093aeab77e2ce6bbcb5d64099abb1b4a1
May 11 20:56:36.697530 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:36.697491 2567 generic.go:358] "Generic (PLEG): container finished" podID="7933cd4b-698b-437e-8d8b-0beb056a5068" containerID="c16a25ccde3de7319bddcbf4ca66ccd93932f5bcc84e8bdc8d3e35cbf259c4d3" exitCode=0
May 11 20:56:36.697946 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:36.697572 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj" event={"ID":"7933cd4b-698b-437e-8d8b-0beb056a5068","Type":"ContainerDied","Data":"c16a25ccde3de7319bddcbf4ca66ccd93932f5bcc84e8bdc8d3e35cbf259c4d3"}
May 11 20:56:36.699884 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:36.699837 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-c8c9857f9-bjljt" event={"ID":"2b64cae5-bcbe-4b04-952c-28a536d1e35b","Type":"ContainerStarted","Data":"61b4ef8ffbc393822196742d6376645093aeab77e2ce6bbcb5d64099abb1b4a1"}
May 11 20:56:37.706374 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:37.706335 2567 generic.go:358] "Generic (PLEG): container finished" podID="7933cd4b-698b-437e-8d8b-0beb056a5068" containerID="c3fb72d907c4f43e63ffaacdea19aefa46c869a0c5b2787344dd7224eb66fbd4" exitCode=0
May 11 20:56:37.706841 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:37.706398 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj" event={"ID":"7933cd4b-698b-437e-8d8b-0beb056a5068","Type":"ContainerDied","Data":"c3fb72d907c4f43e63ffaacdea19aefa46c869a0c5b2787344dd7224eb66fbd4"}
May 11 20:56:38.410901 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:38.410858 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-7z6xc"]
May 11 20:56:38.414330 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:38.414307 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-7z6xc"
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-7z6xc" May 11 20:56:38.416694 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:38.416668 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" May 11 20:56:38.416798 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:38.416676 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-5kvj7\"" May 11 20:56:38.421764 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:38.421742 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-7z6xc"] May 11 20:56:38.464275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:38.464242 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e652aca3-bd36-4907-9e16-6be17cde2c16-cert\") pod \"odh-model-controller-858dbf95b8-7z6xc\" (UID: \"e652aca3-bd36-4907-9e16-6be17cde2c16\") " pod="opendatahub/odh-model-controller-858dbf95b8-7z6xc" May 11 20:56:38.464389 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:38.464289 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssjlb\" (UniqueName: \"kubernetes.io/projected/e652aca3-bd36-4907-9e16-6be17cde2c16-kube-api-access-ssjlb\") pod \"odh-model-controller-858dbf95b8-7z6xc\" (UID: \"e652aca3-bd36-4907-9e16-6be17cde2c16\") " pod="opendatahub/odh-model-controller-858dbf95b8-7z6xc" May 11 20:56:38.564896 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:38.564861 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e652aca3-bd36-4907-9e16-6be17cde2c16-cert\") pod \"odh-model-controller-858dbf95b8-7z6xc\" (UID: \"e652aca3-bd36-4907-9e16-6be17cde2c16\") " pod="opendatahub/odh-model-controller-858dbf95b8-7z6xc" May 11 20:56:38.565085 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:38.564914 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ssjlb\" (UniqueName: \"kubernetes.io/projected/e652aca3-bd36-4907-9e16-6be17cde2c16-kube-api-access-ssjlb\") pod \"odh-model-controller-858dbf95b8-7z6xc\" (UID: \"e652aca3-bd36-4907-9e16-6be17cde2c16\") " pod="opendatahub/odh-model-controller-858dbf95b8-7z6xc" May 11 20:56:38.565085 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:56:38.565044 2567 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found May 11 20:56:38.565259 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:56:38.565131 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e652aca3-bd36-4907-9e16-6be17cde2c16-cert podName:e652aca3-bd36-4907-9e16-6be17cde2c16 nodeName:}" failed. No retries permitted until 2026-05-11 20:56:39.065109035 +0000 UTC m=+377.174889461 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e652aca3-bd36-4907-9e16-6be17cde2c16-cert") pod "odh-model-controller-858dbf95b8-7z6xc" (UID: "e652aca3-bd36-4907-9e16-6be17cde2c16") : secret "odh-model-controller-webhook-cert" not found May 11 20:56:38.575446 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:38.575406 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssjlb\" (UniqueName: \"kubernetes.io/projected/e652aca3-bd36-4907-9e16-6be17cde2c16-kube-api-access-ssjlb\") pod \"odh-model-controller-858dbf95b8-7z6xc\" (UID: \"e652aca3-bd36-4907-9e16-6be17cde2c16\") " pod="opendatahub/odh-model-controller-858dbf95b8-7z6xc" May 11 20:56:38.926451 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:38.926429 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj" May 11 20:56:38.967920 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:38.967876 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7933cd4b-698b-437e-8d8b-0beb056a5068-bundle\") pod \"7933cd4b-698b-437e-8d8b-0beb056a5068\" (UID: \"7933cd4b-698b-437e-8d8b-0beb056a5068\") " May 11 20:56:38.968013 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:38.967947 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7933cd4b-698b-437e-8d8b-0beb056a5068-util\") pod \"7933cd4b-698b-437e-8d8b-0beb056a5068\" (UID: \"7933cd4b-698b-437e-8d8b-0beb056a5068\") " May 11 20:56:38.968013 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:38.968002 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6qhc\" (UniqueName: \"kubernetes.io/projected/7933cd4b-698b-437e-8d8b-0beb056a5068-kube-api-access-r6qhc\") pod \"7933cd4b-698b-437e-8d8b-0beb056a5068\" (UID: \"7933cd4b-698b-437e-8d8b-0beb056a5068\") " May 11 20:56:38.969540 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:38.969513 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7933cd4b-698b-437e-8d8b-0beb056a5068-bundle" (OuterVolumeSpecName: "bundle") pod "7933cd4b-698b-437e-8d8b-0beb056a5068" (UID: "7933cd4b-698b-437e-8d8b-0beb056a5068"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" May 11 20:56:38.970654 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:38.970628 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7933cd4b-698b-437e-8d8b-0beb056a5068-kube-api-access-r6qhc" (OuterVolumeSpecName: "kube-api-access-r6qhc") pod "7933cd4b-698b-437e-8d8b-0beb056a5068" (UID: "7933cd4b-698b-437e-8d8b-0beb056a5068"). InnerVolumeSpecName "kube-api-access-r6qhc". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:56:38.977846 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:38.977820 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7933cd4b-698b-437e-8d8b-0beb056a5068-util" (OuterVolumeSpecName: "util") pod "7933cd4b-698b-437e-8d8b-0beb056a5068" (UID: "7933cd4b-698b-437e-8d8b-0beb056a5068"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" May 11 20:56:39.069237 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:39.069210 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e652aca3-bd36-4907-9e16-6be17cde2c16-cert\") pod \"odh-model-controller-858dbf95b8-7z6xc\" (UID: \"e652aca3-bd36-4907-9e16-6be17cde2c16\") " pod="opendatahub/odh-model-controller-858dbf95b8-7z6xc" May 11 20:56:39.069350 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:39.069252 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7933cd4b-698b-437e-8d8b-0beb056a5068-util\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:56:39.069350 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:39.069262 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r6qhc\" (UniqueName: \"kubernetes.io/projected/7933cd4b-698b-437e-8d8b-0beb056a5068-kube-api-access-r6qhc\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:56:39.069350 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:39.069271 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7933cd4b-698b-437e-8d8b-0beb056a5068-bundle\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:56:39.071367 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:39.071347 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e652aca3-bd36-4907-9e16-6be17cde2c16-cert\") pod \"odh-model-controller-858dbf95b8-7z6xc\" (UID: \"e652aca3-bd36-4907-9e16-6be17cde2c16\") " pod="opendatahub/odh-model-controller-858dbf95b8-7z6xc" May 11 20:56:39.326717 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:39.326690 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-7z6xc" May 11 20:56:39.460844 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:39.460819 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-7z6xc"] May 11 20:56:39.462377 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:56:39.462353 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode652aca3_bd36_4907_9e16_6be17cde2c16.slice/crio-89e3d617e1662d3348f95c8024cba94b6dd549e36c739f2f33d3cea780eb3690 WatchSource:0}: Error finding container 89e3d617e1662d3348f95c8024cba94b6dd549e36c739f2f33d3cea780eb3690: Status 404 returned error can't find the container with id 89e3d617e1662d3348f95c8024cba94b6dd549e36c739f2f33d3cea780eb3690 May 11 20:56:39.714259 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:39.714185 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj" event={"ID":"7933cd4b-698b-437e-8d8b-0beb056a5068","Type":"ContainerDied","Data":"412a1bd02ec3ec0ff3e7cca0cd7e07b0bbc6da495d95b29773521f0664a34d8a"} May 11 20:56:39.714259 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:39.714219 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="412a1bd02ec3ec0ff3e7cca0cd7e07b0bbc6da495d95b29773521f0664a34d8a" May 11 20:56:39.714259 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:39.714224 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zsj" May 11 20:56:39.715297 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:39.715265 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-7z6xc" event={"ID":"e652aca3-bd36-4907-9e16-6be17cde2c16","Type":"ContainerStarted","Data":"89e3d617e1662d3348f95c8024cba94b6dd549e36c739f2f33d3cea780eb3690"} May 11 20:56:39.716552 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:39.716527 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-c8c9857f9-bjljt" event={"ID":"2b64cae5-bcbe-4b04-952c-28a536d1e35b","Type":"ContainerStarted","Data":"6d1cdff5e47f8e54953300fc9245042e742fb865b77aae52d490963562fa1127"} May 11 20:56:39.733080 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:39.733045 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-c8c9857f9-bjljt" podStartSLOduration=1.6605629880000001 podStartE2EDuration="4.733032906s" podCreationTimestamp="2026-05-11 20:56:35 +0000 UTC" firstStartedPulling="2026-05-11 20:56:35.89661662 +0000 UTC m=+374.006397044" lastFinishedPulling="2026-05-11 20:56:38.969086527 +0000 UTC m=+377.078866962" observedRunningTime="2026-05-11 20:56:39.731321747 +0000 UTC m=+377.841102192" watchObservedRunningTime="2026-05-11 20:56:39.733032906 +0000 UTC m=+377.842813350" May 11 20:56:42.727949 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:42.727913 2567 generic.go:358] "Generic (PLEG): container finished" podID="e652aca3-bd36-4907-9e16-6be17cde2c16" containerID="f360c905edf121933ba381c5dc7ed19293b81808022801816bf7054f7b8e1c19" exitCode=1 May 11 20:56:42.728355 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:42.728001 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-7z6xc" event={"ID":"e652aca3-bd36-4907-9e16-6be17cde2c16","Type":"ContainerDied","Data":"f360c905edf121933ba381c5dc7ed19293b81808022801816bf7054f7b8e1c19"} May 11 20:56:42.728355 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:42.728179 2567 scope.go:117] "RemoveContainer" containerID="f360c905edf121933ba381c5dc7ed19293b81808022801816bf7054f7b8e1c19" May 11 20:56:43.732771 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:43.732734 2567 generic.go:358] "Generic (PLEG): container finished" podID="e652aca3-bd36-4907-9e16-6be17cde2c16" containerID="1a960fbc77d0be13f1423b2318135206122994daf877fabf42528a5a7d9c6f50" exitCode=1 May 11 20:56:43.733221 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:43.732792 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-7z6xc" event={"ID":"e652aca3-bd36-4907-9e16-6be17cde2c16","Type":"ContainerDied","Data":"1a960fbc77d0be13f1423b2318135206122994daf877fabf42528a5a7d9c6f50"} May 11 20:56:43.733221 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:43.732831 2567 scope.go:117] "RemoveContainer" containerID="f360c905edf121933ba381c5dc7ed19293b81808022801816bf7054f7b8e1c19" May 11 20:56:43.733221 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:43.733054 2567 scope.go:117] "RemoveContainer" containerID="1a960fbc77d0be13f1423b2318135206122994daf877fabf42528a5a7d9c6f50" May 11 20:56:43.733351 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:56:43.733235 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager 
pod=odh-model-controller-858dbf95b8-7z6xc_opendatahub(e652aca3-bd36-4907-9e16-6be17cde2c16)\"" pod="opendatahub/odh-model-controller-858dbf95b8-7z6xc" podUID="e652aca3-bd36-4907-9e16-6be17cde2c16" May 11 20:56:44.357761 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.357728 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm"] May 11 20:56:44.358032 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.358019 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7933cd4b-698b-437e-8d8b-0beb056a5068" containerName="util" May 11 20:56:44.358088 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.358034 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="7933cd4b-698b-437e-8d8b-0beb056a5068" containerName="util" May 11 20:56:44.358088 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.358048 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7933cd4b-698b-437e-8d8b-0beb056a5068" containerName="pull" May 11 20:56:44.358088 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.358053 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="7933cd4b-698b-437e-8d8b-0beb056a5068" containerName="pull" May 11 20:56:44.358088 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.358067 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7933cd4b-698b-437e-8d8b-0beb056a5068" containerName="extract" May 11 20:56:44.358088 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.358075 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="7933cd4b-698b-437e-8d8b-0beb056a5068" containerName="extract" May 11 20:56:44.358251 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.358138 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="7933cd4b-698b-437e-8d8b-0beb056a5068" containerName="extract" May 11 20:56:44.362395 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.362374 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm" May 11 20:56:44.374211 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.374193 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" May 11 20:56:44.374314 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.374191 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" May 11 20:56:44.375202 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.375188 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-hst2x\"" May 11 20:56:44.382374 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.382353 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm"] May 11 20:56:44.414083 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.414054 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b18c0d03-fae3-48a2-a57d-b9dfc2f50983-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm\" (UID: \"b18c0d03-fae3-48a2-a57d-b9dfc2f50983\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm" May 11 20:56:44.414211 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.414149 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5kbh\" (UniqueName: \"kubernetes.io/projected/b18c0d03-fae3-48a2-a57d-b9dfc2f50983-kube-api-access-q5kbh\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm\" (UID: \"b18c0d03-fae3-48a2-a57d-b9dfc2f50983\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm" May 11 20:56:44.414211 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.414180 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b18c0d03-fae3-48a2-a57d-b9dfc2f50983-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm\" (UID: \"b18c0d03-fae3-48a2-a57d-b9dfc2f50983\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm" May 11 20:56:44.515168 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.515131 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5kbh\" (UniqueName: \"kubernetes.io/projected/b18c0d03-fae3-48a2-a57d-b9dfc2f50983-kube-api-access-q5kbh\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm\" (UID: \"b18c0d03-fae3-48a2-a57d-b9dfc2f50983\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm" May 11 20:56:44.515168 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.515166 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b18c0d03-fae3-48a2-a57d-b9dfc2f50983-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm\" (UID: \"b18c0d03-fae3-48a2-a57d-b9dfc2f50983\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm" May 11 20:56:44.515362 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.515210 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b18c0d03-fae3-48a2-a57d-b9dfc2f50983-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm\" (UID: \"b18c0d03-fae3-48a2-a57d-b9dfc2f50983\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm" May 11 20:56:44.515518 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.515503 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b18c0d03-fae3-48a2-a57d-b9dfc2f50983-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm\" (UID: \"b18c0d03-fae3-48a2-a57d-b9dfc2f50983\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm" May 11 20:56:44.515581 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.515564 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b18c0d03-fae3-48a2-a57d-b9dfc2f50983-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm\" (UID: \"b18c0d03-fae3-48a2-a57d-b9dfc2f50983\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm" May 11 20:56:44.528515 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.528489 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5kbh\" (UniqueName: \"kubernetes.io/projected/b18c0d03-fae3-48a2-a57d-b9dfc2f50983-kube-api-access-q5kbh\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm\" (UID: \"b18c0d03-fae3-48a2-a57d-b9dfc2f50983\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm" May 11 20:56:44.664644 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.664551 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-nq9kg"] May 11 20:56:44.668654 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.668620 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-nq9kg" May 11 20:56:44.671299 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.671277 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm" May 11 20:56:44.671448 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.671431 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-45sdp\"" May 11 20:56:44.672275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.671933 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" May 11 20:56:44.685319 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.685297 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-nq9kg"] May 11 20:56:44.716412 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.716376 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad358e12-1a73-4326-9235-915bfb8847bf-cert\") pod \"kserve-controller-manager-856948b99f-nq9kg\" (UID: \"ad358e12-1a73-4326-9235-915bfb8847bf\") " pod="opendatahub/kserve-controller-manager-856948b99f-nq9kg" May 11 20:56:44.716599 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.716571 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zv7r\" (UniqueName: \"kubernetes.io/projected/ad358e12-1a73-4326-9235-915bfb8847bf-kube-api-access-7zv7r\") pod \"kserve-controller-manager-856948b99f-nq9kg\" (UID: \"ad358e12-1a73-4326-9235-915bfb8847bf\") " pod="opendatahub/kserve-controller-manager-856948b99f-nq9kg" May 11 20:56:44.740513 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.740441 2567 scope.go:117] "RemoveContainer" containerID="1a960fbc77d0be13f1423b2318135206122994daf877fabf42528a5a7d9c6f50" May 11 20:56:44.740981 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:56:44.740678 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-7z6xc_opendatahub(e652aca3-bd36-4907-9e16-6be17cde2c16)\"" pod="opendatahub/odh-model-controller-858dbf95b8-7z6xc" podUID="e652aca3-bd36-4907-9e16-6be17cde2c16" May 11 20:56:44.817246 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.817218 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zv7r\" (UniqueName: \"kubernetes.io/projected/ad358e12-1a73-4326-9235-915bfb8847bf-kube-api-access-7zv7r\") pod \"kserve-controller-manager-856948b99f-nq9kg\" (UID: \"ad358e12-1a73-4326-9235-915bfb8847bf\") " pod="opendatahub/kserve-controller-manager-856948b99f-nq9kg" May 11 20:56:44.817384 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.817290 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad358e12-1a73-4326-9235-915bfb8847bf-cert\") pod \"kserve-controller-manager-856948b99f-nq9kg\" (UID: \"ad358e12-1a73-4326-9235-915bfb8847bf\") " pod="opendatahub/kserve-controller-manager-856948b99f-nq9kg" May 11 20:56:44.817426 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:56:44.817416 2567 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found May 11 20:56:44.817497 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:56:44.817486 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ad358e12-1a73-4326-9235-915bfb8847bf-cert podName:ad358e12-1a73-4326-9235-915bfb8847bf nodeName:}" failed. No retries permitted until 2026-05-11 20:56:45.317466343 +0000 UTC m=+383.427246769 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ad358e12-1a73-4326-9235-915bfb8847bf-cert") pod "kserve-controller-manager-856948b99f-nq9kg" (UID: "ad358e12-1a73-4326-9235-915bfb8847bf") : secret "kserve-webhook-server-cert" not found May 11 20:56:44.823990 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.823954 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm"] May 11 20:56:44.824879 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:56:44.824854 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb18c0d03_fae3_48a2_a57d_b9dfc2f50983.slice/crio-52f1d0d67293e4523475012f2305388265723e8066064b62696d389e027041cb WatchSource:0}: Error finding container 52f1d0d67293e4523475012f2305388265723e8066064b62696d389e027041cb: Status 404 returned error can't find the container with id 52f1d0d67293e4523475012f2305388265723e8066064b62696d389e027041cb May 11 20:56:44.830592 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:44.830558 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zv7r\" (UniqueName: \"kubernetes.io/projected/ad358e12-1a73-4326-9235-915bfb8847bf-kube-api-access-7zv7r\") pod \"kserve-controller-manager-856948b99f-nq9kg\" (UID: \"ad358e12-1a73-4326-9235-915bfb8847bf\") " pod="opendatahub/kserve-controller-manager-856948b99f-nq9kg" May 11 20:56:45.322114 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:45.322084 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad358e12-1a73-4326-9235-915bfb8847bf-cert\") pod \"kserve-controller-manager-856948b99f-nq9kg\" (UID: \"ad358e12-1a73-4326-9235-915bfb8847bf\") " pod="opendatahub/kserve-controller-manager-856948b99f-nq9kg" May 11 20:56:45.322269 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:56:45.322202 2567 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found May 11 20:56:45.322269 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:56:45.322259 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad358e12-1a73-4326-9235-915bfb8847bf-cert podName:ad358e12-1a73-4326-9235-915bfb8847bf nodeName:}" failed. No retries permitted until 2026-05-11 20:56:46.322244409 +0000 UTC m=+384.432024833 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ad358e12-1a73-4326-9235-915bfb8847bf-cert") pod "kserve-controller-manager-856948b99f-nq9kg" (UID: "ad358e12-1a73-4326-9235-915bfb8847bf") : secret "kserve-webhook-server-cert" not found May 11 20:56:45.744808 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:45.744734 2567 generic.go:358] "Generic (PLEG): container finished" podID="b18c0d03-fae3-48a2-a57d-b9dfc2f50983" containerID="a2b07d607094f51af84b9167e61fb1f15355027f150d12b0af3a6722c92fa320" exitCode=0 May 11 20:56:45.745242 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:45.744815 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm" event={"ID":"b18c0d03-fae3-48a2-a57d-b9dfc2f50983","Type":"ContainerDied","Data":"a2b07d607094f51af84b9167e61fb1f15355027f150d12b0af3a6722c92fa320"} May 11 20:56:45.745242 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:45.744847 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm" event={"ID":"b18c0d03-fae3-48a2-a57d-b9dfc2f50983","Type":"ContainerStarted","Data":"52f1d0d67293e4523475012f2305388265723e8066064b62696d389e027041cb"} May 11 20:56:46.240944 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:46.240914 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-mf5hx"] May 11 20:56:46.243870 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:46.243854 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-mf5hx" May 11 20:56:46.246840 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:46.246820 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" May 11 20:56:46.246987 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:46.246823 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" May 11 20:56:46.246987 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:46.246823 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-gv5h7\"" May 11 20:56:46.256409 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:46.256388 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-mf5hx"] May 11 20:56:46.329873 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:46.329846 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/9cce888a-3868-4a42-aa9d-c59259e11a77-operator-config\") pod \"servicemesh-operator3-55f49c5f94-mf5hx\" (UID: \"9cce888a-3868-4a42-aa9d-c59259e11a77\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-mf5hx" May 11 20:56:46.329985 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:46.329899 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8hq9\" (UniqueName: \"kubernetes.io/projected/9cce888a-3868-4a42-aa9d-c59259e11a77-kube-api-access-f8hq9\") pod \"servicemesh-operator3-55f49c5f94-mf5hx\" (UID: \"9cce888a-3868-4a42-aa9d-c59259e11a77\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-mf5hx" May 11 20:56:46.330042 ip-10-0-128-58 
kubenswrapper[2567]: I0511 20:56:46.329991 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad358e12-1a73-4326-9235-915bfb8847bf-cert\") pod \"kserve-controller-manager-856948b99f-nq9kg\" (UID: \"ad358e12-1a73-4326-9235-915bfb8847bf\") " pod="opendatahub/kserve-controller-manager-856948b99f-nq9kg" May 11 20:56:46.332230 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:46.332203 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad358e12-1a73-4326-9235-915bfb8847bf-cert\") pod \"kserve-controller-manager-856948b99f-nq9kg\" (UID: \"ad358e12-1a73-4326-9235-915bfb8847bf\") " pod="opendatahub/kserve-controller-manager-856948b99f-nq9kg" May 11 20:56:46.430800 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:46.430775 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/9cce888a-3868-4a42-aa9d-c59259e11a77-operator-config\") pod \"servicemesh-operator3-55f49c5f94-mf5hx\" (UID: \"9cce888a-3868-4a42-aa9d-c59259e11a77\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-mf5hx" May 11 20:56:46.430892 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:46.430816 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8hq9\" (UniqueName: \"kubernetes.io/projected/9cce888a-3868-4a42-aa9d-c59259e11a77-kube-api-access-f8hq9\") pod \"servicemesh-operator3-55f49c5f94-mf5hx\" (UID: \"9cce888a-3868-4a42-aa9d-c59259e11a77\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-mf5hx" May 11 20:56:46.433230 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:46.433207 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/9cce888a-3868-4a42-aa9d-c59259e11a77-operator-config\") pod \"servicemesh-operator3-55f49c5f94-mf5hx\" (UID: \"9cce888a-3868-4a42-aa9d-c59259e11a77\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-mf5hx" May 11 20:56:46.451074 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:46.451042 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8hq9\" (UniqueName: \"kubernetes.io/projected/9cce888a-3868-4a42-aa9d-c59259e11a77-kube-api-access-f8hq9\") pod \"servicemesh-operator3-55f49c5f94-mf5hx\" (UID: \"9cce888a-3868-4a42-aa9d-c59259e11a77\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-mf5hx" May 11 20:56:46.497389 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:46.497332 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-nq9kg" May 11 20:56:46.552513 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:46.552491 2567 util.go:30] "No sandbox for pod can be found. 
May 11 20:56:46.668806 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:46.668776 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-nq9kg"]
May 11 20:56:46.681880 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:56:46.681847 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad358e12_1a73_4326_9235_915bfb8847bf.slice/crio-97bfd0b1326e2a214a7a9371754c1a5577896c57a53fe5cf53231bf0b5341639 WatchSource:0}: Error finding container 97bfd0b1326e2a214a7a9371754c1a5577896c57a53fe5cf53231bf0b5341639: Status 404 returned error can't find the container with id 97bfd0b1326e2a214a7a9371754c1a5577896c57a53fe5cf53231bf0b5341639
May 11 20:56:46.699213 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:46.699193 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-mf5hx"]
May 11 20:56:46.702804 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:56:46.702782 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cce888a_3868_4a42_aa9d_c59259e11a77.slice/crio-caa536585aaa1c09ac0a631cf7c2affd7dc4f449d5a56286580ac6ea31534437 WatchSource:0}: Error finding container caa536585aaa1c09ac0a631cf7c2affd7dc4f449d5a56286580ac6ea31534437: Status 404 returned error can't find the container with id caa536585aaa1c09ac0a631cf7c2affd7dc4f449d5a56286580ac6ea31534437
May 11 20:56:46.749379 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:46.749321 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-nq9kg" event={"ID":"ad358e12-1a73-4326-9235-915bfb8847bf","Type":"ContainerStarted","Data":"97bfd0b1326e2a214a7a9371754c1a5577896c57a53fe5cf53231bf0b5341639"}
May 11 20:56:46.750583 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:46.750562 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-mf5hx" event={"ID":"9cce888a-3868-4a42-aa9d-c59259e11a77","Type":"ContainerStarted","Data":"caa536585aaa1c09ac0a631cf7c2affd7dc4f449d5a56286580ac6ea31534437"}
May 11 20:56:46.751968 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:46.751937 2567 generic.go:358] "Generic (PLEG): container finished" podID="b18c0d03-fae3-48a2-a57d-b9dfc2f50983" containerID="d52ac9fcb968b6f63d70e15f87b1229db15ba781fc81932821de5dd086264b89" exitCode=0
May 11 20:56:46.752058 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:46.752008 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm" event={"ID":"b18c0d03-fae3-48a2-a57d-b9dfc2f50983","Type":"ContainerDied","Data":"d52ac9fcb968b6f63d70e15f87b1229db15ba781fc81932821de5dd086264b89"}
May 11 20:56:47.757761 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:47.757723 2567 generic.go:358] "Generic (PLEG): container finished" podID="b18c0d03-fae3-48a2-a57d-b9dfc2f50983" containerID="ba5a9fb108bfa3e9e08d8aaafc345aa2fa21383160d67c9bc8b013fc04dcc6bc" exitCode=0
May 11 20:56:47.758203 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:47.757777 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm"
event={"ID":"b18c0d03-fae3-48a2-a57d-b9dfc2f50983","Type":"ContainerDied","Data":"ba5a9fb108bfa3e9e08d8aaafc345aa2fa21383160d67c9bc8b013fc04dcc6bc"} May 11 20:56:49.141873 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:49.141851 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm" May 11 20:56:49.254012 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:49.253978 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b18c0d03-fae3-48a2-a57d-b9dfc2f50983-util\") pod \"b18c0d03-fae3-48a2-a57d-b9dfc2f50983\" (UID: \"b18c0d03-fae3-48a2-a57d-b9dfc2f50983\") " May 11 20:56:49.254191 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:49.254040 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5kbh\" (UniqueName: \"kubernetes.io/projected/b18c0d03-fae3-48a2-a57d-b9dfc2f50983-kube-api-access-q5kbh\") pod \"b18c0d03-fae3-48a2-a57d-b9dfc2f50983\" (UID: \"b18c0d03-fae3-48a2-a57d-b9dfc2f50983\") " May 11 20:56:49.254191 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:49.254089 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b18c0d03-fae3-48a2-a57d-b9dfc2f50983-bundle\") pod \"b18c0d03-fae3-48a2-a57d-b9dfc2f50983\" (UID: \"b18c0d03-fae3-48a2-a57d-b9dfc2f50983\") " May 11 20:56:49.255051 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:49.255019 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b18c0d03-fae3-48a2-a57d-b9dfc2f50983-bundle" (OuterVolumeSpecName: "bundle") pod "b18c0d03-fae3-48a2-a57d-b9dfc2f50983" (UID: "b18c0d03-fae3-48a2-a57d-b9dfc2f50983"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" May 11 20:56:49.256349 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:49.256324 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b18c0d03-fae3-48a2-a57d-b9dfc2f50983-kube-api-access-q5kbh" (OuterVolumeSpecName: "kube-api-access-q5kbh") pod "b18c0d03-fae3-48a2-a57d-b9dfc2f50983" (UID: "b18c0d03-fae3-48a2-a57d-b9dfc2f50983"). InnerVolumeSpecName "kube-api-access-q5kbh". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:56:49.260580 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:49.260544 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b18c0d03-fae3-48a2-a57d-b9dfc2f50983-util" (OuterVolumeSpecName: "util") pod "b18c0d03-fae3-48a2-a57d-b9dfc2f50983" (UID: "b18c0d03-fae3-48a2-a57d-b9dfc2f50983"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
May 11 20:56:49.327694 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:49.327661 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-7z6xc"
May 11 20:56:49.328122 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:49.328107 2567 scope.go:117] "RemoveContainer" containerID="1a960fbc77d0be13f1423b2318135206122994daf877fabf42528a5a7d9c6f50"
May 11 20:56:49.328334 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:56:49.328316 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-7z6xc_opendatahub(e652aca3-bd36-4907-9e16-6be17cde2c16)\"" pod="opendatahub/odh-model-controller-858dbf95b8-7z6xc" podUID="e652aca3-bd36-4907-9e16-6be17cde2c16"
May 11 20:56:49.355596 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:49.355573 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b18c0d03-fae3-48a2-a57d-b9dfc2f50983-util\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\""
May 11 20:56:49.355700 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:49.355601 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q5kbh\" (UniqueName: \"kubernetes.io/projected/b18c0d03-fae3-48a2-a57d-b9dfc2f50983-kube-api-access-q5kbh\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\""
May 11 20:56:49.355700 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:49.355645 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b18c0d03-fae3-48a2-a57d-b9dfc2f50983-bundle\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\""
May 11 20:56:49.768056 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:49.768012 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm" event={"ID":"b18c0d03-fae3-48a2-a57d-b9dfc2f50983","Type":"ContainerDied","Data":"52f1d0d67293e4523475012f2305388265723e8066064b62696d389e027041cb"}
May 11 20:56:49.768056 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:49.768061 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52f1d0d67293e4523475012f2305388265723e8066064b62696d389e027041cb"
May 11 20:56:49.768272 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:49.768029 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebfxssm"
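
The pod_workers error above ("back-off 10s restarting failed container=manager") is the first step of the kubelet's CrashLoopBackOff schedule: the restart delay starts at 10s and doubles on each subsequent failure up to a 5-minute cap, resetting once a container has run cleanly for 10 minutes. A simplified sketch of that schedule follows; the kubelet's actual implementation also adds jitter.

// Simplified sketch of the kubelet's CrashLoopBackOff schedule: 10s initial
// delay, doubling per failed restart, capped at 5m. Not the kubelet's real
// code path, which also applies jitter and a reset rule.
package main

import (
	"fmt"
	"time"
)

func crashLoopDelay(restarts int) time.Duration {
	d := 10 * time.Second
	for i := 0; i < restarts; i++ {
		d *= 2
		if d >= 5*time.Minute {
			return 5 * time.Minute
		}
	}
	return d
}

func main() {
	for r := 0; r <= 6; r++ {
		fmt.Printf("restart %d -> back-off %v\n", r, crashLoopDelay(r))
	}
	// restart 0 -> back-off 10s  (matches "back-off 10s" in the log)
	// restart 5 -> back-off 5m0s (capped)
}
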
May 11 20:56:50.772707 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:50.772674 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-nq9kg" event={"ID":"ad358e12-1a73-4326-9235-915bfb8847bf","Type":"ContainerStarted","Data":"e0456a66d3462b39a4e01639b42c0ca8626ce768faff6c03930e19d62ab5927b"}
May 11 20:56:50.773171 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:50.772806 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-nq9kg"
May 11 20:56:50.774207 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:50.774185 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-mf5hx" event={"ID":"9cce888a-3868-4a42-aa9d-c59259e11a77","Type":"ContainerStarted","Data":"5ffd2ba7635dbfda5894494db9bd78fe24fdf2bfedf1dd72a3529cba73481e6c"}
May 11 20:56:50.774330 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:50.774274 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-mf5hx"
May 11 20:56:50.789738 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:50.789695 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-nq9kg" podStartSLOduration=3.372380254 podStartE2EDuration="6.789682488s" podCreationTimestamp="2026-05-11 20:56:44 +0000 UTC" firstStartedPulling="2026-05-11 20:56:46.683437379 +0000 UTC m=+384.793217802" lastFinishedPulling="2026-05-11 20:56:50.10073961 +0000 UTC m=+388.210520036" observedRunningTime="2026-05-11 20:56:50.788471991 +0000 UTC m=+388.898252448" watchObservedRunningTime="2026-05-11 20:56:50.789682488 +0000 UTC m=+388.899462933"
May 11 20:56:50.807081 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:50.807037 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-mf5hx" podStartSLOduration=1.407951998 podStartE2EDuration="4.80702353s" podCreationTimestamp="2026-05-11 20:56:46 +0000 UTC" firstStartedPulling="2026-05-11 20:56:46.705131225 +0000 UTC m=+384.814911647" lastFinishedPulling="2026-05-11 20:56:50.104202756 +0000 UTC m=+388.213983179" observedRunningTime="2026-05-11 20:56:50.804174686 +0000 UTC m=+388.913955131" watchObservedRunningTime="2026-05-11 20:56:50.80702353 +0000 UTC m=+388.916803976"
May 11 20:56:53.773445 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.773406 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v"]
May 11 20:56:53.773888 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.773856 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b18c0d03-fae3-48a2-a57d-b9dfc2f50983" containerName="util"
May 11 20:56:53.773888 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.773873 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="b18c0d03-fae3-48a2-a57d-b9dfc2f50983" containerName="util"
May 11 20:56:53.774075 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.773902 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b18c0d03-fae3-48a2-a57d-b9dfc2f50983" containerName="extract"
May 11 20:56:53.774075 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.773911 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="b18c0d03-fae3-48a2-a57d-b9dfc2f50983" containerName="extract"
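
The two pod_startup_latency_tracker entries above encode the relation podStartSLOduration = podStartE2EDuration - (lastFinishedPulling - firstStartedPulling): the SLO figure is the end-to-end startup latency minus the time spent pulling images, so registry speed does not count against the kubelet. Reproducing the kserve-controller-manager numbers using the monotonic (m=+) offsets from the log:

// Worked check of the startup-duration arithmetic logged above
// (values in seconds, taken verbatim from the log entry).
package main

import "fmt"

func main() {
	e2e := 6.789682488                    // podStartE2EDuration: observedRunningTime - podCreationTimestamp
	pull := 388.210520036 - 384.793217802 // lastFinishedPulling - firstStartedPulling (m=+ offsets)
	fmt.Printf("podStartSLOduration = %.9f\n", e2e-pull)
	// prints 3.372380254, matching podStartSLOduration in the log
}
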
assignment" podUID="b18c0d03-fae3-48a2-a57d-b9dfc2f50983" containerName="extract" May 11 20:56:53.774075 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.773928 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b18c0d03-fae3-48a2-a57d-b9dfc2f50983" containerName="pull" May 11 20:56:53.774075 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.773936 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="b18c0d03-fae3-48a2-a57d-b9dfc2f50983" containerName="pull" May 11 20:56:53.774075 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.774037 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="b18c0d03-fae3-48a2-a57d-b9dfc2f50983" containerName="extract" May 11 20:56:53.777271 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.777252 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" May 11 20:56:53.779883 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.779863 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" May 11 20:56:53.779883 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.779872 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" May 11 20:56:53.779883 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.779878 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-gw-ca-root-cert\"" May 11 20:56:53.780176 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.780154 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" May 11 20:56:53.780250 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.780212 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-746vw\"" May 11 20:56:53.792291 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.792267 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v"] May 11 20:56:53.887907 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.887881 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e4b31fbc-0442-401c-b32b-ba68a2bf000b-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-798958bb55-2c69v\" (UID: \"e4b31fbc-0442-401c-b32b-ba68a2bf000b\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" May 11 20:56:53.888058 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.887917 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e4b31fbc-0442-401c-b32b-ba68a2bf000b-istio-kubeconfig\") pod \"istiod-openshift-gateway-798958bb55-2c69v\" (UID: \"e4b31fbc-0442-401c-b32b-ba68a2bf000b\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" May 11 20:56:53.888058 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.887992 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e4b31fbc-0442-401c-b32b-ba68a2bf000b-cacerts\") pod \"istiod-openshift-gateway-798958bb55-2c69v\" (UID: \"e4b31fbc-0442-401c-b32b-ba68a2bf000b\") " 
pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" May 11 20:56:53.888058 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.888024 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e4b31fbc-0442-401c-b32b-ba68a2bf000b-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-798958bb55-2c69v\" (UID: \"e4b31fbc-0442-401c-b32b-ba68a2bf000b\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" May 11 20:56:53.888058 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.888049 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e4b31fbc-0442-401c-b32b-ba68a2bf000b-istio-token\") pod \"istiod-openshift-gateway-798958bb55-2c69v\" (UID: \"e4b31fbc-0442-401c-b32b-ba68a2bf000b\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" May 11 20:56:53.888256 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.888086 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e4b31fbc-0442-401c-b32b-ba68a2bf000b-local-certs\") pod \"istiod-openshift-gateway-798958bb55-2c69v\" (UID: \"e4b31fbc-0442-401c-b32b-ba68a2bf000b\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" May 11 20:56:53.888256 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.888105 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m445h\" (UniqueName: \"kubernetes.io/projected/e4b31fbc-0442-401c-b32b-ba68a2bf000b-kube-api-access-m445h\") pod \"istiod-openshift-gateway-798958bb55-2c69v\" (UID: \"e4b31fbc-0442-401c-b32b-ba68a2bf000b\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" May 11 20:56:53.988917 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.988885 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e4b31fbc-0442-401c-b32b-ba68a2bf000b-cacerts\") pod \"istiod-openshift-gateway-798958bb55-2c69v\" (UID: \"e4b31fbc-0442-401c-b32b-ba68a2bf000b\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" May 11 20:56:53.988917 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.988916 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e4b31fbc-0442-401c-b32b-ba68a2bf000b-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-798958bb55-2c69v\" (UID: \"e4b31fbc-0442-401c-b32b-ba68a2bf000b\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" May 11 20:56:53.989150 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.988939 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e4b31fbc-0442-401c-b32b-ba68a2bf000b-istio-token\") pod \"istiod-openshift-gateway-798958bb55-2c69v\" (UID: \"e4b31fbc-0442-401c-b32b-ba68a2bf000b\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" May 11 20:56:53.989150 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.988995 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e4b31fbc-0442-401c-b32b-ba68a2bf000b-local-certs\") pod \"istiod-openshift-gateway-798958bb55-2c69v\" (UID: 
\"e4b31fbc-0442-401c-b32b-ba68a2bf000b\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" May 11 20:56:53.989150 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.989022 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m445h\" (UniqueName: \"kubernetes.io/projected/e4b31fbc-0442-401c-b32b-ba68a2bf000b-kube-api-access-m445h\") pod \"istiod-openshift-gateway-798958bb55-2c69v\" (UID: \"e4b31fbc-0442-401c-b32b-ba68a2bf000b\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" May 11 20:56:53.989150 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.989061 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e4b31fbc-0442-401c-b32b-ba68a2bf000b-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-798958bb55-2c69v\" (UID: \"e4b31fbc-0442-401c-b32b-ba68a2bf000b\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" May 11 20:56:53.989150 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.989100 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e4b31fbc-0442-401c-b32b-ba68a2bf000b-istio-kubeconfig\") pod \"istiod-openshift-gateway-798958bb55-2c69v\" (UID: \"e4b31fbc-0442-401c-b32b-ba68a2bf000b\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" May 11 20:56:53.989774 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.989745 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e4b31fbc-0442-401c-b32b-ba68a2bf000b-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-798958bb55-2c69v\" (UID: \"e4b31fbc-0442-401c-b32b-ba68a2bf000b\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" May 11 20:56:53.991410 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.991388 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e4b31fbc-0442-401c-b32b-ba68a2bf000b-cacerts\") pod \"istiod-openshift-gateway-798958bb55-2c69v\" (UID: \"e4b31fbc-0442-401c-b32b-ba68a2bf000b\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" May 11 20:56:53.991616 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.991593 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e4b31fbc-0442-401c-b32b-ba68a2bf000b-local-certs\") pod \"istiod-openshift-gateway-798958bb55-2c69v\" (UID: \"e4b31fbc-0442-401c-b32b-ba68a2bf000b\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" May 11 20:56:53.991773 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.991757 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e4b31fbc-0442-401c-b32b-ba68a2bf000b-istio-kubeconfig\") pod \"istiod-openshift-gateway-798958bb55-2c69v\" (UID: \"e4b31fbc-0442-401c-b32b-ba68a2bf000b\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" May 11 20:56:53.991873 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.991851 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e4b31fbc-0442-401c-b32b-ba68a2bf000b-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-798958bb55-2c69v\" (UID: 
\"e4b31fbc-0442-401c-b32b-ba68a2bf000b\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" May 11 20:56:53.998314 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.998291 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e4b31fbc-0442-401c-b32b-ba68a2bf000b-istio-token\") pod \"istiod-openshift-gateway-798958bb55-2c69v\" (UID: \"e4b31fbc-0442-401c-b32b-ba68a2bf000b\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" May 11 20:56:53.998488 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:53.998472 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m445h\" (UniqueName: \"kubernetes.io/projected/e4b31fbc-0442-401c-b32b-ba68a2bf000b-kube-api-access-m445h\") pod \"istiod-openshift-gateway-798958bb55-2c69v\" (UID: \"e4b31fbc-0442-401c-b32b-ba68a2bf000b\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" May 11 20:56:54.087136 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:54.087112 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" May 11 20:56:54.220615 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:54.220584 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v"] May 11 20:56:54.226851 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:56:54.226808 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4b31fbc_0442_401c_b32b_ba68a2bf000b.slice/crio-34af9a128e62602ea2df0f9a028c2c62702c7f6ff505965a2440978c146a6ab2 WatchSource:0}: Error finding container 34af9a128e62602ea2df0f9a028c2c62702c7f6ff505965a2440978c146a6ab2: Status 404 returned error can't find the container with id 34af9a128e62602ea2df0f9a028c2c62702c7f6ff505965a2440978c146a6ab2 May 11 20:56:54.790016 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:54.789977 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" event={"ID":"e4b31fbc-0442-401c-b32b-ba68a2bf000b","Type":"ContainerStarted","Data":"34af9a128e62602ea2df0f9a028c2c62702c7f6ff505965a2440978c146a6ab2"} May 11 20:56:56.837311 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:56.837275 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} May 11 20:56:56.837521 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:56.837343 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} May 11 20:56:57.803846 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:57.803798 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" event={"ID":"e4b31fbc-0442-401c-b32b-ba68a2bf000b","Type":"ContainerStarted","Data":"f390ed064e384595df927be3b32a7c8b9e28c1fa92f97427c8034ad3b9da8636"} May 11 20:56:57.804096 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:57.804077 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" May 11 20:56:57.806310 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:57.806284 2567 patch_prober.go:28] interesting 
pod/istiod-openshift-gateway-798958bb55-2c69v container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
May 11 20:56:57.806559 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:57.806505 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" podUID="e4b31fbc-0442-401c-b32b-ba68a2bf000b" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
May 11 20:56:57.831349 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:57.831293 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v" podStartSLOduration=2.222880021 podStartE2EDuration="4.831275929s" podCreationTimestamp="2026-05-11 20:56:53 +0000 UTC" firstStartedPulling="2026-05-11 20:56:54.22866598 +0000 UTC m=+392.338446402" lastFinishedPulling="2026-05-11 20:56:56.837061883 +0000 UTC m=+394.946842310" observedRunningTime="2026-05-11 20:56:57.828598723 +0000 UTC m=+395.938379168" watchObservedRunningTime="2026-05-11 20:56:57.831275929 +0000 UTC m=+395.941056371"
May 11 20:56:58.807848 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:58.807813 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-798958bb55-2c69v"
May 11 20:56:59.327152 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:59.327121 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-7z6xc"
May 11 20:56:59.327828 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:59.327808 2567 scope.go:117] "RemoveContainer" containerID="1a960fbc77d0be13f1423b2318135206122994daf877fabf42528a5a7d9c6f50"
May 11 20:56:59.812877 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:59.812836 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-7z6xc" event={"ID":"e652aca3-bd36-4907-9e16-6be17cde2c16","Type":"ContainerStarted","Data":"cbf38977822232c0f1a6ebd10f69c704139e62fd3b28cbb53eeb596c854fb3b2"}
May 11 20:56:59.832237 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:56:59.832148 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-7z6xc" podStartSLOduration=1.702516865 podStartE2EDuration="21.832134796s" podCreationTimestamp="2026-05-11 20:56:38 +0000 UTC" firstStartedPulling="2026-05-11 20:56:39.463693733 +0000 UTC m=+377.573474156" lastFinishedPulling="2026-05-11 20:56:59.593311664 +0000 UTC m=+397.703092087" observedRunningTime="2026-05-11 20:56:59.830475174 +0000 UTC m=+397.940255621" watchObservedRunningTime="2026-05-11 20:56:59.832134796 +0000 UTC m=+397.941915241"
May 11 20:57:01.779804 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:01.779762 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-mf5hx"
May 11 20:57:09.813531 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:09.813497 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-7z6xc"
May 11 20:57:09.815228 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:09.815207 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-7z6xc"
May 11 20:57:21.782594 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:21.782560 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-nq9kg"
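
The patch_prober/prober lines above show the kubelet's HTTP readiness check against istiod's discovery container: any status code in [200, 400) counts as success, so the 503 at 20:56:57 fails the probe, and the pod flips to ready one second later once the endpoint starts answering. A stripped-down sketch of that success rule follows; the URL is illustrative, since the probe's actual port and path are not in the log.

// Sketch of the success rule the kubelet's HTTP prober applies: any status
// in [200, 400) passes; anything else (like the 503 above) fails the probe.
package main

import (
	"fmt"
	"net/http"
)

func probeOnce(url string) (bool, string) {
	resp, err := http.Get(url)
	if err != nil {
		return false, err.Error()
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		return true, fmt.Sprintf("HTTP probe succeeded with statuscode: %d", resp.StatusCode)
	}
	return false, fmt.Sprintf("HTTP probe failed with statuscode: %d", resp.StatusCode)
}

func main() {
	// Hypothetical endpoint, for illustration only.
	ok, out := probeOnce("http://127.0.0.1:8080/ready")
	fmt.Println(ok, out)
}
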
May 11 20:57:36.546921 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:36.546848 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-fbhxs"]
May 11 20:57:36.549871 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:36.549855 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-fbhxs"
May 11 20:57:36.552308 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:36.552285 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
May 11 20:57:36.552427 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:36.552399 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-b8n89\""
May 11 20:57:36.552485 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:36.552432 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
May 11 20:57:36.557742 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:36.557718 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-fbhxs"]
May 11 20:57:36.616667 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:36.616640 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzfbc\" (UniqueName: \"kubernetes.io/projected/8c3c8e91-d4a1-4e93-a78d-1b52efbc3163-kube-api-access-fzfbc\") pod \"kuadrant-operator-catalog-fbhxs\" (UID: \"8c3c8e91-d4a1-4e93-a78d-1b52efbc3163\") " pod="kuadrant-system/kuadrant-operator-catalog-fbhxs"
May 11 20:57:36.717024 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:36.716996 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fzfbc\" (UniqueName: \"kubernetes.io/projected/8c3c8e91-d4a1-4e93-a78d-1b52efbc3163-kube-api-access-fzfbc\") pod \"kuadrant-operator-catalog-fbhxs\" (UID: \"8c3c8e91-d4a1-4e93-a78d-1b52efbc3163\") " pod="kuadrant-system/kuadrant-operator-catalog-fbhxs"
May 11 20:57:36.725410 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:36.725381 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzfbc\" (UniqueName: \"kubernetes.io/projected/8c3c8e91-d4a1-4e93-a78d-1b52efbc3163-kube-api-access-fzfbc\") pod \"kuadrant-operator-catalog-fbhxs\" (UID: \"8c3c8e91-d4a1-4e93-a78d-1b52efbc3163\") " pod="kuadrant-system/kuadrant-operator-catalog-fbhxs"
May 11 20:57:36.859721 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:36.859654 2567 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-fbhxs" May 11 20:57:36.988045 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:36.988001 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-fbhxs"] May 11 20:57:36.991736 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:57:36.991706 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c3c8e91_d4a1_4e93_a78d_1b52efbc3163.slice/crio-1371ac6ba1e79e510cde5750badb155b2bc5aa26c718aa66351e85ce020a62c2 WatchSource:0}: Error finding container 1371ac6ba1e79e510cde5750badb155b2bc5aa26c718aa66351e85ce020a62c2: Status 404 returned error can't find the container with id 1371ac6ba1e79e510cde5750badb155b2bc5aa26c718aa66351e85ce020a62c2 May 11 20:57:37.949387 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:37.949347 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-fbhxs" event={"ID":"8c3c8e91-d4a1-4e93-a78d-1b52efbc3163","Type":"ContainerStarted","Data":"1371ac6ba1e79e510cde5750badb155b2bc5aa26c718aa66351e85ce020a62c2"} May 11 20:57:39.956781 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:39.956748 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-fbhxs" event={"ID":"8c3c8e91-d4a1-4e93-a78d-1b52efbc3163","Type":"ContainerStarted","Data":"579adbe7b6de00be186991c7a0f6d241da142889eae9537085419d4efae4cd03"} May 11 20:57:39.972655 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:39.972613 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-fbhxs" podStartSLOduration=1.991717266 podStartE2EDuration="3.972599426s" podCreationTimestamp="2026-05-11 20:57:36 +0000 UTC" firstStartedPulling="2026-05-11 20:57:36.992886695 +0000 UTC m=+435.102667118" lastFinishedPulling="2026-05-11 20:57:38.973768837 +0000 UTC m=+437.083549278" observedRunningTime="2026-05-11 20:57:39.971132288 +0000 UTC m=+438.080912755" watchObservedRunningTime="2026-05-11 20:57:39.972599426 +0000 UTC m=+438.082379871" May 11 20:57:46.860358 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:46.860322 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-fbhxs" May 11 20:57:46.860744 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:46.860368 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-fbhxs" May 11 20:57:46.882190 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:46.882166 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-fbhxs" May 11 20:57:47.001347 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:47.001321 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-fbhxs" May 11 20:57:48.580888 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:48.580850 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9"] May 11 20:57:48.584584 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:48.584564 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9" May 11 20:57:48.587018 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:48.586998 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-8ksxs\"" May 11 20:57:48.591747 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:48.591725 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9"] May 11 20:57:48.601793 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:48.601769 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6301480a-ae94-4125-9ba1-97b9d055b32c-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9\" (UID: \"6301480a-ae94-4125-9ba1-97b9d055b32c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9" May 11 20:57:48.601919 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:48.601846 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rd22\" (UniqueName: \"kubernetes.io/projected/6301480a-ae94-4125-9ba1-97b9d055b32c-kube-api-access-7rd22\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9\" (UID: \"6301480a-ae94-4125-9ba1-97b9d055b32c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9" May 11 20:57:48.602002 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:48.601949 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6301480a-ae94-4125-9ba1-97b9d055b32c-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9\" (UID: \"6301480a-ae94-4125-9ba1-97b9d055b32c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9" May 11 20:57:48.703222 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:48.703189 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6301480a-ae94-4125-9ba1-97b9d055b32c-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9\" (UID: \"6301480a-ae94-4125-9ba1-97b9d055b32c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9" May 11 20:57:48.703377 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:48.703268 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6301480a-ae94-4125-9ba1-97b9d055b32c-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9\" (UID: \"6301480a-ae94-4125-9ba1-97b9d055b32c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9" May 11 20:57:48.703377 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:48.703308 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7rd22\" (UniqueName: \"kubernetes.io/projected/6301480a-ae94-4125-9ba1-97b9d055b32c-kube-api-access-7rd22\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9\" (UID: \"6301480a-ae94-4125-9ba1-97b9d055b32c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9" May 11 20:57:48.703523 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:48.703504 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6301480a-ae94-4125-9ba1-97b9d055b32c-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9\" (UID: \"6301480a-ae94-4125-9ba1-97b9d055b32c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9" May 11 20:57:48.703597 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:48.703576 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6301480a-ae94-4125-9ba1-97b9d055b32c-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9\" (UID: \"6301480a-ae94-4125-9ba1-97b9d055b32c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9" May 11 20:57:48.711889 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:48.711864 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rd22\" (UniqueName: \"kubernetes.io/projected/6301480a-ae94-4125-9ba1-97b9d055b32c-kube-api-access-7rd22\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9\" (UID: \"6301480a-ae94-4125-9ba1-97b9d055b32c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9" May 11 20:57:48.894271 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:48.894203 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9" May 11 20:57:49.024160 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.024131 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9"] May 11 20:57:49.027395 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:57:49.027366 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6301480a_ae94_4125_9ba1_97b9d055b32c.slice/crio-32c5eb678500f435b3c372d1999e91564380de75fcbcc734456e0b1c90c222c0 WatchSource:0}: Error finding container 32c5eb678500f435b3c372d1999e91564380de75fcbcc734456e0b1c90c222c0: Status 404 returned error can't find the container with id 32c5eb678500f435b3c372d1999e91564380de75fcbcc734456e0b1c90c222c0 May 11 20:57:49.178536 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.178454 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf"] May 11 20:57:49.181664 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.181648 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf" May 11 20:57:49.189713 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.189687 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf"] May 11 20:57:49.208308 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.208287 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53246992-50b7-4b5a-96c4-a77728a0a4c9-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf\" (UID: \"53246992-50b7-4b5a-96c4-a77728a0a4c9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf" May 11 20:57:49.208414 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.208322 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53246992-50b7-4b5a-96c4-a77728a0a4c9-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf\" (UID: \"53246992-50b7-4b5a-96c4-a77728a0a4c9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf" May 11 20:57:49.208414 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.208362 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmdh6\" (UniqueName: \"kubernetes.io/projected/53246992-50b7-4b5a-96c4-a77728a0a4c9-kube-api-access-pmdh6\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf\" (UID: \"53246992-50b7-4b5a-96c4-a77728a0a4c9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf" May 11 20:57:49.309227 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.309201 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53246992-50b7-4b5a-96c4-a77728a0a4c9-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf\" (UID: \"53246992-50b7-4b5a-96c4-a77728a0a4c9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf" May 11 20:57:49.309331 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.309238 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53246992-50b7-4b5a-96c4-a77728a0a4c9-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf\" (UID: \"53246992-50b7-4b5a-96c4-a77728a0a4c9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf" May 11 20:57:49.309331 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.309272 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmdh6\" (UniqueName: \"kubernetes.io/projected/53246992-50b7-4b5a-96c4-a77728a0a4c9-kube-api-access-pmdh6\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf\" (UID: \"53246992-50b7-4b5a-96c4-a77728a0a4c9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf" May 11 20:57:49.309569 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.309552 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53246992-50b7-4b5a-96c4-a77728a0a4c9-bundle\") pod 
\"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf\" (UID: \"53246992-50b7-4b5a-96c4-a77728a0a4c9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf" May 11 20:57:49.309612 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.309580 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53246992-50b7-4b5a-96c4-a77728a0a4c9-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf\" (UID: \"53246992-50b7-4b5a-96c4-a77728a0a4c9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf" May 11 20:57:49.317805 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.317784 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmdh6\" (UniqueName: \"kubernetes.io/projected/53246992-50b7-4b5a-96c4-a77728a0a4c9-kube-api-access-pmdh6\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf\" (UID: \"53246992-50b7-4b5a-96c4-a77728a0a4c9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf" May 11 20:57:49.491227 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.491159 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf" May 11 20:57:49.624367 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.624344 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf"] May 11 20:57:49.625904 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:57:49.625875 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53246992_50b7_4b5a_96c4_a77728a0a4c9.slice/crio-b00b90ccb41eb2e7a1eccfff4937ad9db96ba4b8bd853fe5de8f8823147c0230 WatchSource:0}: Error finding container b00b90ccb41eb2e7a1eccfff4937ad9db96ba4b8bd853fe5de8f8823147c0230: Status 404 returned error can't find the container with id b00b90ccb41eb2e7a1eccfff4937ad9db96ba4b8bd853fe5de8f8823147c0230 May 11 20:57:49.776984 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.776901 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw"] May 11 20:57:49.780418 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.780400 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw" May 11 20:57:49.793725 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.793693 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw"] May 11 20:57:49.813442 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.813419 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0422a798-ac83-4b60-bfee-01964e487d82-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw\" (UID: \"0422a798-ac83-4b60-bfee-01964e487d82\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw" May 11 20:57:49.813554 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.813462 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0422a798-ac83-4b60-bfee-01964e487d82-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw\" (UID: \"0422a798-ac83-4b60-bfee-01964e487d82\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw" May 11 20:57:49.813554 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.813527 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkd2b\" (UniqueName: \"kubernetes.io/projected/0422a798-ac83-4b60-bfee-01964e487d82-kube-api-access-fkd2b\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw\" (UID: \"0422a798-ac83-4b60-bfee-01964e487d82\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw" May 11 20:57:49.914338 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.914313 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0422a798-ac83-4b60-bfee-01964e487d82-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw\" (UID: \"0422a798-ac83-4b60-bfee-01964e487d82\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw" May 11 20:57:49.914476 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.914362 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0422a798-ac83-4b60-bfee-01964e487d82-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw\" (UID: \"0422a798-ac83-4b60-bfee-01964e487d82\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw" May 11 20:57:49.914476 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.914400 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkd2b\" (UniqueName: \"kubernetes.io/projected/0422a798-ac83-4b60-bfee-01964e487d82-kube-api-access-fkd2b\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw\" (UID: \"0422a798-ac83-4b60-bfee-01964e487d82\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw" May 11 20:57:49.914780 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.914760 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0422a798-ac83-4b60-bfee-01964e487d82-bundle\") pod 
\"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw\" (UID: \"0422a798-ac83-4b60-bfee-01964e487d82\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw" May 11 20:57:49.914847 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.914775 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0422a798-ac83-4b60-bfee-01964e487d82-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw\" (UID: \"0422a798-ac83-4b60-bfee-01964e487d82\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw" May 11 20:57:49.922355 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.922333 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkd2b\" (UniqueName: \"kubernetes.io/projected/0422a798-ac83-4b60-bfee-01964e487d82-kube-api-access-fkd2b\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw\" (UID: \"0422a798-ac83-4b60-bfee-01964e487d82\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw" May 11 20:57:49.993263 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.993229 2567 generic.go:358] "Generic (PLEG): container finished" podID="53246992-50b7-4b5a-96c4-a77728a0a4c9" containerID="bb22f8fb82aff570febe5b2dc4468a244236baa067a22e1ec54aab7741e2090f" exitCode=0 May 11 20:57:49.993410 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.993295 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf" event={"ID":"53246992-50b7-4b5a-96c4-a77728a0a4c9","Type":"ContainerDied","Data":"bb22f8fb82aff570febe5b2dc4468a244236baa067a22e1ec54aab7741e2090f"} May 11 20:57:49.993410 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.993324 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf" event={"ID":"53246992-50b7-4b5a-96c4-a77728a0a4c9","Type":"ContainerStarted","Data":"b00b90ccb41eb2e7a1eccfff4937ad9db96ba4b8bd853fe5de8f8823147c0230"} May 11 20:57:49.994733 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.994708 2567 generic.go:358] "Generic (PLEG): container finished" podID="6301480a-ae94-4125-9ba1-97b9d055b32c" containerID="d82774e581163e92af1034cc5d8d4c9bd8dd243e02a819f5da6160d24fdb6d78" exitCode=0 May 11 20:57:49.994830 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.994790 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9" event={"ID":"6301480a-ae94-4125-9ba1-97b9d055b32c","Type":"ContainerDied","Data":"d82774e581163e92af1034cc5d8d4c9bd8dd243e02a819f5da6160d24fdb6d78"} May 11 20:57:49.994830 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:49.994823 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9" event={"ID":"6301480a-ae94-4125-9ba1-97b9d055b32c","Type":"ContainerStarted","Data":"32c5eb678500f435b3c372d1999e91564380de75fcbcc734456e0b1c90c222c0"} May 11 20:57:50.091579 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:50.091538 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw" May 11 20:57:50.217944 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:50.217918 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw"] May 11 20:57:50.219455 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:57:50.219424 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0422a798_ac83_4b60_bfee_01964e487d82.slice/crio-956741bc71b4fba9332b251c506a5af05ef86ca37368f3e89a968613a82e8eca WatchSource:0}: Error finding container 956741bc71b4fba9332b251c506a5af05ef86ca37368f3e89a968613a82e8eca: Status 404 returned error can't find the container with id 956741bc71b4fba9332b251c506a5af05ef86ca37368f3e89a968613a82e8eca May 11 20:57:50.383650 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:50.383588 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd"] May 11 20:57:50.387051 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:50.387035 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd" May 11 20:57:50.395167 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:50.395145 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd"] May 11 20:57:50.418457 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:50.418434 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd\" (UID: \"0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd" May 11 20:57:50.418554 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:50.418468 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmxq9\" (UniqueName: \"kubernetes.io/projected/0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0-kube-api-access-tmxq9\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd\" (UID: \"0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd" May 11 20:57:50.418554 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:50.418489 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd\" (UID: \"0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd" May 11 20:57:50.519671 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:50.519642 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd\" (UID: \"0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd" May 11 20:57:50.519798 
May 11 20:57:50.519798 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:50.519716 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd\" (UID: \"0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd"
May 11 20:57:50.519970 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:50.519941 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd\" (UID: \"0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd"
May 11 20:57:50.520026 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:50.519988 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd\" (UID: \"0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd"
May 11 20:57:50.528015 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:50.527988 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmxq9\" (UniqueName: \"kubernetes.io/projected/0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0-kube-api-access-tmxq9\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd\" (UID: \"0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd"
May 11 20:57:50.696989 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:50.696952 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd"
May 11 20:57:50.846125 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:50.846102 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd"]
May 11 20:57:50.847176 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:57:50.847155 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b3c4ebb_0b4d_4b16_9a54_befadf67cfa0.slice/crio-1144e798e11bb3f4097e1eff7ca67dd8c344d868570f143b0307813ff9f516fb WatchSource:0}: Error finding container 1144e798e11bb3f4097e1eff7ca67dd8c344d868570f143b0307813ff9f516fb: Status 404 returned error can't find the container with id 1144e798e11bb3f4097e1eff7ca67dd8c344d868570f143b0307813ff9f516fb
May 11 20:57:51.000819 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:51.000761 2567 generic.go:358] "Generic (PLEG): container finished" podID="53246992-50b7-4b5a-96c4-a77728a0a4c9" containerID="1e27d13c289b8a713b59b162b35e2a5e03a302b11faf5cad8af2eb71127ad107" exitCode=0
May 11 20:57:51.000924 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:51.000851 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf" event={"ID":"53246992-50b7-4b5a-96c4-a77728a0a4c9","Type":"ContainerDied","Data":"1e27d13c289b8a713b59b162b35e2a5e03a302b11faf5cad8af2eb71127ad107"}
May 11 20:57:51.002720 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:51.002694 2567 generic.go:358] "Generic (PLEG): container finished" podID="6301480a-ae94-4125-9ba1-97b9d055b32c" containerID="368d740d9c8e549815de8e64748410301ec1898d60ad1dfa1f51813b9c6a2e3b" exitCode=0
May 11 20:57:51.002796 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:51.002772 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9" event={"ID":"6301480a-ae94-4125-9ba1-97b9d055b32c","Type":"ContainerDied","Data":"368d740d9c8e549815de8e64748410301ec1898d60ad1dfa1f51813b9c6a2e3b"}
May 11 20:57:51.004292 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:51.004272 2567 generic.go:358] "Generic (PLEG): container finished" podID="0422a798-ac83-4b60-bfee-01964e487d82" containerID="67ec7044815e11cc0e447dfb5140d5c37f11534933ca7ec68da12b6795537bb5" exitCode=0
May 11 20:57:51.004391 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:51.004361 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw" event={"ID":"0422a798-ac83-4b60-bfee-01964e487d82","Type":"ContainerDied","Data":"67ec7044815e11cc0e447dfb5140d5c37f11534933ca7ec68da12b6795537bb5"}
May 11 20:57:51.004460 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:51.004400 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw" event={"ID":"0422a798-ac83-4b60-bfee-01964e487d82","Type":"ContainerStarted","Data":"956741bc71b4fba9332b251c506a5af05ef86ca37368f3e89a968613a82e8eca"}
May 11 20:57:51.005774 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:51.005753 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd" event={"ID":"0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0","Type":"ContainerStarted","Data":"56c4c32f7a8b96890bf4b029c9ebc9d9613c2e5f621f53c0d02b347283d0f853"}
event={"ID":"0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0","Type":"ContainerStarted","Data":"56c4c32f7a8b96890bf4b029c9ebc9d9613c2e5f621f53c0d02b347283d0f853"} May 11 20:57:51.005870 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:51.005780 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd" event={"ID":"0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0","Type":"ContainerStarted","Data":"1144e798e11bb3f4097e1eff7ca67dd8c344d868570f143b0307813ff9f516fb"} May 11 20:57:52.011337 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:52.011271 2567 generic.go:358] "Generic (PLEG): container finished" podID="0422a798-ac83-4b60-bfee-01964e487d82" containerID="352fd4579aaf347885b3ffa4cfeadef18acbd4e21fa93f6afc4051e8d415c923" exitCode=0 May 11 20:57:52.011664 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:52.011351 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw" event={"ID":"0422a798-ac83-4b60-bfee-01964e487d82","Type":"ContainerDied","Data":"352fd4579aaf347885b3ffa4cfeadef18acbd4e21fa93f6afc4051e8d415c923"} May 11 20:57:52.012950 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:52.012931 2567 generic.go:358] "Generic (PLEG): container finished" podID="0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0" containerID="56c4c32f7a8b96890bf4b029c9ebc9d9613c2e5f621f53c0d02b347283d0f853" exitCode=0 May 11 20:57:52.013031 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:52.012950 2567 generic.go:358] "Generic (PLEG): container finished" podID="0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0" containerID="e52efe3301a5aff7c6485cc52c8e89b8c8466019b77ac145794dfd789a1bfa53" exitCode=0 May 11 20:57:52.013031 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:52.012994 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd" event={"ID":"0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0","Type":"ContainerDied","Data":"56c4c32f7a8b96890bf4b029c9ebc9d9613c2e5f621f53c0d02b347283d0f853"} May 11 20:57:52.013031 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:52.013019 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd" event={"ID":"0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0","Type":"ContainerDied","Data":"e52efe3301a5aff7c6485cc52c8e89b8c8466019b77ac145794dfd789a1bfa53"} May 11 20:57:52.014868 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:52.014845 2567 generic.go:358] "Generic (PLEG): container finished" podID="53246992-50b7-4b5a-96c4-a77728a0a4c9" containerID="8b597f4c2176cb4d13eb44f3fb5e475149a15cea4b56d88e5eed14826461dc85" exitCode=0 May 11 20:57:52.014952 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:52.014908 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf" event={"ID":"53246992-50b7-4b5a-96c4-a77728a0a4c9","Type":"ContainerDied","Data":"8b597f4c2176cb4d13eb44f3fb5e475149a15cea4b56d88e5eed14826461dc85"} May 11 20:57:52.016783 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:52.016759 2567 generic.go:358] "Generic (PLEG): container finished" podID="6301480a-ae94-4125-9ba1-97b9d055b32c" containerID="f99975faf45d3478e5ba9937a23f16c5fc781b8bac595781cefe16b4e4090f42" exitCode=0 May 11 20:57:52.016858 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:52.016828 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9" event={"ID":"6301480a-ae94-4125-9ba1-97b9d055b32c","Type":"ContainerDied","Data":"f99975faf45d3478e5ba9937a23f16c5fc781b8bac595781cefe16b4e4090f42"} May 11 20:57:53.022446 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:53.022410 2567 generic.go:358] "Generic (PLEG): container finished" podID="0422a798-ac83-4b60-bfee-01964e487d82" containerID="499b06a595af46574dee30a8ee91a642a8f3fc67eb4c70cbd1343210f81f8d1f" exitCode=0 May 11 20:57:53.022972 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:53.022497 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw" event={"ID":"0422a798-ac83-4b60-bfee-01964e487d82","Type":"ContainerDied","Data":"499b06a595af46574dee30a8ee91a642a8f3fc67eb4c70cbd1343210f81f8d1f"} May 11 20:57:53.024228 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:53.024203 2567 generic.go:358] "Generic (PLEG): container finished" podID="0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0" containerID="2061770f6a8d6ae9808a0f865200350d4745aab31dc328d0d7ba022abe8d1264" exitCode=0 May 11 20:57:53.024322 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:53.024271 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd" event={"ID":"0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0","Type":"ContainerDied","Data":"2061770f6a8d6ae9808a0f865200350d4745aab31dc328d0d7ba022abe8d1264"} May 11 20:57:53.151783 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:53.151761 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9" May 11 20:57:53.185894 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:53.185875 2567 util.go:48] "No ready sandbox for pod can be found. 
May 11 20:57:53.243207 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:53.243178 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6301480a-ae94-4125-9ba1-97b9d055b32c-bundle\") pod \"6301480a-ae94-4125-9ba1-97b9d055b32c\" (UID: \"6301480a-ae94-4125-9ba1-97b9d055b32c\") "
May 11 20:57:53.243330 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:53.243212 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53246992-50b7-4b5a-96c4-a77728a0a4c9-bundle\") pod \"53246992-50b7-4b5a-96c4-a77728a0a4c9\" (UID: \"53246992-50b7-4b5a-96c4-a77728a0a4c9\") "
May 11 20:57:53.243330 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:53.243236 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rd22\" (UniqueName: \"kubernetes.io/projected/6301480a-ae94-4125-9ba1-97b9d055b32c-kube-api-access-7rd22\") pod \"6301480a-ae94-4125-9ba1-97b9d055b32c\" (UID: \"6301480a-ae94-4125-9ba1-97b9d055b32c\") "
May 11 20:57:53.243330 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:53.243262 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmdh6\" (UniqueName: \"kubernetes.io/projected/53246992-50b7-4b5a-96c4-a77728a0a4c9-kube-api-access-pmdh6\") pod \"53246992-50b7-4b5a-96c4-a77728a0a4c9\" (UID: \"53246992-50b7-4b5a-96c4-a77728a0a4c9\") "
May 11 20:57:53.243330 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:53.243321 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6301480a-ae94-4125-9ba1-97b9d055b32c-util\") pod \"6301480a-ae94-4125-9ba1-97b9d055b32c\" (UID: \"6301480a-ae94-4125-9ba1-97b9d055b32c\") "
May 11 20:57:53.243500 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:53.243358 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53246992-50b7-4b5a-96c4-a77728a0a4c9-util\") pod \"53246992-50b7-4b5a-96c4-a77728a0a4c9\" (UID: \"53246992-50b7-4b5a-96c4-a77728a0a4c9\") "
May 11 20:57:53.243939 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:53.243858 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53246992-50b7-4b5a-96c4-a77728a0a4c9-bundle" (OuterVolumeSpecName: "bundle") pod "53246992-50b7-4b5a-96c4-a77728a0a4c9" (UID: "53246992-50b7-4b5a-96c4-a77728a0a4c9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
May 11 20:57:53.244052 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:53.243986 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6301480a-ae94-4125-9ba1-97b9d055b32c-bundle" (OuterVolumeSpecName: "bundle") pod "6301480a-ae94-4125-9ba1-97b9d055b32c" (UID: "6301480a-ae94-4125-9ba1-97b9d055b32c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
May 11 20:57:53.245486 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:53.245463 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53246992-50b7-4b5a-96c4-a77728a0a4c9-kube-api-access-pmdh6" (OuterVolumeSpecName: "kube-api-access-pmdh6") pod "53246992-50b7-4b5a-96c4-a77728a0a4c9" (UID: "53246992-50b7-4b5a-96c4-a77728a0a4c9"). InnerVolumeSpecName "kube-api-access-pmdh6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
May 11 20:57:53.245555 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:53.245484 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6301480a-ae94-4125-9ba1-97b9d055b32c-kube-api-access-7rd22" (OuterVolumeSpecName: "kube-api-access-7rd22") pod "6301480a-ae94-4125-9ba1-97b9d055b32c" (UID: "6301480a-ae94-4125-9ba1-97b9d055b32c"). InnerVolumeSpecName "kube-api-access-7rd22". PluginName "kubernetes.io/projected", VolumeGIDValue ""
May 11 20:57:53.249102 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:53.249065 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53246992-50b7-4b5a-96c4-a77728a0a4c9-util" (OuterVolumeSpecName: "util") pod "53246992-50b7-4b5a-96c4-a77728a0a4c9" (UID: "53246992-50b7-4b5a-96c4-a77728a0a4c9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
May 11 20:57:53.249185 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:53.249124 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6301480a-ae94-4125-9ba1-97b9d055b32c-util" (OuterVolumeSpecName: "util") pod "6301480a-ae94-4125-9ba1-97b9d055b32c" (UID: "6301480a-ae94-4125-9ba1-97b9d055b32c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
May 11 20:57:53.344147 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:53.344121 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53246992-50b7-4b5a-96c4-a77728a0a4c9-util\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\""
May 11 20:57:53.344256 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:53.344170 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6301480a-ae94-4125-9ba1-97b9d055b32c-bundle\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\""
May 11 20:57:53.344256 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:53.344186 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53246992-50b7-4b5a-96c4-a77728a0a4c9-bundle\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\""
May 11 20:57:53.344256 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:53.344200 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7rd22\" (UniqueName: \"kubernetes.io/projected/6301480a-ae94-4125-9ba1-97b9d055b32c-kube-api-access-7rd22\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\""
May 11 20:57:53.344256 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:53.344216 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pmdh6\" (UniqueName: \"kubernetes.io/projected/53246992-50b7-4b5a-96c4-a77728a0a4c9-kube-api-access-pmdh6\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\""
May 11 20:57:53.344256 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:53.344230 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6301480a-ae94-4125-9ba1-97b9d055b32c-util\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\""
May 11 20:57:54.029724 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.029646 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf" event={"ID":"53246992-50b7-4b5a-96c4-a77728a0a4c9","Type":"ContainerDied","Data":"b00b90ccb41eb2e7a1eccfff4937ad9db96ba4b8bd853fe5de8f8823147c0230"}
event={"ID":"53246992-50b7-4b5a-96c4-a77728a0a4c9","Type":"ContainerDied","Data":"b00b90ccb41eb2e7a1eccfff4937ad9db96ba4b8bd853fe5de8f8823147c0230"} May 11 20:57:54.029724 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.029679 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b00b90ccb41eb2e7a1eccfff4937ad9db96ba4b8bd853fe5de8f8823147c0230" May 11 20:57:54.029724 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.029677 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf" May 11 20:57:54.031397 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.031363 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9" event={"ID":"6301480a-ae94-4125-9ba1-97b9d055b32c","Type":"ContainerDied","Data":"32c5eb678500f435b3c372d1999e91564380de75fcbcc734456e0b1c90c222c0"} May 11 20:57:54.031397 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.031398 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32c5eb678500f435b3c372d1999e91564380de75fcbcc734456e0b1c90c222c0" May 11 20:57:54.031718 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.031699 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9" May 11 20:57:54.165652 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.165628 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd" May 11 20:57:54.191523 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.191501 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw" May 11 20:57:54.250763 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.250736 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmxq9\" (UniqueName: \"kubernetes.io/projected/0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0-kube-api-access-tmxq9\") pod \"0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0\" (UID: \"0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0\") " May 11 20:57:54.250891 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.250779 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkd2b\" (UniqueName: \"kubernetes.io/projected/0422a798-ac83-4b60-bfee-01964e487d82-kube-api-access-fkd2b\") pod \"0422a798-ac83-4b60-bfee-01964e487d82\" (UID: \"0422a798-ac83-4b60-bfee-01964e487d82\") " May 11 20:57:54.250891 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.250822 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0422a798-ac83-4b60-bfee-01964e487d82-bundle\") pod \"0422a798-ac83-4b60-bfee-01964e487d82\" (UID: \"0422a798-ac83-4b60-bfee-01964e487d82\") " May 11 20:57:54.250891 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.250843 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0-util\") pod \"0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0\" (UID: \"0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0\") " May 11 20:57:54.251085 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.250896 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0422a798-ac83-4b60-bfee-01964e487d82-util\") pod \"0422a798-ac83-4b60-bfee-01964e487d82\" (UID: \"0422a798-ac83-4b60-bfee-01964e487d82\") " May 11 20:57:54.251085 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.250944 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0-bundle\") pod \"0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0\" (UID: \"0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0\") " May 11 20:57:54.251459 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.251397 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0422a798-ac83-4b60-bfee-01964e487d82-bundle" (OuterVolumeSpecName: "bundle") pod "0422a798-ac83-4b60-bfee-01964e487d82" (UID: "0422a798-ac83-4b60-bfee-01964e487d82"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" May 11 20:57:54.251671 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.251633 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0-bundle" (OuterVolumeSpecName: "bundle") pod "0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0" (UID: "0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" May 11 20:57:54.252907 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.252879 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0-kube-api-access-tmxq9" (OuterVolumeSpecName: "kube-api-access-tmxq9") pod "0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0" (UID: "0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0"). 
InnerVolumeSpecName "kube-api-access-tmxq9". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:57:54.253031 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.252913 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0422a798-ac83-4b60-bfee-01964e487d82-kube-api-access-fkd2b" (OuterVolumeSpecName: "kube-api-access-fkd2b") pod "0422a798-ac83-4b60-bfee-01964e487d82" (UID: "0422a798-ac83-4b60-bfee-01964e487d82"). InnerVolumeSpecName "kube-api-access-fkd2b". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:57:54.255987 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.255946 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0422a798-ac83-4b60-bfee-01964e487d82-util" (OuterVolumeSpecName: "util") pod "0422a798-ac83-4b60-bfee-01964e487d82" (UID: "0422a798-ac83-4b60-bfee-01964e487d82"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" May 11 20:57:54.256357 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.256339 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0-util" (OuterVolumeSpecName: "util") pod "0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0" (UID: "0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" May 11 20:57:54.352484 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.352462 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0422a798-ac83-4b60-bfee-01964e487d82-bundle\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:57:54.352484 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.352483 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0-util\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:57:54.352613 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.352492 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0422a798-ac83-4b60-bfee-01964e487d82-util\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:57:54.352613 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.352500 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0-bundle\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:57:54.352613 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.352509 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tmxq9\" (UniqueName: \"kubernetes.io/projected/0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0-kube-api-access-tmxq9\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:57:54.352613 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:54.352518 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fkd2b\" (UniqueName: \"kubernetes.io/projected/0422a798-ac83-4b60-bfee-01964e487d82-kube-api-access-fkd2b\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:57:55.036266 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:55.036173 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw" 
event={"ID":"0422a798-ac83-4b60-bfee-01964e487d82","Type":"ContainerDied","Data":"956741bc71b4fba9332b251c506a5af05ef86ca37368f3e89a968613a82e8eca"} May 11 20:57:55.036266 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:55.036212 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="956741bc71b4fba9332b251c506a5af05ef86ca37368f3e89a968613a82e8eca" May 11 20:57:55.036266 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:55.036235 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw" May 11 20:57:55.037905 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:55.037878 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd" event={"ID":"0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0","Type":"ContainerDied","Data":"1144e798e11bb3f4097e1eff7ca67dd8c344d868570f143b0307813ff9f516fb"} May 11 20:57:55.038049 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:55.037908 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1144e798e11bb3f4097e1eff7ca67dd8c344d868570f143b0307813ff9f516fb" May 11 20:57:55.038049 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:57:55.037936 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd" May 11 20:58:03.154951 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.154918 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-fr92f"] May 11 20:58:03.155322 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155259 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6301480a-ae94-4125-9ba1-97b9d055b32c" containerName="pull" May 11 20:58:03.155322 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155272 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6301480a-ae94-4125-9ba1-97b9d055b32c" containerName="pull" May 11 20:58:03.155322 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155281 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53246992-50b7-4b5a-96c4-a77728a0a4c9" containerName="util" May 11 20:58:03.155322 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155286 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="53246992-50b7-4b5a-96c4-a77728a0a4c9" containerName="util" May 11 20:58:03.155322 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155293 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6301480a-ae94-4125-9ba1-97b9d055b32c" containerName="extract" May 11 20:58:03.155322 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155300 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6301480a-ae94-4125-9ba1-97b9d055b32c" containerName="extract" May 11 20:58:03.155322 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155309 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0" containerName="pull" May 11 20:58:03.155322 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155316 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0" containerName="pull" May 11 20:58:03.155322 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155322 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="0422a798-ac83-4b60-bfee-01964e487d82" containerName="extract" May 11 20:58:03.155322 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155327 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="0422a798-ac83-4b60-bfee-01964e487d82" containerName="extract" May 11 20:58:03.155631 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155335 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6301480a-ae94-4125-9ba1-97b9d055b32c" containerName="util" May 11 20:58:03.155631 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155340 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6301480a-ae94-4125-9ba1-97b9d055b32c" containerName="util" May 11 20:58:03.155631 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155350 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53246992-50b7-4b5a-96c4-a77728a0a4c9" containerName="pull" May 11 20:58:03.155631 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155355 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="53246992-50b7-4b5a-96c4-a77728a0a4c9" containerName="pull" May 11 20:58:03.155631 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155361 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0" containerName="util" May 11 20:58:03.155631 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155365 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0" containerName="util" May 11 20:58:03.155631 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155374 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53246992-50b7-4b5a-96c4-a77728a0a4c9" containerName="extract" May 11 20:58:03.155631 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155379 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="53246992-50b7-4b5a-96c4-a77728a0a4c9" containerName="extract" May 11 20:58:03.155631 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155390 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0" containerName="extract" May 11 20:58:03.155631 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155398 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0" containerName="extract" May 11 20:58:03.155631 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155408 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0422a798-ac83-4b60-bfee-01964e487d82" containerName="util" May 11 20:58:03.155631 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155413 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="0422a798-ac83-4b60-bfee-01964e487d82" containerName="util" May 11 20:58:03.155631 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155420 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0422a798-ac83-4b60-bfee-01964e487d82" containerName="pull" May 11 20:58:03.155631 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155425 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="0422a798-ac83-4b60-bfee-01964e487d82" containerName="pull" May 11 20:58:03.155631 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155475 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="6301480a-ae94-4125-9ba1-97b9d055b32c" containerName="extract" May 11 20:58:03.155631 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155486 
2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="0422a798-ac83-4b60-bfee-01964e487d82" containerName="extract" May 11 20:58:03.155631 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155492 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0" containerName="extract" May 11 20:58:03.155631 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.155499 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="53246992-50b7-4b5a-96c4-a77728a0a4c9" containerName="extract" May 11 20:58:03.157628 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.157613 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-fr92f" May 11 20:58:03.160849 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.160830 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-9bb5l\"" May 11 20:58:03.174008 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.173984 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-fr92f"] May 11 20:58:03.217844 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.217814 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v92b\" (UniqueName: \"kubernetes.io/projected/6310b47c-4fb1-4384-811f-43c2fee5c54e-kube-api-access-7v92b\") pod \"authorino-operator-657f44b778-fr92f\" (UID: \"6310b47c-4fb1-4384-811f-43c2fee5c54e\") " pod="kuadrant-system/authorino-operator-657f44b778-fr92f" May 11 20:58:03.318956 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.318921 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7v92b\" (UniqueName: \"kubernetes.io/projected/6310b47c-4fb1-4384-811f-43c2fee5c54e-kube-api-access-7v92b\") pod \"authorino-operator-657f44b778-fr92f\" (UID: \"6310b47c-4fb1-4384-811f-43c2fee5c54e\") " pod="kuadrant-system/authorino-operator-657f44b778-fr92f" May 11 20:58:03.333544 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.333510 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v92b\" (UniqueName: \"kubernetes.io/projected/6310b47c-4fb1-4384-811f-43c2fee5c54e-kube-api-access-7v92b\") pod \"authorino-operator-657f44b778-fr92f\" (UID: \"6310b47c-4fb1-4384-811f-43c2fee5c54e\") " pod="kuadrant-system/authorino-operator-657f44b778-fr92f" May 11 20:58:03.467537 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.467445 2567 util.go:30] "No sandbox for pod can be found. 
May 11 20:58:03.611132 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:03.611105 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-fr92f"]
May 11 20:58:03.612530 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:58:03.612503 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6310b47c_4fb1_4384_811f_43c2fee5c54e.slice/crio-1cb3757059765c7513ddbee8bcc10a0acb2d9e58becd06e98a6ed87dd8fb2f6a WatchSource:0}: Error finding container 1cb3757059765c7513ddbee8bcc10a0acb2d9e58becd06e98a6ed87dd8fb2f6a: Status 404 returned error can't find the container with id 1cb3757059765c7513ddbee8bcc10a0acb2d9e58becd06e98a6ed87dd8fb2f6a
May 11 20:58:04.074392 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:04.074357 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-fr92f" event={"ID":"6310b47c-4fb1-4384-811f-43c2fee5c54e","Type":"ContainerStarted","Data":"1cb3757059765c7513ddbee8bcc10a0acb2d9e58becd06e98a6ed87dd8fb2f6a"}
May 11 20:58:06.082446 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:06.082414 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-fr92f" event={"ID":"6310b47c-4fb1-4384-811f-43c2fee5c54e","Type":"ContainerStarted","Data":"7561487b1dab423de2fd1ae75debe71faac193fd6d5acadb70b2cf560c3015f9"}
May 11 20:58:06.082833 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:06.082575 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-fr92f"
May 11 20:58:06.110566 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:06.110512 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-fr92f" podStartSLOduration=1.070711577 podStartE2EDuration="3.110499367s" podCreationTimestamp="2026-05-11 20:58:03 +0000 UTC" firstStartedPulling="2026-05-11 20:58:03.614464754 +0000 UTC m=+461.724245177" lastFinishedPulling="2026-05-11 20:58:05.654252538 +0000 UTC m=+463.764032967" observedRunningTime="2026-05-11 20:58:06.107445704 +0000 UTC m=+464.217226149" watchObservedRunningTime="2026-05-11 20:58:06.110499367 +0000 UTC m=+464.220279813"
May 11 20:58:11.551146 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:11.551108 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2"]
May 11 20:58:11.555012 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:11.554996 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2"
May 11 20:58:11.557674 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:11.557656 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-mb652\""
May 11 20:58:11.567546 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:11.567521 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2"]
May 11 20:58:11.684955 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:11.684928 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/02358b50-78e5-4c0b-ab77-5082f7aed3b2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-zdwq2\" (UID: \"02358b50-78e5-4c0b-ab77-5082f7aed3b2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2"
May 11 20:58:11.685117 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:11.685036 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss52z\" (UniqueName: \"kubernetes.io/projected/02358b50-78e5-4c0b-ab77-5082f7aed3b2-kube-api-access-ss52z\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-zdwq2\" (UID: \"02358b50-78e5-4c0b-ab77-5082f7aed3b2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2"
May 11 20:58:11.785972 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:11.785933 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/02358b50-78e5-4c0b-ab77-5082f7aed3b2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-zdwq2\" (UID: \"02358b50-78e5-4c0b-ab77-5082f7aed3b2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2"
May 11 20:58:11.786084 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:11.786026 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ss52z\" (UniqueName: \"kubernetes.io/projected/02358b50-78e5-4c0b-ab77-5082f7aed3b2-kube-api-access-ss52z\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-zdwq2\" (UID: \"02358b50-78e5-4c0b-ab77-5082f7aed3b2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2"
May 11 20:58:11.786274 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:11.786253 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/02358b50-78e5-4c0b-ab77-5082f7aed3b2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-zdwq2\" (UID: \"02358b50-78e5-4c0b-ab77-5082f7aed3b2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2"
May 11 20:58:11.800409 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:11.800389 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss52z\" (UniqueName: \"kubernetes.io/projected/02358b50-78e5-4c0b-ab77-5082f7aed3b2-kube-api-access-ss52z\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-zdwq2\" (UID: \"02358b50-78e5-4c0b-ab77-5082f7aed3b2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2"
May 11 20:58:11.864654 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:11.864601 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2"
May 11 20:58:12.002784 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:11.997378 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2"]
May 11 20:58:12.105668 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:12.105638 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2" event={"ID":"02358b50-78e5-4c0b-ab77-5082f7aed3b2","Type":"ContainerStarted","Data":"58b6f4337eb8ea52833b7fbe06f67d243e69f2f45ffc68b7a8eacb6a86770f51"}
May 11 20:58:15.042859 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.042829 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d5898bbc6-cltd4"]
May 11 20:58:15.050857 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.050835 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d5898bbc6-cltd4"
May 11 20:58:15.055865 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.055842 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d5898bbc6-cltd4"]
May 11 20:58:15.113240 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.113204 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c23fe17-11fe-4f20-af40-7bde5e75d825-console-serving-cert\") pod \"console-5d5898bbc6-cltd4\" (UID: \"4c23fe17-11fe-4f20-af40-7bde5e75d825\") " pod="openshift-console/console-5d5898bbc6-cltd4"
May 11 20:58:15.113404 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.113261 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c23fe17-11fe-4f20-af40-7bde5e75d825-oauth-serving-cert\") pod \"console-5d5898bbc6-cltd4\" (UID: \"4c23fe17-11fe-4f20-af40-7bde5e75d825\") " pod="openshift-console/console-5d5898bbc6-cltd4"
May 11 20:58:15.113404 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.113291 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d72v9\" (UniqueName: \"kubernetes.io/projected/4c23fe17-11fe-4f20-af40-7bde5e75d825-kube-api-access-d72v9\") pod \"console-5d5898bbc6-cltd4\" (UID: \"4c23fe17-11fe-4f20-af40-7bde5e75d825\") " pod="openshift-console/console-5d5898bbc6-cltd4"
May 11 20:58:15.113404 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.113327 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c23fe17-11fe-4f20-af40-7bde5e75d825-trusted-ca-bundle\") pod \"console-5d5898bbc6-cltd4\" (UID: \"4c23fe17-11fe-4f20-af40-7bde5e75d825\") " pod="openshift-console/console-5d5898bbc6-cltd4"
May 11 20:58:15.113562 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.113427 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c23fe17-11fe-4f20-af40-7bde5e75d825-console-oauth-config\") pod \"console-5d5898bbc6-cltd4\" (UID: \"4c23fe17-11fe-4f20-af40-7bde5e75d825\") " pod="openshift-console/console-5d5898bbc6-cltd4"
May 11 20:58:15.113562 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.113459 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c23fe17-11fe-4f20-af40-7bde5e75d825-console-config\") pod \"console-5d5898bbc6-cltd4\" (UID: \"4c23fe17-11fe-4f20-af40-7bde5e75d825\") " pod="openshift-console/console-5d5898bbc6-cltd4"
May 11 20:58:15.113562 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.113525 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c23fe17-11fe-4f20-af40-7bde5e75d825-service-ca\") pod \"console-5d5898bbc6-cltd4\" (UID: \"4c23fe17-11fe-4f20-af40-7bde5e75d825\") " pod="openshift-console/console-5d5898bbc6-cltd4"
May 11 20:58:15.214068 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.214031 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c23fe17-11fe-4f20-af40-7bde5e75d825-console-serving-cert\") pod \"console-5d5898bbc6-cltd4\" (UID: \"4c23fe17-11fe-4f20-af40-7bde5e75d825\") " pod="openshift-console/console-5d5898bbc6-cltd4"
May 11 20:58:15.214203 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.214079 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c23fe17-11fe-4f20-af40-7bde5e75d825-oauth-serving-cert\") pod \"console-5d5898bbc6-cltd4\" (UID: \"4c23fe17-11fe-4f20-af40-7bde5e75d825\") " pod="openshift-console/console-5d5898bbc6-cltd4"
May 11 20:58:15.214203 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.214113 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d72v9\" (UniqueName: \"kubernetes.io/projected/4c23fe17-11fe-4f20-af40-7bde5e75d825-kube-api-access-d72v9\") pod \"console-5d5898bbc6-cltd4\" (UID: \"4c23fe17-11fe-4f20-af40-7bde5e75d825\") " pod="openshift-console/console-5d5898bbc6-cltd4"
May 11 20:58:15.214203 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.214137 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c23fe17-11fe-4f20-af40-7bde5e75d825-trusted-ca-bundle\") pod \"console-5d5898bbc6-cltd4\" (UID: \"4c23fe17-11fe-4f20-af40-7bde5e75d825\") " pod="openshift-console/console-5d5898bbc6-cltd4"
May 11 20:58:15.214203 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.214174 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c23fe17-11fe-4f20-af40-7bde5e75d825-console-oauth-config\") pod \"console-5d5898bbc6-cltd4\" (UID: \"4c23fe17-11fe-4f20-af40-7bde5e75d825\") " pod="openshift-console/console-5d5898bbc6-cltd4"
May 11 20:58:15.214370 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.214237 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c23fe17-11fe-4f20-af40-7bde5e75d825-console-config\") pod \"console-5d5898bbc6-cltd4\" (UID: \"4c23fe17-11fe-4f20-af40-7bde5e75d825\") " pod="openshift-console/console-5d5898bbc6-cltd4"
May 11 20:58:15.214370 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.214284 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c23fe17-11fe-4f20-af40-7bde5e75d825-service-ca\") pod \"console-5d5898bbc6-cltd4\" (UID: \"4c23fe17-11fe-4f20-af40-7bde5e75d825\") " pod="openshift-console/console-5d5898bbc6-cltd4"
\"4c23fe17-11fe-4f20-af40-7bde5e75d825\") " pod="openshift-console/console-5d5898bbc6-cltd4" May 11 20:58:15.215048 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.215018 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c23fe17-11fe-4f20-af40-7bde5e75d825-console-config\") pod \"console-5d5898bbc6-cltd4\" (UID: \"4c23fe17-11fe-4f20-af40-7bde5e75d825\") " pod="openshift-console/console-5d5898bbc6-cltd4" May 11 20:58:15.215183 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.215063 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c23fe17-11fe-4f20-af40-7bde5e75d825-service-ca\") pod \"console-5d5898bbc6-cltd4\" (UID: \"4c23fe17-11fe-4f20-af40-7bde5e75d825\") " pod="openshift-console/console-5d5898bbc6-cltd4" May 11 20:58:15.215183 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.215084 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c23fe17-11fe-4f20-af40-7bde5e75d825-trusted-ca-bundle\") pod \"console-5d5898bbc6-cltd4\" (UID: \"4c23fe17-11fe-4f20-af40-7bde5e75d825\") " pod="openshift-console/console-5d5898bbc6-cltd4" May 11 20:58:15.215183 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.215101 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c23fe17-11fe-4f20-af40-7bde5e75d825-oauth-serving-cert\") pod \"console-5d5898bbc6-cltd4\" (UID: \"4c23fe17-11fe-4f20-af40-7bde5e75d825\") " pod="openshift-console/console-5d5898bbc6-cltd4" May 11 20:58:15.216576 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.216550 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c23fe17-11fe-4f20-af40-7bde5e75d825-console-oauth-config\") pod \"console-5d5898bbc6-cltd4\" (UID: \"4c23fe17-11fe-4f20-af40-7bde5e75d825\") " pod="openshift-console/console-5d5898bbc6-cltd4" May 11 20:58:15.216689 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.216676 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c23fe17-11fe-4f20-af40-7bde5e75d825-console-serving-cert\") pod \"console-5d5898bbc6-cltd4\" (UID: \"4c23fe17-11fe-4f20-af40-7bde5e75d825\") " pod="openshift-console/console-5d5898bbc6-cltd4" May 11 20:58:15.225932 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.225912 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d72v9\" (UniqueName: \"kubernetes.io/projected/4c23fe17-11fe-4f20-af40-7bde5e75d825-kube-api-access-d72v9\") pod \"console-5d5898bbc6-cltd4\" (UID: \"4c23fe17-11fe-4f20-af40-7bde5e75d825\") " pod="openshift-console/console-5d5898bbc6-cltd4" May 11 20:58:15.363385 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.363327 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d5898bbc6-cltd4" May 11 20:58:15.494022 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:15.493993 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d5898bbc6-cltd4"] May 11 20:58:15.495757 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:58:15.495726 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c23fe17_11fe_4f20_af40_7bde5e75d825.slice/crio-456e28cb9f8e292bf43930cd56a50fe294b86c958b6f9905d7806f42916d5607 WatchSource:0}: Error finding container 456e28cb9f8e292bf43930cd56a50fe294b86c958b6f9905d7806f42916d5607: Status 404 returned error can't find the container with id 456e28cb9f8e292bf43930cd56a50fe294b86c958b6f9905d7806f42916d5607 May 11 20:58:16.123698 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:16.123662 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d5898bbc6-cltd4" event={"ID":"4c23fe17-11fe-4f20-af40-7bde5e75d825","Type":"ContainerStarted","Data":"efce4ab0d4b6793db70e3f6b5b7abc5a8983aacda98889a03f4b0e5162499675"} May 11 20:58:16.123698 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:16.123703 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d5898bbc6-cltd4" event={"ID":"4c23fe17-11fe-4f20-af40-7bde5e75d825","Type":"ContainerStarted","Data":"456e28cb9f8e292bf43930cd56a50fe294b86c958b6f9905d7806f42916d5607"} May 11 20:58:16.142889 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:16.142848 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d5898bbc6-cltd4" podStartSLOduration=1.142832938 podStartE2EDuration="1.142832938s" podCreationTimestamp="2026-05-11 20:58:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 20:58:16.139954941 +0000 UTC m=+474.249735412" watchObservedRunningTime="2026-05-11 20:58:16.142832938 +0000 UTC m=+474.252613383" May 11 20:58:17.088616 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:17.088587 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-fr92f" May 11 20:58:17.130541 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:17.130490 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2" event={"ID":"02358b50-78e5-4c0b-ab77-5082f7aed3b2","Type":"ContainerStarted","Data":"13576a6ce9f59bce2f893afac58752954acb68b177588460c33fe3417b2504ff"} May 11 20:58:17.130986 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:17.130758 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2" May 11 20:58:17.151037 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:17.150994 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2" podStartSLOduration=1.397718978 podStartE2EDuration="6.15098148s" podCreationTimestamp="2026-05-11 20:58:11 +0000 UTC" firstStartedPulling="2026-05-11 20:58:12.004754616 +0000 UTC m=+470.114535054" lastFinishedPulling="2026-05-11 20:58:16.758017134 +0000 UTC m=+474.867797556" observedRunningTime="2026-05-11 20:58:17.149196994 +0000 UTC m=+475.258977444" watchObservedRunningTime="2026-05-11 20:58:17.15098148 +0000 UTC 
m=+475.260761924" May 11 20:58:25.363772 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:25.363742 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5d5898bbc6-cltd4" May 11 20:58:25.364327 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:25.364059 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d5898bbc6-cltd4" May 11 20:58:25.368503 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:25.368484 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d5898bbc6-cltd4" May 11 20:58:26.165800 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:26.165775 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d5898bbc6-cltd4" May 11 20:58:26.218526 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:26.218494 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b596b74fc-kt9hj"] May 11 20:58:28.136271 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:28.136244 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2" May 11 20:58:29.821239 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:29.821208 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2"] May 11 20:58:29.821597 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:29.821395 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2" podUID="02358b50-78e5-4c0b-ab77-5082f7aed3b2" containerName="manager" containerID="cri-o://13576a6ce9f59bce2f893afac58752954acb68b177588460c33fe3417b2504ff" gracePeriod=2 May 11 20:58:29.824717 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:29.824686 2567 status_manager.go:895] "Failed to get status for pod" podUID="02358b50-78e5-4c0b-ab77-5082f7aed3b2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-zdwq2\" is forbidden: User \"system:node:ip-10-0-128-58.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-128-58.ec2.internal' and this object" May 11 20:58:29.826180 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:29.826156 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2"] May 11 20:58:30.054341 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.054319 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2" May 11 20:58:30.056691 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.056666 2567 status_manager.go:895] "Failed to get status for pod" podUID="02358b50-78e5-4c0b-ab77-5082f7aed3b2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-zdwq2\" is forbidden: User \"system:node:ip-10-0-128-58.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-128-58.ec2.internal' and this object" May 11 20:58:30.134143 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.134072 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss52z\" (UniqueName: \"kubernetes.io/projected/02358b50-78e5-4c0b-ab77-5082f7aed3b2-kube-api-access-ss52z\") pod \"02358b50-78e5-4c0b-ab77-5082f7aed3b2\" (UID: \"02358b50-78e5-4c0b-ab77-5082f7aed3b2\") " May 11 20:58:30.134268 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.134193 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/02358b50-78e5-4c0b-ab77-5082f7aed3b2-extensions-socket-volume\") pod \"02358b50-78e5-4c0b-ab77-5082f7aed3b2\" (UID: \"02358b50-78e5-4c0b-ab77-5082f7aed3b2\") " May 11 20:58:30.134547 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.134513 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02358b50-78e5-4c0b-ab77-5082f7aed3b2-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "02358b50-78e5-4c0b-ab77-5082f7aed3b2" (UID: "02358b50-78e5-4c0b-ab77-5082f7aed3b2"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" May 11 20:58:30.136073 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.136052 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02358b50-78e5-4c0b-ab77-5082f7aed3b2-kube-api-access-ss52z" (OuterVolumeSpecName: "kube-api-access-ss52z") pod "02358b50-78e5-4c0b-ab77-5082f7aed3b2" (UID: "02358b50-78e5-4c0b-ab77-5082f7aed3b2"). InnerVolumeSpecName "kube-api-access-ss52z". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:58:30.139308 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.139287 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nrwft"] May 11 20:58:30.139630 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.139618 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02358b50-78e5-4c0b-ab77-5082f7aed3b2" containerName="manager" May 11 20:58:30.139674 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.139633 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="02358b50-78e5-4c0b-ab77-5082f7aed3b2" containerName="manager" May 11 20:58:30.139738 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.139729 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="02358b50-78e5-4c0b-ab77-5082f7aed3b2" containerName="manager" May 11 20:58:30.142991 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.142975 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nrwft" May 11 20:58:30.145369 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.145347 2567 status_manager.go:895] "Failed to get status for pod" podUID="02358b50-78e5-4c0b-ab77-5082f7aed3b2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-zdwq2\" is forbidden: User \"system:node:ip-10-0-128-58.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-128-58.ec2.internal' and this object" May 11 20:58:30.157029 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.157009 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nrwft"] May 11 20:58:30.178952 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.178925 2567 generic.go:358] "Generic (PLEG): container finished" podID="02358b50-78e5-4c0b-ab77-5082f7aed3b2" containerID="13576a6ce9f59bce2f893afac58752954acb68b177588460c33fe3417b2504ff" exitCode=0 May 11 20:58:30.179080 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.178985 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2" May 11 20:58:30.179080 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.178999 2567 scope.go:117] "RemoveContainer" containerID="13576a6ce9f59bce2f893afac58752954acb68b177588460c33fe3417b2504ff" May 11 20:58:30.181473 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.181447 2567 status_manager.go:895] "Failed to get status for pod" podUID="02358b50-78e5-4c0b-ab77-5082f7aed3b2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-zdwq2\" is forbidden: User \"system:node:ip-10-0-128-58.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-128-58.ec2.internal' and this object" May 11 20:58:30.187759 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.187741 2567 scope.go:117] "RemoveContainer" containerID="13576a6ce9f59bce2f893afac58752954acb68b177588460c33fe3417b2504ff" May 11 20:58:30.188021 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:58:30.188004 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13576a6ce9f59bce2f893afac58752954acb68b177588460c33fe3417b2504ff\": container with ID starting with 13576a6ce9f59bce2f893afac58752954acb68b177588460c33fe3417b2504ff not found: ID does not exist" containerID="13576a6ce9f59bce2f893afac58752954acb68b177588460c33fe3417b2504ff" May 11 20:58:30.188072 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.188031 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13576a6ce9f59bce2f893afac58752954acb68b177588460c33fe3417b2504ff"} err="failed to get container status \"13576a6ce9f59bce2f893afac58752954acb68b177588460c33fe3417b2504ff\": rpc error: code = NotFound desc = could not find container \"13576a6ce9f59bce2f893afac58752954acb68b177588460c33fe3417b2504ff\": container with ID starting with 13576a6ce9f59bce2f893afac58752954acb68b177588460c33fe3417b2504ff not found: ID does not exist" May 11 20:58:30.189529 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.189508 2567 status_manager.go:895] "Failed to 
get status for pod" podUID="02358b50-78e5-4c0b-ab77-5082f7aed3b2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-zdwq2" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-zdwq2\" is forbidden: User \"system:node:ip-10-0-128-58.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-128-58.ec2.internal' and this object" May 11 20:58:30.234909 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.234888 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rtmk\" (UniqueName: \"kubernetes.io/projected/32717441-44d7-49e9-a189-9d38dbf3bd7a-kube-api-access-6rtmk\") pod \"kuadrant-operator-controller-manager-55c7f4c975-nrwft\" (UID: \"32717441-44d7-49e9-a189-9d38dbf3bd7a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nrwft" May 11 20:58:30.235022 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.234928 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/32717441-44d7-49e9-a189-9d38dbf3bd7a-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-nrwft\" (UID: \"32717441-44d7-49e9-a189-9d38dbf3bd7a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nrwft" May 11 20:58:30.235022 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.235018 2567 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/02358b50-78e5-4c0b-ab77-5082f7aed3b2-extensions-socket-volume\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:58:30.235108 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.235029 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ss52z\" (UniqueName: \"kubernetes.io/projected/02358b50-78e5-4c0b-ab77-5082f7aed3b2-kube-api-access-ss52z\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:58:30.336299 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.336271 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/32717441-44d7-49e9-a189-9d38dbf3bd7a-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-nrwft\" (UID: \"32717441-44d7-49e9-a189-9d38dbf3bd7a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nrwft" May 11 20:58:30.336379 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.336326 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rtmk\" (UniqueName: \"kubernetes.io/projected/32717441-44d7-49e9-a189-9d38dbf3bd7a-kube-api-access-6rtmk\") pod \"kuadrant-operator-controller-manager-55c7f4c975-nrwft\" (UID: \"32717441-44d7-49e9-a189-9d38dbf3bd7a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nrwft" May 11 20:58:30.336605 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.336587 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/32717441-44d7-49e9-a189-9d38dbf3bd7a-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-nrwft\" (UID: \"32717441-44d7-49e9-a189-9d38dbf3bd7a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nrwft" May 11 20:58:30.348313 
ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.348287 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rtmk\" (UniqueName: \"kubernetes.io/projected/32717441-44d7-49e9-a189-9d38dbf3bd7a-kube-api-access-6rtmk\") pod \"kuadrant-operator-controller-manager-55c7f4c975-nrwft\" (UID: \"32717441-44d7-49e9-a189-9d38dbf3bd7a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nrwft" May 11 20:58:30.452234 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.452180 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nrwft" May 11 20:58:30.486101 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.486045 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02358b50-78e5-4c0b-ab77-5082f7aed3b2" path="/var/lib/kubelet/pods/02358b50-78e5-4c0b-ab77-5082f7aed3b2/volumes" May 11 20:58:30.583695 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:30.583668 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nrwft"] May 11 20:58:30.585145 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:58:30.585117 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32717441_44d7_49e9_a189_9d38dbf3bd7a.slice/crio-ab1b4244b43b2faa502ead8edafebaf3e07e1a1bae99dccc91738c468c1ff6b8 WatchSource:0}: Error finding container ab1b4244b43b2faa502ead8edafebaf3e07e1a1bae99dccc91738c468c1ff6b8: Status 404 returned error can't find the container with id ab1b4244b43b2faa502ead8edafebaf3e07e1a1bae99dccc91738c468c1ff6b8 May 11 20:58:31.184371 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:31.184332 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nrwft" event={"ID":"32717441-44d7-49e9-a189-9d38dbf3bd7a","Type":"ContainerStarted","Data":"a4942ff8a5d0901bd129486cfb465f087d5767bc0ac2c2df2b024c6a1e73ba83"} May 11 20:58:31.184371 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:31.184371 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nrwft" event={"ID":"32717441-44d7-49e9-a189-9d38dbf3bd7a","Type":"ContainerStarted","Data":"ab1b4244b43b2faa502ead8edafebaf3e07e1a1bae99dccc91738c468c1ff6b8"} May 11 20:58:31.184876 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:31.184401 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nrwft" May 11 20:58:31.210175 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:31.210137 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nrwft" podStartSLOduration=1.210124599 podStartE2EDuration="1.210124599s" podCreationTimestamp="2026-05-11 20:58:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 20:58:31.207858305 +0000 UTC m=+489.317638774" watchObservedRunningTime="2026-05-11 20:58:31.210124599 +0000 UTC m=+489.319905044" May 11 20:58:42.191328 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:42.191293 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nrwft" May 11 20:58:51.240104 
ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:51.240042 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7b596b74fc-kt9hj" podUID="856bf9b1-3cba-494a-a778-81c13fdab888" containerName="console" containerID="cri-o://6a56e11b8b9e4ed4e1cf892b40032ac6eabc61416e6374c09ca82223d79d0a9e" gracePeriod=15 May 11 20:58:51.482494 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:51.482472 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b596b74fc-kt9hj_856bf9b1-3cba-494a-a778-81c13fdab888/console/0.log" May 11 20:58:51.482622 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:51.482550 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:58:51.598513 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:51.598489 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5nf5\" (UniqueName: \"kubernetes.io/projected/856bf9b1-3cba-494a-a778-81c13fdab888-kube-api-access-p5nf5\") pod \"856bf9b1-3cba-494a-a778-81c13fdab888\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " May 11 20:58:51.598672 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:51.598567 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/856bf9b1-3cba-494a-a778-81c13fdab888-trusted-ca-bundle\") pod \"856bf9b1-3cba-494a-a778-81c13fdab888\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " May 11 20:58:51.598672 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:51.598590 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/856bf9b1-3cba-494a-a778-81c13fdab888-console-oauth-config\") pod \"856bf9b1-3cba-494a-a778-81c13fdab888\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " May 11 20:58:51.598672 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:51.598611 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/856bf9b1-3cba-494a-a778-81c13fdab888-service-ca\") pod \"856bf9b1-3cba-494a-a778-81c13fdab888\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " May 11 20:58:51.598672 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:51.598638 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/856bf9b1-3cba-494a-a778-81c13fdab888-oauth-serving-cert\") pod \"856bf9b1-3cba-494a-a778-81c13fdab888\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " May 11 20:58:51.598672 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:51.598660 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/856bf9b1-3cba-494a-a778-81c13fdab888-console-config\") pod \"856bf9b1-3cba-494a-a778-81c13fdab888\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " May 11 20:58:51.598929 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:51.598700 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/856bf9b1-3cba-494a-a778-81c13fdab888-console-serving-cert\") pod \"856bf9b1-3cba-494a-a778-81c13fdab888\" (UID: \"856bf9b1-3cba-494a-a778-81c13fdab888\") " May 11 20:58:51.599098 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:51.599076 2567 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/856bf9b1-3cba-494a-a778-81c13fdab888-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "856bf9b1-3cba-494a-a778-81c13fdab888" (UID: "856bf9b1-3cba-494a-a778-81c13fdab888"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 11 20:58:51.599162 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:51.599098 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/856bf9b1-3cba-494a-a778-81c13fdab888-service-ca" (OuterVolumeSpecName: "service-ca") pod "856bf9b1-3cba-494a-a778-81c13fdab888" (UID: "856bf9b1-3cba-494a-a778-81c13fdab888"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 11 20:58:51.599208 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:51.599146 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/856bf9b1-3cba-494a-a778-81c13fdab888-console-config" (OuterVolumeSpecName: "console-config") pod "856bf9b1-3cba-494a-a778-81c13fdab888" (UID: "856bf9b1-3cba-494a-a778-81c13fdab888"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 11 20:58:51.599333 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:51.599309 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/856bf9b1-3cba-494a-a778-81c13fdab888-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "856bf9b1-3cba-494a-a778-81c13fdab888" (UID: "856bf9b1-3cba-494a-a778-81c13fdab888"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 11 20:58:51.600689 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:51.600662 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/856bf9b1-3cba-494a-a778-81c13fdab888-kube-api-access-p5nf5" (OuterVolumeSpecName: "kube-api-access-p5nf5") pod "856bf9b1-3cba-494a-a778-81c13fdab888" (UID: "856bf9b1-3cba-494a-a778-81c13fdab888"). InnerVolumeSpecName "kube-api-access-p5nf5". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:58:51.600794 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:51.600735 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/856bf9b1-3cba-494a-a778-81c13fdab888-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "856bf9b1-3cba-494a-a778-81c13fdab888" (UID: "856bf9b1-3cba-494a-a778-81c13fdab888"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:58:51.600794 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:51.600756 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/856bf9b1-3cba-494a-a778-81c13fdab888-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "856bf9b1-3cba-494a-a778-81c13fdab888" (UID: "856bf9b1-3cba-494a-a778-81c13fdab888"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:58:51.699506 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:51.699482 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/856bf9b1-3cba-494a-a778-81c13fdab888-trusted-ca-bundle\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:58:51.699506 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:51.699505 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/856bf9b1-3cba-494a-a778-81c13fdab888-console-oauth-config\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:58:51.699661 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:51.699515 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/856bf9b1-3cba-494a-a778-81c13fdab888-service-ca\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:58:51.699661 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:51.699524 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/856bf9b1-3cba-494a-a778-81c13fdab888-oauth-serving-cert\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:58:51.699661 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:51.699533 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/856bf9b1-3cba-494a-a778-81c13fdab888-console-config\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:58:51.699661 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:51.699542 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/856bf9b1-3cba-494a-a778-81c13fdab888-console-serving-cert\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:58:51.699661 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:51.699550 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p5nf5\" (UniqueName: \"kubernetes.io/projected/856bf9b1-3cba-494a-a778-81c13fdab888-kube-api-access-p5nf5\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 20:58:52.265980 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:52.265942 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b596b74fc-kt9hj_856bf9b1-3cba-494a-a778-81c13fdab888/console/0.log" May 11 20:58:52.266364 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:52.265998 2567 generic.go:358] "Generic (PLEG): container finished" podID="856bf9b1-3cba-494a-a778-81c13fdab888" containerID="6a56e11b8b9e4ed4e1cf892b40032ac6eabc61416e6374c09ca82223d79d0a9e" exitCode=2 May 11 20:58:52.266364 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:52.266065 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b596b74fc-kt9hj" May 11 20:58:52.266364 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:52.266079 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b596b74fc-kt9hj" event={"ID":"856bf9b1-3cba-494a-a778-81c13fdab888","Type":"ContainerDied","Data":"6a56e11b8b9e4ed4e1cf892b40032ac6eabc61416e6374c09ca82223d79d0a9e"} May 11 20:58:52.266364 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:52.266104 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b596b74fc-kt9hj" event={"ID":"856bf9b1-3cba-494a-a778-81c13fdab888","Type":"ContainerDied","Data":"1309458bdb834a2db4edd4d5bc087e37313c6396e658bf84efeb7366e1bc41e4"} May 11 20:58:52.266364 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:52.266118 2567 scope.go:117] "RemoveContainer" containerID="6a56e11b8b9e4ed4e1cf892b40032ac6eabc61416e6374c09ca82223d79d0a9e" May 11 20:58:52.275743 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:52.275725 2567 scope.go:117] "RemoveContainer" containerID="6a56e11b8b9e4ed4e1cf892b40032ac6eabc61416e6374c09ca82223d79d0a9e" May 11 20:58:52.275986 ip-10-0-128-58 kubenswrapper[2567]: E0511 20:58:52.275951 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a56e11b8b9e4ed4e1cf892b40032ac6eabc61416e6374c09ca82223d79d0a9e\": container with ID starting with 6a56e11b8b9e4ed4e1cf892b40032ac6eabc61416e6374c09ca82223d79d0a9e not found: ID does not exist" containerID="6a56e11b8b9e4ed4e1cf892b40032ac6eabc61416e6374c09ca82223d79d0a9e" May 11 20:58:52.276036 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:52.275997 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a56e11b8b9e4ed4e1cf892b40032ac6eabc61416e6374c09ca82223d79d0a9e"} err="failed to get container status \"6a56e11b8b9e4ed4e1cf892b40032ac6eabc61416e6374c09ca82223d79d0a9e\": rpc error: code = NotFound desc = could not find container \"6a56e11b8b9e4ed4e1cf892b40032ac6eabc61416e6374c09ca82223d79d0a9e\": container with ID starting with 6a56e11b8b9e4ed4e1cf892b40032ac6eabc61416e6374c09ca82223d79d0a9e not found: ID does not exist" May 11 20:58:52.288896 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:52.288875 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b596b74fc-kt9hj"] May 11 20:58:52.292446 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:52.292424 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7b596b74fc-kt9hj"] May 11 20:58:52.480266 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:58:52.480234 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="856bf9b1-3cba-494a-a778-81c13fdab888" path="/var/lib/kubelet/pods/856bf9b1-3cba-494a-a778-81c13fdab888/volumes" May 11 20:59:03.775556 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:59:03.775524 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 20:59:03.776014 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:59:03.775899 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="856bf9b1-3cba-494a-a778-81c13fdab888" containerName="console" May 11 20:59:03.776014 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:59:03.775912 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="856bf9b1-3cba-494a-a778-81c13fdab888" containerName="console" May 11 20:59:03.776014 ip-10-0-128-58 kubenswrapper[2567]: 
I0511 20:59:03.775986 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="856bf9b1-3cba-494a-a778-81c13fdab888" containerName="console" May 11 20:59:03.778818 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:59:03.778798 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-jck9s" May 11 20:59:03.781865 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:59:03.781832 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-8ksxs\"" May 11 20:59:03.781865 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:59:03.781832 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" May 11 20:59:03.787766 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:59:03.787743 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 20:59:03.815799 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:59:03.815766 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 20:59:03.887092 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:59:03.887036 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5a8aadf5-8168-4a83-95ef-b1d014ffe034-config-file\") pod \"limitador-limitador-78c99df468-jck9s\" (UID: \"5a8aadf5-8168-4a83-95ef-b1d014ffe034\") " pod="kuadrant-system/limitador-limitador-78c99df468-jck9s" May 11 20:59:03.887275 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:59:03.887178 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84mdt\" (UniqueName: \"kubernetes.io/projected/5a8aadf5-8168-4a83-95ef-b1d014ffe034-kube-api-access-84mdt\") pod \"limitador-limitador-78c99df468-jck9s\" (UID: \"5a8aadf5-8168-4a83-95ef-b1d014ffe034\") " pod="kuadrant-system/limitador-limitador-78c99df468-jck9s" May 11 20:59:03.988552 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:59:03.988501 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84mdt\" (UniqueName: \"kubernetes.io/projected/5a8aadf5-8168-4a83-95ef-b1d014ffe034-kube-api-access-84mdt\") pod \"limitador-limitador-78c99df468-jck9s\" (UID: \"5a8aadf5-8168-4a83-95ef-b1d014ffe034\") " pod="kuadrant-system/limitador-limitador-78c99df468-jck9s" May 11 20:59:03.988731 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:59:03.988576 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5a8aadf5-8168-4a83-95ef-b1d014ffe034-config-file\") pod \"limitador-limitador-78c99df468-jck9s\" (UID: \"5a8aadf5-8168-4a83-95ef-b1d014ffe034\") " pod="kuadrant-system/limitador-limitador-78c99df468-jck9s" May 11 20:59:03.989281 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:59:03.989260 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5a8aadf5-8168-4a83-95ef-b1d014ffe034-config-file\") pod \"limitador-limitador-78c99df468-jck9s\" (UID: \"5a8aadf5-8168-4a83-95ef-b1d014ffe034\") " pod="kuadrant-system/limitador-limitador-78c99df468-jck9s" May 11 20:59:03.997937 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:59:03.997910 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-84mdt\" (UniqueName: \"kubernetes.io/projected/5a8aadf5-8168-4a83-95ef-b1d014ffe034-kube-api-access-84mdt\") pod \"limitador-limitador-78c99df468-jck9s\" (UID: \"5a8aadf5-8168-4a83-95ef-b1d014ffe034\") " pod="kuadrant-system/limitador-limitador-78c99df468-jck9s" May 11 20:59:04.089578 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:59:04.089551 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-jck9s" May 11 20:59:04.222268 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:59:04.222236 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 20:59:04.223452 ip-10-0-128-58 kubenswrapper[2567]: W0511 20:59:04.223430 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a8aadf5_8168_4a83_95ef_b1d014ffe034.slice/crio-ec9fd20260a79d10367d592e2c15c143cf510855c68043e73743dee96c8b263a WatchSource:0}: Error finding container ec9fd20260a79d10367d592e2c15c143cf510855c68043e73743dee96c8b263a: Status 404 returned error can't find the container with id ec9fd20260a79d10367d592e2c15c143cf510855c68043e73743dee96c8b263a May 11 20:59:04.314220 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:59:04.314189 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-jck9s" event={"ID":"5a8aadf5-8168-4a83-95ef-b1d014ffe034","Type":"ContainerStarted","Data":"ec9fd20260a79d10367d592e2c15c143cf510855c68043e73743dee96c8b263a"} May 11 20:59:07.328128 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:59:07.328083 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-jck9s" event={"ID":"5a8aadf5-8168-4a83-95ef-b1d014ffe034","Type":"ContainerStarted","Data":"c91d8b3bc4cc32eb04be8efc5782af7700d32d151593fabd5d9817d219ca7f2d"} May 11 20:59:07.328496 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:59:07.328189 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-jck9s" May 11 20:59:07.346512 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:59:07.346471 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-jck9s" podStartSLOduration=1.881818216 podStartE2EDuration="4.346458756s" podCreationTimestamp="2026-05-11 20:59:03 +0000 UTC" firstStartedPulling="2026-05-11 20:59:04.22517386 +0000 UTC m=+522.334954284" lastFinishedPulling="2026-05-11 20:59:06.689814396 +0000 UTC m=+524.799594824" observedRunningTime="2026-05-11 20:59:07.34436883 +0000 UTC m=+525.454149275" watchObservedRunningTime="2026-05-11 20:59:07.346458756 +0000 UTC m=+525.456239237" May 11 20:59:18.333786 ip-10-0-128-58 kubenswrapper[2567]: I0511 20:59:18.333755 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-jck9s" May 11 21:00:04.604392 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:00:04.604354 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:00:19.793047 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:00:19.793013 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:00:22.423703 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:00:22.423676 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/ovn-acl-logging/0.log" May 11 21:00:22.426480 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:00:22.426460 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/ovn-acl-logging/0.log" May 11 21:00:27.092441 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:00:27.092413 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:00:30.087820 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:00:30.087785 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:00:35.990544 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:00:35.990414 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:00:39.591361 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:00:39.591329 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:00:49.889821 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:00:49.889786 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:01:47.998919 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:01:47.998882 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:01:58.394707 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:01:58.394669 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:02:07.295950 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:02:07.295908 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:02:17.494130 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:02:17.494102 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:02:26.396292 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:02:26.396252 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:02:36.889893 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:02:36.889860 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:03:38.393976 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:03:38.393881 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:03:53.984401 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:03:53.984363 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:04:32.689484 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:04:32.689450 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:04:49.590480 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:04:49.590451 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:05:04.091478 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:05:04.091444 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:05:20.388802 ip-10-0-128-58 
kubenswrapper[2567]: I0511 21:05:20.388769 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:05:22.458570 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:05:22.458543 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/ovn-acl-logging/0.log" May 11 21:05:22.463073 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:05:22.463050 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/ovn-acl-logging/0.log" May 11 21:06:14.388714 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:06:14.388673 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:06:23.701684 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:06:23.701651 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:06:40.393212 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:06:40.393134 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:06:48.888465 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:06:48.888429 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:07:06.082658 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:07:06.082620 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:07:14.084419 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:07:14.084382 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:07:46.686117 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:07:46.686076 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:07:54.594461 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:07:54.594427 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:08:02.682795 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:08:02.682759 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:08:11.283895 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:08:11.283854 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:08:20.489906 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:08:20.489872 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:08:36.985553 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:08:36.985522 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:08:47.688401 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:08:47.688370 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:09:35.217317 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:09:35.217119 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:09:39.990524 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:09:39.990490 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:09:47.991897 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:09:47.991863 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:09:56.689025 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:09:56.688988 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:10:05.685692 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:10:05.685655 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:10:14.185171 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:10:14.185137 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:10:22.497817 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:10:22.497786 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/ovn-acl-logging/0.log" May 11 21:10:22.502208 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:10:22.502190 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/ovn-acl-logging/0.log" May 11 21:10:23.287571 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:10:23.287537 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:10:31.590541 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:10:31.590502 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:10:40.394270 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:10:40.394236 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:10:48.992684 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:10:48.992649 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:10:58.394555 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:10:58.394514 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:11:06.584551 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:11:06.584502 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:11:15.489315 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:11:15.489274 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:11:23.683808 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:11:23.683772 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:11:33.282623 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:11:33.282581 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:11:40.583383 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:11:40.583354 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:11:50.702537 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:11:50.702495 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:11:58.594376 ip-10-0-128-58 
kubenswrapper[2567]: I0511 21:11:58.594342 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:12:50.104238 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:12:50.104160 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nrwft"] May 11 21:12:50.104767 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:12:50.104390 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nrwft" podUID="32717441-44d7-49e9-a189-9d38dbf3bd7a" containerName="manager" containerID="cri-o://a4942ff8a5d0901bd129486cfb465f087d5767bc0ac2c2df2b024c6a1e73ba83" gracePeriod=10 May 11 21:12:50.859546 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:12:50.859525 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nrwft" May 11 21:12:50.982758 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:12:50.982674 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/32717441-44d7-49e9-a189-9d38dbf3bd7a-extensions-socket-volume\") pod \"32717441-44d7-49e9-a189-9d38dbf3bd7a\" (UID: \"32717441-44d7-49e9-a189-9d38dbf3bd7a\") " May 11 21:12:50.982758 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:12:50.982709 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rtmk\" (UniqueName: \"kubernetes.io/projected/32717441-44d7-49e9-a189-9d38dbf3bd7a-kube-api-access-6rtmk\") pod \"32717441-44d7-49e9-a189-9d38dbf3bd7a\" (UID: \"32717441-44d7-49e9-a189-9d38dbf3bd7a\") " May 11 21:12:50.983112 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:12:50.983087 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32717441-44d7-49e9-a189-9d38dbf3bd7a-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "32717441-44d7-49e9-a189-9d38dbf3bd7a" (UID: "32717441-44d7-49e9-a189-9d38dbf3bd7a"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" May 11 21:12:50.984634 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:12:50.984614 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32717441-44d7-49e9-a189-9d38dbf3bd7a-kube-api-access-6rtmk" (OuterVolumeSpecName: "kube-api-access-6rtmk") pod "32717441-44d7-49e9-a189-9d38dbf3bd7a" (UID: "32717441-44d7-49e9-a189-9d38dbf3bd7a"). InnerVolumeSpecName "kube-api-access-6rtmk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 21:12:51.083664 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:12:51.083636 2567 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/32717441-44d7-49e9-a189-9d38dbf3bd7a-extensions-socket-volume\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 21:12:51.083664 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:12:51.083660 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6rtmk\" (UniqueName: \"kubernetes.io/projected/32717441-44d7-49e9-a189-9d38dbf3bd7a-kube-api-access-6rtmk\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 21:12:51.349622 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:12:51.349593 2567 generic.go:358] "Generic (PLEG): container finished" podID="32717441-44d7-49e9-a189-9d38dbf3bd7a" containerID="a4942ff8a5d0901bd129486cfb465f087d5767bc0ac2c2df2b024c6a1e73ba83" exitCode=0 May 11 21:12:51.350041 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:12:51.349661 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nrwft" May 11 21:12:51.350041 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:12:51.349681 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nrwft" event={"ID":"32717441-44d7-49e9-a189-9d38dbf3bd7a","Type":"ContainerDied","Data":"a4942ff8a5d0901bd129486cfb465f087d5767bc0ac2c2df2b024c6a1e73ba83"} May 11 21:12:51.350041 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:12:51.349722 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nrwft" event={"ID":"32717441-44d7-49e9-a189-9d38dbf3bd7a","Type":"ContainerDied","Data":"ab1b4244b43b2faa502ead8edafebaf3e07e1a1bae99dccc91738c468c1ff6b8"} May 11 21:12:51.350041 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:12:51.349741 2567 scope.go:117] "RemoveContainer" containerID="a4942ff8a5d0901bd129486cfb465f087d5767bc0ac2c2df2b024c6a1e73ba83" May 11 21:12:51.359171 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:12:51.359155 2567 scope.go:117] "RemoveContainer" containerID="a4942ff8a5d0901bd129486cfb465f087d5767bc0ac2c2df2b024c6a1e73ba83" May 11 21:12:51.359392 ip-10-0-128-58 kubenswrapper[2567]: E0511 21:12:51.359371 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4942ff8a5d0901bd129486cfb465f087d5767bc0ac2c2df2b024c6a1e73ba83\": container with ID starting with a4942ff8a5d0901bd129486cfb465f087d5767bc0ac2c2df2b024c6a1e73ba83 not found: ID does not exist" containerID="a4942ff8a5d0901bd129486cfb465f087d5767bc0ac2c2df2b024c6a1e73ba83" May 11 21:12:51.359438 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:12:51.359399 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4942ff8a5d0901bd129486cfb465f087d5767bc0ac2c2df2b024c6a1e73ba83"} err="failed to get container status \"a4942ff8a5d0901bd129486cfb465f087d5767bc0ac2c2df2b024c6a1e73ba83\": rpc error: code = NotFound desc = could not find container \"a4942ff8a5d0901bd129486cfb465f087d5767bc0ac2c2df2b024c6a1e73ba83\": container with ID starting with a4942ff8a5d0901bd129486cfb465f087d5767bc0ac2c2df2b024c6a1e73ba83 not found: ID does not exist" May 11 21:12:51.371818 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:12:51.371798 2567 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nrwft"] May 11 21:12:51.376836 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:12:51.376816 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nrwft"] May 11 21:12:52.480151 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:12:52.480116 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32717441-44d7-49e9-a189-9d38dbf3bd7a" path="/var/lib/kubelet/pods/32717441-44d7-49e9-a189-9d38dbf3bd7a/volumes" May 11 21:13:56.172193 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:13:56.172158 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pxtpx"] May 11 21:13:56.172666 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:13:56.172503 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32717441-44d7-49e9-a189-9d38dbf3bd7a" containerName="manager" May 11 21:13:56.172666 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:13:56.172515 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="32717441-44d7-49e9-a189-9d38dbf3bd7a" containerName="manager" May 11 21:13:56.172666 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:13:56.172575 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="32717441-44d7-49e9-a189-9d38dbf3bd7a" containerName="manager" May 11 21:13:56.175436 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:13:56.175420 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pxtpx" May 11 21:13:56.178224 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:13:56.178202 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-mb652\"" May 11 21:13:56.187610 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:13:56.187588 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pxtpx"] May 11 21:13:56.256329 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:13:56.256301 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/83ddadaf-1312-4a50-bac7-6192387a071b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-pxtpx\" (UID: \"83ddadaf-1312-4a50-bac7-6192387a071b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pxtpx" May 11 21:13:56.256466 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:13:56.256359 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvtqp\" (UniqueName: \"kubernetes.io/projected/83ddadaf-1312-4a50-bac7-6192387a071b-kube-api-access-lvtqp\") pod \"kuadrant-operator-controller-manager-55c7f4c975-pxtpx\" (UID: \"83ddadaf-1312-4a50-bac7-6192387a071b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pxtpx" May 11 21:13:56.357616 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:13:56.357582 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/83ddadaf-1312-4a50-bac7-6192387a071b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-pxtpx\" (UID: \"83ddadaf-1312-4a50-bac7-6192387a071b\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pxtpx" May 11 21:13:56.357763 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:13:56.357630 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lvtqp\" (UniqueName: \"kubernetes.io/projected/83ddadaf-1312-4a50-bac7-6192387a071b-kube-api-access-lvtqp\") pod \"kuadrant-operator-controller-manager-55c7f4c975-pxtpx\" (UID: \"83ddadaf-1312-4a50-bac7-6192387a071b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pxtpx" May 11 21:13:56.357939 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:13:56.357920 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/83ddadaf-1312-4a50-bac7-6192387a071b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-pxtpx\" (UID: \"83ddadaf-1312-4a50-bac7-6192387a071b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pxtpx" May 11 21:13:56.367338 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:13:56.367310 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvtqp\" (UniqueName: \"kubernetes.io/projected/83ddadaf-1312-4a50-bac7-6192387a071b-kube-api-access-lvtqp\") pod \"kuadrant-operator-controller-manager-55c7f4c975-pxtpx\" (UID: \"83ddadaf-1312-4a50-bac7-6192387a071b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pxtpx" May 11 21:13:56.486323 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:13:56.486263 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pxtpx" May 11 21:13:56.627704 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:13:56.627682 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pxtpx"] May 11 21:13:56.632756 ip-10-0-128-58 kubenswrapper[2567]: W0511 21:13:56.632728 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83ddadaf_1312_4a50_bac7_6192387a071b.slice/crio-a3b47e3bbc68c9bdb1480bde1974712d678f4dd02d904462e35d63def9bd61fe WatchSource:0}: Error finding container a3b47e3bbc68c9bdb1480bde1974712d678f4dd02d904462e35d63def9bd61fe: Status 404 returned error can't find the container with id a3b47e3bbc68c9bdb1480bde1974712d678f4dd02d904462e35d63def9bd61fe May 11 21:13:56.635295 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:13:56.635275 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider May 11 21:13:57.593872 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:13:57.593831 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pxtpx" event={"ID":"83ddadaf-1312-4a50-bac7-6192387a071b","Type":"ContainerStarted","Data":"cb7de9f94a7303ae271567d8f02c4b7ce573c8b8db0578b9dc63d2f4081d231d"} May 11 21:13:57.593872 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:13:57.593875 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pxtpx" event={"ID":"83ddadaf-1312-4a50-bac7-6192387a071b","Type":"ContainerStarted","Data":"a3b47e3bbc68c9bdb1480bde1974712d678f4dd02d904462e35d63def9bd61fe"} May 11 21:13:57.594356 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:13:57.594006 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pxtpx" May 11 21:13:57.627826 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:13:57.627781 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pxtpx" podStartSLOduration=1.627769741 podStartE2EDuration="1.627769741s" podCreationTimestamp="2026-05-11 21:13:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 21:13:57.624771532 +0000 UTC m=+1415.734551976" watchObservedRunningTime="2026-05-11 21:13:57.627769741 +0000 UTC m=+1415.737550186" May 11 21:14:08.600421 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:14:08.600390 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-pxtpx" May 11 21:14:16.092745 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:14:16.092710 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:14:20.887373 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:14:20.887337 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:14:47.292250 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:14:47.292210 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:14:53.586247 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:14:53.586216 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:15:00.142051 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:00.142012 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29642235-s6vsx"] May 11 21:15:00.145558 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:00.145535 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29642235-s6vsx" May 11 21:15:00.148039 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:00.148019 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-s4nmf\"" May 11 21:15:00.152204 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:00.152183 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29642235-s6vsx"] May 11 21:15:00.244946 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:00.244914 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7589\" (UniqueName: \"kubernetes.io/projected/9f8d81fd-d5d9-4a97-bf43-69501f3421f5-kube-api-access-p7589\") pod \"maas-api-key-cleanup-29642235-s6vsx\" (UID: \"9f8d81fd-d5d9-4a97-bf43-69501f3421f5\") " pod="opendatahub/maas-api-key-cleanup-29642235-s6vsx" May 11 21:15:00.345698 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:00.345659 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7589\" (UniqueName: \"kubernetes.io/projected/9f8d81fd-d5d9-4a97-bf43-69501f3421f5-kube-api-access-p7589\") pod \"maas-api-key-cleanup-29642235-s6vsx\" (UID: \"9f8d81fd-d5d9-4a97-bf43-69501f3421f5\") " pod="opendatahub/maas-api-key-cleanup-29642235-s6vsx" May 11 21:15:00.354221 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:00.354195 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7589\" (UniqueName: \"kubernetes.io/projected/9f8d81fd-d5d9-4a97-bf43-69501f3421f5-kube-api-access-p7589\") pod \"maas-api-key-cleanup-29642235-s6vsx\" (UID: \"9f8d81fd-d5d9-4a97-bf43-69501f3421f5\") " pod="opendatahub/maas-api-key-cleanup-29642235-s6vsx" May 11 21:15:00.457140 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:00.457061 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29642235-s6vsx" May 11 21:15:00.590040 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:00.589933 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29642235-s6vsx"] May 11 21:15:00.594148 ip-10-0-128-58 kubenswrapper[2567]: W0511 21:15:00.594107 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f8d81fd_d5d9_4a97_bf43_69501f3421f5.slice/crio-ed4073c04b28de796163f9cd1928e0bc3681f8ad147c8086fe11bc941d1e0f47 WatchSource:0}: Error finding container ed4073c04b28de796163f9cd1928e0bc3681f8ad147c8086fe11bc941d1e0f47: Status 404 returned error can't find the container with id ed4073c04b28de796163f9cd1928e0bc3681f8ad147c8086fe11bc941d1e0f47 May 11 21:15:00.826439 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:00.826399 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29642235-s6vsx" event={"ID":"9f8d81fd-d5d9-4a97-bf43-69501f3421f5","Type":"ContainerStarted","Data":"ed4073c04b28de796163f9cd1928e0bc3681f8ad147c8086fe11bc941d1e0f47"} May 11 21:15:04.186529 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:04.186488 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:15:05.846077 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:05.846047 2567 generic.go:358] "Generic (PLEG): container finished" podID="9f8d81fd-d5d9-4a97-bf43-69501f3421f5" containerID="f230eabc0eda97b0b0b83195fc6619a46be1fd224a23e3332f1ddc45cad26ac4" exitCode=0 May 11 21:15:05.846452 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:05.846084 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29642235-s6vsx" event={"ID":"9f8d81fd-d5d9-4a97-bf43-69501f3421f5","Type":"ContainerDied","Data":"f230eabc0eda97b0b0b83195fc6619a46be1fd224a23e3332f1ddc45cad26ac4"} May 11 21:15:06.984480 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:06.984456 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29642235-s6vsx" May 11 21:15:07.004446 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:07.002919 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7589\" (UniqueName: \"kubernetes.io/projected/9f8d81fd-d5d9-4a97-bf43-69501f3421f5-kube-api-access-p7589\") pod \"9f8d81fd-d5d9-4a97-bf43-69501f3421f5\" (UID: \"9f8d81fd-d5d9-4a97-bf43-69501f3421f5\") " May 11 21:15:07.010218 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:07.010189 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f8d81fd-d5d9-4a97-bf43-69501f3421f5-kube-api-access-p7589" (OuterVolumeSpecName: "kube-api-access-p7589") pod "9f8d81fd-d5d9-4a97-bf43-69501f3421f5" (UID: "9f8d81fd-d5d9-4a97-bf43-69501f3421f5"). InnerVolumeSpecName "kube-api-access-p7589". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 21:15:07.104720 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:07.104640 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p7589\" (UniqueName: \"kubernetes.io/projected/9f8d81fd-d5d9-4a97-bf43-69501f3421f5-kube-api-access-p7589\") on node \"ip-10-0-128-58.ec2.internal\" DevicePath \"\"" May 11 21:15:07.854884 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:07.854855 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29642235-s6vsx" May 11 21:15:07.855081 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:07.854854 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29642235-s6vsx" event={"ID":"9f8d81fd-d5d9-4a97-bf43-69501f3421f5","Type":"ContainerDied","Data":"ed4073c04b28de796163f9cd1928e0bc3681f8ad147c8086fe11bc941d1e0f47"} May 11 21:15:07.855081 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:07.854974 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed4073c04b28de796163f9cd1928e0bc3681f8ad147c8086fe11bc941d1e0f47" May 11 21:15:13.380549 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:13.380510 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:15:22.286742 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:22.286704 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:15:22.528151 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:22.528123 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/ovn-acl-logging/0.log" May 11 21:15:22.534299 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:22.534274 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/ovn-acl-logging/0.log" May 11 21:15:32.677383 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:32.677310 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:15:42.197612 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:42.197576 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:15:52.487321 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:15:52.487284 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:16:01.695329 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:16:01.695295 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:16:10.886952 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:16:10.886914 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:16:20.387834 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:16:20.387799 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:16:53.985324 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:16:53.985288 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:17:35.991205 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:17:35.991173 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:17:44.388001 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:17:44.387953 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:17:52.896885 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:17:52.896853 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:18:01.880460 ip-10-0-128-58 
kubenswrapper[2567]: I0511 21:18:01.880427 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:18:11.085877 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:18:11.085845 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:18:23.582501 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:18:23.582460 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:18:32.486075 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:18:32.485993 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:18:40.382826 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:18:40.382788 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:18:48.379518 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:18:48.379484 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:18:57.391670 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:18:57.391635 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:19:05.389440 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:19:05.389403 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:19:15.485138 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:19:15.485106 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:19:32.986878 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:19:32.986843 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:19:43.586068 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:19:43.586034 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:19:52.393971 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:19:52.393925 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:20:00.680532 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:20:00.680495 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:20:18.446508 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:20:18.446473 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:20:22.559891 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:20:22.559865 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/ovn-acl-logging/0.log" May 11 21:20:22.565554 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:20:22.565534 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/ovn-acl-logging/0.log" May 11 21:20:26.187077 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:20:26.187042 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:20:35.385604 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:20:35.385571 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:20:43.787884 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:20:43.787847 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:20:52.487409 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:20:52.487295 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:21:00.788255 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:21:00.788218 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:21:09.896228 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:21:09.896182 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:21:23.084671 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:21:23.084636 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:21:32.381588 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:21:32.381507 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:21:45.588153 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:21:45.588119 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:21:54.588471 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:21:54.588439 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:22:02.789366 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:22:02.789336 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:22:10.388671 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:22:10.388636 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:22:19.192105 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:22:19.192072 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:22:35.476926 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:22:35.476890 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:22:43.184910 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:22:43.184873 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:22:52.182571 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:22:52.182538 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:23:01.485381 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:01.485300 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:23:24.080113 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:24.080080 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:23:36.694916 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:36.694885 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jck9s"] May 11 21:23:43.204039 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:43.204002 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-nq9kg_ad358e12-1a73-4326-9235-915bfb8847bf/manager/0.log" May 11 21:23:43.431173 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:43.431146 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-key-cleanup-29642235-s6vsx_9f8d81fd-d5d9-4a97-bf43-69501f3421f5/cleanup/0.log" May 11 21:23:43.658397 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:43.658372 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-7z6xc_e652aca3-bd36-4907-9e16-6be17cde2c16/manager/2.log" May 11 21:23:43.766772 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:43.766745 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-755c95f69f-7j9jk_39680580-0555-4d40-978a-ae556647366e/manager/0.log" May 11 21:23:44.818926 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:44.818892 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw_0422a798-ac83-4b60-bfee-01964e487d82/pull/0.log" May 11 21:23:44.825170 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:44.825146 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw_0422a798-ac83-4b60-bfee-01964e487d82/extract/0.log" May 11 21:23:44.830871 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:44.830853 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw_0422a798-ac83-4b60-bfee-01964e487d82/util/0.log" May 11 21:23:44.934130 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:44.934104 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf_53246992-50b7-4b5a-96c4-a77728a0a4c9/util/0.log" May 11 21:23:44.939642 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:44.939622 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf_53246992-50b7-4b5a-96c4-a77728a0a4c9/pull/0.log" May 11 21:23:44.944901 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:44.944883 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf_53246992-50b7-4b5a-96c4-a77728a0a4c9/extract/0.log" May 11 21:23:45.050036 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:45.050015 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9_6301480a-ae94-4125-9ba1-97b9d055b32c/extract/0.log" May 11 21:23:45.055125 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:45.055108 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9_6301480a-ae94-4125-9ba1-97b9d055b32c/util/0.log" May 11 21:23:45.060803 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:45.060784 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9_6301480a-ae94-4125-9ba1-97b9d055b32c/pull/0.log" May 11 21:23:45.164168 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:45.164105 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd_0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0/util/0.log" May 11 21:23:45.169955 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:45.169921 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd_0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0/pull/0.log" May 11 21:23:45.175422 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:45.175384 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd_0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0/extract/0.log" May 11 21:23:45.405676 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:45.405644 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-fr92f_6310b47c-4fb1-4384-811f-43c2fee5c54e/manager/0.log" May 11 21:23:45.739050 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:45.739023 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-fbhxs_8c3c8e91-d4a1-4e93-a78d-1b52efbc3163/registry-server/0.log" May 11 21:23:45.863090 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:45.863057 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-pxtpx_83ddadaf-1312-4a50-bac7-6192387a071b/manager/0.log" May 11 21:23:45.975457 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:45.975428 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-jck9s_5a8aadf5-8168-4a83-95ef-b1d014ffe034/limitador/0.log" May 11 21:23:46.530740 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:46.530711 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-798958bb55-2c69v_e4b31fbc-0442-401c-b32b-ba68a2bf000b/discovery/0.log" May 11 21:23:46.636953 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:46.636928 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-c8c9857f9-bjljt_2b64cae5-bcbe-4b04-952c-28a536d1e35b/kube-auth-proxy/0.log" May 11 21:23:52.148020 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:52.147983 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qcz2k/must-gather-55lq2"] May 11 21:23:52.148470 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:52.148451 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f8d81fd-d5d9-4a97-bf43-69501f3421f5" containerName="cleanup" May 11 21:23:52.148470 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:52.148470 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f8d81fd-d5d9-4a97-bf43-69501f3421f5" containerName="cleanup" May 11 21:23:52.148586 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:52.148571 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f8d81fd-d5d9-4a97-bf43-69501f3421f5" containerName="cleanup" May 11 21:23:52.151754 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:52.151731 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qcz2k/must-gather-55lq2" May 11 21:23:52.154752 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:52.154717 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qcz2k\"/\"kube-root-ca.crt\"" May 11 21:23:52.154886 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:52.154783 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qcz2k\"/\"openshift-service-ca.crt\"" May 11 21:23:52.155811 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:52.155792 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-qcz2k\"/\"default-dockercfg-4h4cq\"" May 11 21:23:52.169813 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:52.169792 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qcz2k/must-gather-55lq2"] May 11 21:23:52.264978 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:52.264932 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7w9m\" (UniqueName: \"kubernetes.io/projected/b925708e-c12f-4a3b-8d69-68397e88e9ed-kube-api-access-s7w9m\") pod \"must-gather-55lq2\" (UID: \"b925708e-c12f-4a3b-8d69-68397e88e9ed\") " pod="openshift-must-gather-qcz2k/must-gather-55lq2" May 11 21:23:52.265092 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:52.265036 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b925708e-c12f-4a3b-8d69-68397e88e9ed-must-gather-output\") pod \"must-gather-55lq2\" (UID: \"b925708e-c12f-4a3b-8d69-68397e88e9ed\") " pod="openshift-must-gather-qcz2k/must-gather-55lq2" May 11 21:23:52.366179 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:52.366145 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b925708e-c12f-4a3b-8d69-68397e88e9ed-must-gather-output\") pod \"must-gather-55lq2\" (UID: \"b925708e-c12f-4a3b-8d69-68397e88e9ed\") " pod="openshift-must-gather-qcz2k/must-gather-55lq2" May 11 21:23:52.366322 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:52.366231 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7w9m\" (UniqueName: \"kubernetes.io/projected/b925708e-c12f-4a3b-8d69-68397e88e9ed-kube-api-access-s7w9m\") pod \"must-gather-55lq2\" (UID: \"b925708e-c12f-4a3b-8d69-68397e88e9ed\") " pod="openshift-must-gather-qcz2k/must-gather-55lq2" May 11 21:23:52.366484 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:52.366450 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b925708e-c12f-4a3b-8d69-68397e88e9ed-must-gather-output\") pod \"must-gather-55lq2\" (UID: \"b925708e-c12f-4a3b-8d69-68397e88e9ed\") " pod="openshift-must-gather-qcz2k/must-gather-55lq2" May 11 21:23:52.378056 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:52.378034 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7w9m\" (UniqueName: \"kubernetes.io/projected/b925708e-c12f-4a3b-8d69-68397e88e9ed-kube-api-access-s7w9m\") pod \"must-gather-55lq2\" (UID: \"b925708e-c12f-4a3b-8d69-68397e88e9ed\") " pod="openshift-must-gather-qcz2k/must-gather-55lq2" May 11 21:23:52.460851 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:52.460767 2567 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-must-gather-qcz2k/must-gather-55lq2" May 11 21:23:52.590878 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:52.590854 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qcz2k/must-gather-55lq2"] May 11 21:23:52.592521 ip-10-0-128-58 kubenswrapper[2567]: W0511 21:23:52.592498 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb925708e_c12f_4a3b_8d69_68397e88e9ed.slice/crio-114ab91e96b2e1ccef78cd402f1409b12afa6e24cb006d8ae3a13d191613f363 WatchSource:0}: Error finding container 114ab91e96b2e1ccef78cd402f1409b12afa6e24cb006d8ae3a13d191613f363: Status 404 returned error can't find the container with id 114ab91e96b2e1ccef78cd402f1409b12afa6e24cb006d8ae3a13d191613f363 May 11 21:23:52.594587 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:52.594569 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider May 11 21:23:52.787424 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:52.787391 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qcz2k/must-gather-55lq2" event={"ID":"b925708e-c12f-4a3b-8d69-68397e88e9ed","Type":"ContainerStarted","Data":"114ab91e96b2e1ccef78cd402f1409b12afa6e24cb006d8ae3a13d191613f363"} May 11 21:23:58.822451 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:58.822408 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qcz2k/must-gather-55lq2" event={"ID":"b925708e-c12f-4a3b-8d69-68397e88e9ed","Type":"ContainerStarted","Data":"663a32141e43eb9f1405486ade4416bb359804cccba95d11cbb8dfc6438fb206"} May 11 21:23:58.822986 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:58.822459 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qcz2k/must-gather-55lq2" event={"ID":"b925708e-c12f-4a3b-8d69-68397e88e9ed","Type":"ContainerStarted","Data":"ac855362521166c0be1eb9116afee6ae985d17b341317202c0afa816cd7120a7"} May 11 21:23:58.842212 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:23:58.842150 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qcz2k/must-gather-55lq2" podStartSLOduration=1.012922038 podStartE2EDuration="6.842131109s" podCreationTimestamp="2026-05-11 21:23:52 +0000 UTC" firstStartedPulling="2026-05-11 21:23:52.594720865 +0000 UTC m=+2010.704501288" lastFinishedPulling="2026-05-11 21:23:58.423929933 +0000 UTC m=+2016.533710359" observedRunningTime="2026-05-11 21:23:58.837698538 +0000 UTC m=+2016.947478984" watchObservedRunningTime="2026-05-11 21:23:58.842131109 +0000 UTC m=+2016.951911569" May 11 21:24:00.036683 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:00.036653 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-l8f8d_f1b8e168-86e5-4ad6-b105-311a1c00b2ea/global-pull-secret-syncer/0.log" May 11 21:24:00.138894 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:00.138849 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-k98kb_dc7028e1-034b-4393-88d2-1dbb1e82cfe7/konnectivity-agent/0.log" May 11 21:24:00.158607 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:00.158579 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-58.ec2.internal_2c5eff5275aeb9307128a4ad3171d6f0/haproxy/0.log" May 11 21:24:03.931724 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:03.931699 2567 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw_0422a798-ac83-4b60-bfee-01964e487d82/extract/0.log" May 11 21:24:03.970701 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:03.970676 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw_0422a798-ac83-4b60-bfee-01964e487d82/util/0.log" May 11 21:24:04.010047 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:04.009971 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j7jgw_0422a798-ac83-4b60-bfee-01964e487d82/pull/0.log" May 11 21:24:04.055657 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:04.055628 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf_53246992-50b7-4b5a-96c4-a77728a0a4c9/extract/0.log" May 11 21:24:04.095840 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:04.095785 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf_53246992-50b7-4b5a-96c4-a77728a0a4c9/util/0.log" May 11 21:24:04.127225 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:04.127197 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nfkxf_53246992-50b7-4b5a-96c4-a77728a0a4c9/pull/0.log" May 11 21:24:04.171849 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:04.171816 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9_6301480a-ae94-4125-9ba1-97b9d055b32c/extract/0.log" May 11 21:24:04.211602 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:04.211510 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9_6301480a-ae94-4125-9ba1-97b9d055b32c/util/0.log" May 11 21:24:04.254849 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:04.254820 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328zb9_6301480a-ae94-4125-9ba1-97b9d055b32c/pull/0.log" May 11 21:24:04.295819 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:04.295788 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd_0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0/extract/0.log" May 11 21:24:04.341632 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:04.341605 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd_0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0/util/0.log" May 11 21:24:04.383308 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:04.383277 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17rrmd_0b3c4ebb-0b4d-4b16-9a54-befadf67cfa0/pull/0.log" May 11 21:24:04.677563 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:04.677533 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-fr92f_6310b47c-4fb1-4384-811f-43c2fee5c54e/manager/0.log" May 11 21:24:04.797345 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:04.797312 2567 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-fbhxs_8c3c8e91-d4a1-4e93-a78d-1b52efbc3163/registry-server/0.log" May 11 21:24:04.885641 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:04.885610 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-pxtpx_83ddadaf-1312-4a50-bac7-6192387a071b/manager/0.log" May 11 21:24:04.907312 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:04.907285 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-jck9s_5a8aadf5-8168-4a83-95ef-b1d014ffe034/limitador/0.log" May 11 21:24:06.757037 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:06.757009 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9ckmx_6f84d77f-8342-4c8f-9a13-c7c909d327d3/node-exporter/0.log" May 11 21:24:06.786243 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:06.786221 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9ckmx_6f84d77f-8342-4c8f-9a13-c7c909d327d3/kube-rbac-proxy/0.log" May 11 21:24:06.814352 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:06.814324 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9ckmx_6f84d77f-8342-4c8f-9a13-c7c909d327d3/init-textfile/0.log" May 11 21:24:08.752178 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:08.752142 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg"] May 11 21:24:08.759286 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:08.759261 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg" May 11 21:24:08.765001 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:08.764931 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg"] May 11 21:24:08.824445 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:08.824407 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/86eb7f8a-2456-4316-828d-b149a806993f-proc\") pod \"perf-node-gather-daemonset-bdgqg\" (UID: \"86eb7f8a-2456-4316-828d-b149a806993f\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg" May 11 21:24:08.824626 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:08.824502 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/86eb7f8a-2456-4316-828d-b149a806993f-podres\") pod \"perf-node-gather-daemonset-bdgqg\" (UID: \"86eb7f8a-2456-4316-828d-b149a806993f\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg" May 11 21:24:08.824626 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:08.824549 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/86eb7f8a-2456-4316-828d-b149a806993f-lib-modules\") pod \"perf-node-gather-daemonset-bdgqg\" (UID: \"86eb7f8a-2456-4316-828d-b149a806993f\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg" May 11 21:24:08.824626 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:08.824576 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2255\" 
(UniqueName: \"kubernetes.io/projected/86eb7f8a-2456-4316-828d-b149a806993f-kube-api-access-k2255\") pod \"perf-node-gather-daemonset-bdgqg\" (UID: \"86eb7f8a-2456-4316-828d-b149a806993f\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg" May 11 21:24:08.824626 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:08.824600 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/86eb7f8a-2456-4316-828d-b149a806993f-sys\") pod \"perf-node-gather-daemonset-bdgqg\" (UID: \"86eb7f8a-2456-4316-828d-b149a806993f\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg" May 11 21:24:08.925940 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:08.925905 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/86eb7f8a-2456-4316-828d-b149a806993f-proc\") pod \"perf-node-gather-daemonset-bdgqg\" (UID: \"86eb7f8a-2456-4316-828d-b149a806993f\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg" May 11 21:24:08.926136 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:08.925952 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/86eb7f8a-2456-4316-828d-b149a806993f-podres\") pod \"perf-node-gather-daemonset-bdgqg\" (UID: \"86eb7f8a-2456-4316-828d-b149a806993f\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg" May 11 21:24:08.926136 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:08.926010 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/86eb7f8a-2456-4316-828d-b149a806993f-lib-modules\") pod \"perf-node-gather-daemonset-bdgqg\" (UID: \"86eb7f8a-2456-4316-828d-b149a806993f\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg" May 11 21:24:08.926136 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:08.926037 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k2255\" (UniqueName: \"kubernetes.io/projected/86eb7f8a-2456-4316-828d-b149a806993f-kube-api-access-k2255\") pod \"perf-node-gather-daemonset-bdgqg\" (UID: \"86eb7f8a-2456-4316-828d-b149a806993f\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg" May 11 21:24:08.926136 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:08.926065 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/86eb7f8a-2456-4316-828d-b149a806993f-proc\") pod \"perf-node-gather-daemonset-bdgqg\" (UID: \"86eb7f8a-2456-4316-828d-b149a806993f\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg" May 11 21:24:08.926136 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:08.926066 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/86eb7f8a-2456-4316-828d-b149a806993f-sys\") pod \"perf-node-gather-daemonset-bdgqg\" (UID: \"86eb7f8a-2456-4316-828d-b149a806993f\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg" May 11 21:24:08.926136 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:08.926106 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/86eb7f8a-2456-4316-828d-b149a806993f-sys\") pod \"perf-node-gather-daemonset-bdgqg\" (UID: \"86eb7f8a-2456-4316-828d-b149a806993f\") " 
pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg" May 11 21:24:08.926430 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:08.926196 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/86eb7f8a-2456-4316-828d-b149a806993f-podres\") pod \"perf-node-gather-daemonset-bdgqg\" (UID: \"86eb7f8a-2456-4316-828d-b149a806993f\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg" May 11 21:24:08.926430 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:08.926219 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/86eb7f8a-2456-4316-828d-b149a806993f-lib-modules\") pod \"perf-node-gather-daemonset-bdgqg\" (UID: \"86eb7f8a-2456-4316-828d-b149a806993f\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg" May 11 21:24:08.945204 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:08.945177 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2255\" (UniqueName: \"kubernetes.io/projected/86eb7f8a-2456-4316-828d-b149a806993f-kube-api-access-k2255\") pod \"perf-node-gather-daemonset-bdgqg\" (UID: \"86eb7f8a-2456-4316-828d-b149a806993f\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg" May 11 21:24:09.072381 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:09.072352 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg" May 11 21:24:09.246890 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:09.246835 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg"] May 11 21:24:09.256285 ip-10-0-128-58 kubenswrapper[2567]: W0511 21:24:09.254408 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod86eb7f8a_2456_4316_828d_b149a806993f.slice/crio-112f201c5c10c5fb9231f7fb70b1e6c245d022406f25a17fe88b3607e30d5943 WatchSource:0}: Error finding container 112f201c5c10c5fb9231f7fb70b1e6c245d022406f25a17fe88b3607e30d5943: Status 404 returned error can't find the container with id 112f201c5c10c5fb9231f7fb70b1e6c245d022406f25a17fe88b3607e30d5943 May 11 21:24:09.645397 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:09.645308 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d5898bbc6-cltd4_4c23fe17-11fe-4f20-af40-7bde5e75d825/console/0.log" May 11 21:24:09.682166 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:09.682134 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-c8c4474bc-4m7jz_97d80799-b6a0-4fad-bb15-1b33a216aa97/download-server/0.log" May 11 21:24:09.879379 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:09.879338 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg" event={"ID":"86eb7f8a-2456-4316-828d-b149a806993f","Type":"ContainerStarted","Data":"ed3f4b8a9bd0f180910533434c4c17b6b4e9da1a469c5ef20906595c7abe7e8a"} May 11 21:24:09.879379 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:09.879376 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg" event={"ID":"86eb7f8a-2456-4316-828d-b149a806993f","Type":"ContainerStarted","Data":"112f201c5c10c5fb9231f7fb70b1e6c245d022406f25a17fe88b3607e30d5943"} May 11 21:24:09.895852 ip-10-0-128-58 kubenswrapper[2567]: I0511 
21:24:09.895748 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg" podStartSLOduration=1.8957315430000001 podStartE2EDuration="1.895731543s" podCreationTimestamp="2026-05-11 21:24:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 21:24:09.894280685 +0000 UTC m=+2028.004061133" watchObservedRunningTime="2026-05-11 21:24:09.895731543 +0000 UTC m=+2028.005512039" May 11 21:24:10.885338 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:10.885309 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg" May 11 21:24:10.957119 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:10.957085 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6l7gb_c72e4a76-101f-44bb-abd9-0c5f9b123dfc/dns/0.log" May 11 21:24:10.978818 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:10.978796 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6l7gb_c72e4a76-101f-44bb-abd9-0c5f9b123dfc/kube-rbac-proxy/0.log" May 11 21:24:11.112530 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:11.112501 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wdhpf_8981c6f1-07ce-4ebe-9071-6caf7218306a/dns-node-resolver/0.log" May 11 21:24:11.577175 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:11.577141 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-76d66cd8bd-tfgnz_1ed91ff5-f4c6-422d-b993-1c73812e6c81/registry/0.log" May 11 21:24:11.596637 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:11.596611 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cv7k6_dea6f3bb-bb99-4e25-8cf5-1aca4ea1ed96/node-ca/0.log" May 11 21:24:12.597261 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:12.597234 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-798958bb55-2c69v_e4b31fbc-0442-401c-b32b-ba68a2bf000b/discovery/0.log" May 11 21:24:12.615360 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:12.615335 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-c8c9857f9-bjljt_2b64cae5-bcbe-4b04-952c-28a536d1e35b/kube-auth-proxy/0.log" May 11 21:24:13.302804 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:13.302770 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-r2pks_979c2460-155e-4ca9-97a7-69b6b59a3dcb/serve-healthcheck-canary/0.log" May 11 21:24:13.746804 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:13.746727 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9cp9w_1a279daf-6a19-4521-916d-10598e20c36a/kube-rbac-proxy/0.log" May 11 21:24:13.765358 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:13.765336 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9cp9w_1a279daf-6a19-4521-916d-10598e20c36a/exporter/0.log" May 11 21:24:13.784227 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:13.784207 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9cp9w_1a279daf-6a19-4521-916d-10598e20c36a/extractor/0.log" May 11 21:24:15.801047 ip-10-0-128-58 
May 11 21:24:15.845524 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:15.845487 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-key-cleanup-29642235-s6vsx_9f8d81fd-d5d9-4a97-bf43-69501f3421f5/cleanup/0.log"
May 11 21:24:15.932114 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:15.932083 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-7z6xc_e652aca3-bd36-4907-9e16-6be17cde2c16/manager/1.log"
May 11 21:24:15.956190 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:15.956151 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-7z6xc_e652aca3-bd36-4907-9e16-6be17cde2c16/manager/2.log"
May 11 21:24:15.976948 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:15.976920 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-755c95f69f-7j9jk_39680580-0555-4d40-978a-ae556647366e/manager/0.log"
May 11 21:24:16.898491 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:16.898460 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-bdgqg"
May 11 21:24:17.216066 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:17.215989 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-2mqhs_b2ea3e22-2203-48e1-adb7-4a8f99a4a05a/openshift-lws-operator/0.log"
May 11 21:24:21.489744 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:21.489660 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-5f598d4645-pnbh9_b6c1bfc8-86ab-47ad-ac81-740207bf8f06/migrator/0.log"
May 11 21:24:21.509629 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:21.509603 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-5f598d4645-pnbh9_b6c1bfc8-86ab-47ad-ac81-740207bf8f06/graceful-termination/0.log"
May 11 21:24:22.779427 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:22.779395 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5kw85_b239754b-8d38-41b0-9290-744afb39226a/kube-multus/0.log"
May 11 21:24:22.800339 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:22.800319 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l8r5d_3eb3a067-139c-450e-b053-3f1a84abc363/kube-multus-additional-cni-plugins/0.log"
May 11 21:24:22.819568 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:22.819548 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l8r5d_3eb3a067-139c-450e-b053-3f1a84abc363/egress-router-binary-copy/0.log"
May 11 21:24:22.840523 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:22.840487 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l8r5d_3eb3a067-139c-450e-b053-3f1a84abc363/cni-plugins/0.log"
May 11 21:24:22.866560 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:22.866542 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l8r5d_3eb3a067-139c-450e-b053-3f1a84abc363/bond-cni-plugin/0.log"
May 11 21:24:22.893301 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:22.893281 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l8r5d_3eb3a067-139c-450e-b053-3f1a84abc363/routeoverride-cni/0.log"
May 11 21:24:22.913589 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:22.913568 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l8r5d_3eb3a067-139c-450e-b053-3f1a84abc363/whereabouts-cni-bincopy/0.log"
May 11 21:24:22.935896 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:22.935872 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l8r5d_3eb3a067-139c-450e-b053-3f1a84abc363/whereabouts-cni/0.log"
May 11 21:24:23.369239 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:23.369206 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-v9s7z_3800edc1-af00-418d-a5b8-d832cbe20fbf/network-metrics-daemon/0.log"
May 11 21:24:23.385051 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:23.385022 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-v9s7z_3800edc1-af00-418d-a5b8-d832cbe20fbf/kube-rbac-proxy/0.log"
May 11 21:24:24.694706 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:24.694675 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/ovn-controller/0.log"
May 11 21:24:24.709683 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:24.709658 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/ovn-acl-logging/0.log"
May 11 21:24:24.729083 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:24.729055 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/ovn-acl-logging/1.log"
May 11 21:24:24.746183 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:24.746158 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/kube-rbac-proxy-node/0.log"
May 11 21:24:24.765939 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:24.765917 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/kube-rbac-proxy-ovn-metrics/0.log"
May 11 21:24:24.786892 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:24.786868 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/northd/0.log"
May 11 21:24:24.806532 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:24.806516 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/nbdb/0.log"
May 11 21:24:24.826867 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:24.826846 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/sbdb/0.log"
May 11 21:24:25.025832 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:25.025756 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-svtmh_3d7be993-4ba8-4b01-8fd3-d04162534cc5/ovnkube-controller/0.log"
May 11 21:24:26.083735 ip-10-0-128-58 kubenswrapper[2567]: I0511 21:24:26.083706 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-m9tgf_3a2d13ea-d235-437e-9668-e21aca93682a/network-check-target-container/0.log"